All the calculations were performed on CPUs. I tested the script on my desktop machine on 16 CPUs, and the simulation ran with both sander and pmemd (it still produced NaN values in both cases, but it did not crash). When I try to run it on 112 cores on the supercomputer, only sander.MPI works; with pmemd.MPI I get the "NaN(s) found in the input coordinates" error and the simulation stops.
My system contains ~127k atoms. I am using GB-HCT (igb=1) together with map constraints. I also briefly minimised the structure, during which I got the message RESTARTED DUE TO LINMIN FAILURE, but I have read that this is a warning rather than a true error.
Technically, I just want to test the performance of AMBER with EMAP, because I would like to use SGLD, which is also implemented in CHARMM but is rather slow there. I am not sure whether it will be faster than CHARMM when using pmemd.MPI.
I do not have any non-standard parameters. It is quite possible that I am missing something in my script.
This is my input file:
&cntrl
imin=0, ntb=0,
igb=1, ntpr=500, ntwx=500,
ntt=3, gamma_ln=1.0,
tempi=50.0, temp0=300.0,
nstlim=500, dt=0.002,
ntc=1,
cut=14.0,
/
&emap
mapfile='./EMAP/emd_8149.map',
atmask=':*&!.H=',
fcons=0.5,
move=1,
ifit=1,
mapfit='EMAP/emd_1.map',
molfit='pdb/emd_1.pdb',
/
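
One thing I am unsure about: I use dt=0.002 with ntc=1 (i.e. no SHAKE) and cut=14.0, while my understanding is that for GB an effectively infinite cutoff is recommended, and that SHAKE on hydrogen-containing bonds is needed for a 2 fs time step. In case that could explain the NaNs, this is the &cntrl section I would try instead (a sketch based on my reading of the manual, not something I have validated yet):

&cntrl
 imin=0, ntb=0,
 igb=1, ntpr=500, ntwx=500,
 ntt=3, gamma_ln=1.0,
 tempi=50.0, temp0=300.0,
 nstlim=500, dt=0.002,
 ntc=2, ntf=2,   ! SHAKE bonds involving H; usually required for dt=0.002
 cut=9999.0,     ! effectively no cutoff, as generally recommended for GB
/
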
Thank you
Paula
> On 20 Jun 2019, at 15:52, Carlos Simmerling <carlos.simmerling.gmail.com> wrote:
>
> Can you tell us more about what you're simulating? Any nonstandard
> parameters? For implicit solvent, which model and which radii set?
>
> NaN is bad regardless of whether the simulation stops or not. When you say
> sander.MPI and pmemd.MPI work, which code did not work? Was it on CPU or
> GPU?
>
> On Thu, Jun 20, 2019, 9:35 AM Paula Mihaljevic-Juric <
> paula.mihaljevic-juric.polytechnique.edu> wrote:
>
>> Hi!
>>
>> I am trying to run an MD simulation in implicit solvent.
>>
>> I have tested the script on my desktop computer, where some results
>> contained NaN but the simulations did not stop. Both sander.MPI and
>> pmemd.MPI worked fine.
>>
>> However, when I try to use the same script on a supercomputer, the
>> simulation crashes with the error "NaN(s) found in the input coordinates"
>> when using pmemd.MPI. The simulation does not crash when using sander.MPI.
>>
>> What am I doing wrong? How can I fix this?
>>
>> Thanks
>>
>> Paula
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Thu Jun 20 2019 - 07:30:03 PDT