Re: [AMBER] Is SHAKE necessary in running an MD?

From: Adrian Roitberg <roitberg.ufl.edu>
Date: Tue, 02 Apr 2013 08:48:48 -0400

Alan,

Even if it 'solves' the problem, it really does not.

TIPnP models ARE rigid. The parameters, and the warranty, only hold under
rigid conditions. So if you simulate them with a smaller dt, they do
not blow up, but it IS NOT TIP4P anymore.

So, be careful when analyzing the results!
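
To be concrete, the two self-consistent choices look roughly like this
(the ntc/ntf and dt values are the ones discussed in this thread; ntf=1
is just the default pairing for ntc=1):

    ntc=2, ntf=2, dt=0.002,    ! rigid TIP4P, the model as parameterized

    ntc=1, ntf=1, dt=0.0005,   ! flexible (frcmod.tip4p): you now integrate
                               ! the fast O-H stretches directly, and it is
                               ! no longer the published TIP4P model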

adrian

On 4/2/13 8:23 AM, Alan wrote:
> Changing dt does solve the problem. Thank you very much.
>
>
> At 2013-04-02 15:30:34,"Gerald Monard" <Gerald.Monard.univ-lorraine.fr> wrote:
>> Hi,
>>
>> From your input file, you use "dt=0.002", which is too large for a
>> non-SHAKE MD. Without SHAKE, you should use dt=0.001 at most (a better
>> setting would be dt=0.0005).
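>>
>> For example, keeping everything else in your heat.in as posted, the
>> non-SHAKE settings would be something like the following. Note that
>> nstlim must grow to cover the same simulated time
>> (25000 x 0.002 ps = 50 ps = 100000 x 0.0005 ps):
>>
>> ntc=1, ntf=1,
>> nstlim=100000, dt=0.0005,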
>>
>> Gerald.
>>
>> On 04/02/2013 08:12 AM, Alan wrote:
>>> On 04/01/2013 06:08 PM, Gerald Monard wrote:
>>>
>>> Hi,
>>>
>>> As far as I know, TIP4P has been designed as a rigid water model.
>>> However, it should work in AMBER as a flexible model (parameters are
>>> available in frcmod.tip4p). I quickly tried a 64-water TIP4P system
>>> using either ntc=1 or ntc=2. It works.
>>> What is your input file (mdin)?
>>>
>>> Gerald.
>>>
>>> On 04/01/2013 10:29 AM, Alan wrote:
>>>
>>> Dear all:
>>>
>>>
>>> I'm running an MD of pure TIP4P water. The calculation is not very big, so I wondered whether I could run it without SHAKE. It turned out I couldn't. Every time I got error messages like these:
>>> [cli_0]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>>> SANDER BOMB in subroutine nonbond_list
>>> SANDER BOMB in subroutine nonbond_list
>>> volume of ucell too big, too many subcells
>>> list grid memory needs to be reallocated, restart sander
>>> SANDER BOMB in subroutine nonbond_list
>>> volume of ucell too big, too many subcells
>>> [cli_3]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 3
>>> volume of ucell too big, too many subcells
>>> list grid memory needs to be reallocated, restart sander
>>> [cli_4]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 4
>>> [cli_1]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
>>> [cli_2]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 2
>>> list grid memory needs to be reallocated, restart sander
>>> SANDER BOMB in subroutine nonbond_list
>>> volume of ucell too big, too many subcells
>>> SANDER BOMB in subroutine nonbond_list
>>> volume of ucell too big, too many subcells
>>> list grid memory needs to be reallocated, restart sander
>>> [cli_6]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 6
>>> list grid memory needs to be reallocated, restart sander
>>> SANDER BOMB in subroutine nonbond_list
>>> volume of ucell too big, too many subcells
>>> list grid memory needs to be reallocated, restart sander
>>> [cli_7]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 7
>>> SANDER BOMB in subroutine nonbond_list
>>> volume of ucell too big, too many subcells
>>> list grid memory needs to be reallocated, restart sander
>>> [cli_5]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 5
>>> rank 3 in job 1 dell13_38141 caused collective abort of all ranks
>>> exit status of rank 3: return code 1
>>> rank 2 in job 1 dell13_38141 caused collective abort of all ranks
>>> exit status of rank 2: return code 1
>>> rank 1 in job 1 dell13_38141 caused collective abort of all ranks
>>> exit status of rank 1: return code 1
>>> rank 0 in job 1 dell13_38141 caused collective abort of all ranks
>>> exit status of rank 0: return code 1
>>>
>>>
>>> Unit 30 Error on OPEN: density.rst
>>> [cli_0]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>>> rank 0 in job 1 dell13_32985 caused collective abort of all ranks
>>> exit status of rank 0: killed by signal 9
>>>
>>>
>>> I checked the mailing list, but no luck. I've run a similar MD before; the only difference is that this time I removed SHAKE. So I added SHAKE back, and it worked.
>>> So I'm wondering: is SHAKE necessary in running an MD?
>>>
>>>
>>> Thank you.
>>> I tried both the heat and the density stages. Without SHAKE, running heat with either sander or sander.MPI produces a lot of 'vlimit exceeded for step' warnings and finally:
>>> SANDER BOMB in subroutine nonbond_list
>>> volume of ucell too big, too many subcells
>>> list grid memory needs to be reallocated, restart sander
>>>
>>> Running density with sander produces similar messages; with sander.MPI it produces the messages I posted above.
>>>
>>> This is the heat.in file that didn't work.
>>> waterbox
>>> &cntrl
>>> imin=0,irest=0,ntx=1,
>>> nstlim=25000,dt=0.002,
>>> cut=8.0, ntb=1,
>>> ntpr=500, ntwx=500,
>>> ntt=3, gamma_ln=2.0,
>>> tempi=0.0, temp0=300.0,
>>> nmropt=1
>>> /
>>> &wt TYPE='TEMP0', istep1=0, istep2=25000,
>>> value1=0.1, value2=300.0, /
>>> &wt TYPE='END' /
>>>
>>> This is the density.in.
>>> density
>>> &cntrl
>>> imin=0,irest=1,ntx=5,
>>> nstlim=25000,dt=0.002,
>>> cut=10.0,ntb=2,ntp=1,taup=1,
>>> ntt=3,gamma_ln=2.0,
>>> temp0=300.0,
>>> ntpr=500,ntwx=500,ntwr=3000,
>>> iwrap=1
>>> /
>>>
>>> When I added ntc=2, ntf=2, they both worked.
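>>>
>>> For example, the working heat.in is the one above with a single line
>>> added to the &cntrl block (placement within the namelist does not
>>> matter):
>>>
>>> nstlim=25000,dt=0.002,
>>> ntc=2,ntf=2,
>>> cut=8.0, ntb=1,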
>>>
>> --
>> ____________________________________________________________________________
>>
>> Prof. Gerald MONARD
>> Theoretical Chemistry and Biochemistry Group
>> SRSMC, Université de Lorraine, CNRS
>> Boulevard des Aiguillettes B.P. 70239
>> F-54506 Vandoeuvre-les-Nancy, FRANCE
>>
>> e-mail : Gerald.Monard.univ-lorraine.fr
>> tel. : +33 (0)383.684.381
>> fax : +33 (0)383.684.371
>> web : http://www.monard.info
>>
>> ____________________________________________________________________________
>>
>>

-- 
                            Dr. Adrian E. Roitberg
                                  Professor
                Quantum Theory Project, Department of Chemistry
                            University of Florida
                              roitberg.ufl.edu
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Tue Apr 02 2013 - 06:00:03 PDT