Re: [AMBER] Problems with implicit solvent

From: Carlos Simmerling <carlos.simmerling.gmail.com>
Date: Fri, 5 Dec 2014 06:53:08 -0500

Dear Carlo,
I can't answer questions about possible bugs in the CUDA code, but you
could certainly do test runs on CPUs or, even better, using sander, and see
whether the energies match at the first steps. The same check would address
your "input being ignored" concern: if extdiel were really being ignored,
the energies would not change when you change it.
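For example, something along these lines - an untested sketch, assuming you
make a non-REMD copy of your mdin (drop numexchg) called test.mdin and use
the file names from your leap script:

# run a few steps of the same system with sander (CPU) and pmemd.cuda (GPU)
sander -O -i test.mdin -p vdac2.prmtop -c vdac2.inpcrd -o mdout.cpu -r restrt.cpu
pmemd.cuda -O -i test.mdin -p vdac2.prmtop -c vdac2.inpcrd -o mdout.gpu -r restrt.gpu
# the first NSTEP energy blocks should agree closely between CPU and GPU,
# and EGB should change when you change extdiel if the option is being read
grep -A4 "NSTEP" mdout.cpu | head -8
grep -A4 "NSTEP" mdout.gpu | head -8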
I also wonder about the choice of GB model: are you following a successful
example from the literature in which igb=7 with Bondi radii was used to study
peptides in a low-dielectric solvent? If not, you should expect to do some
validation against experiment and detailed testing before applying it to
production work. I am not aware of work that successfully used this GB model
under the conditions you are using. I do know, for example, that this
combination does not give results in water comparable to explicit water -
see our development of the igb=8 model in Nguyen et al. JCTC 2013 and
Nguyen et al. JACS 2014.
Hope that helps
Carlos
On Dec 5, 2014 6:12 AM, "Carlo Guardiani" <carlo.guardiani.dsf.unica.it>
wrote:

> Dear Amber developers,
> I am performing T-REMD simulations of a peptide in implicit solvent (igb=7),
> using the CUDA version of Amber14 and the ff14SB force field. I first ran a
> simulation with the default values of the internal and external dielectric
> constants and obtained a certain secondary-structure profile. I then tried
> to repeat the simulation in a more hydrophobic solvent by lowering the
> extdiel parameter, but I obtained exactly the same secondary-structure
> profile, so I believe my setting was simply ignored. Here are the input
> files for the two simulations:
>
> =======================================
> WATER SIMULATION MDIN
> =======================================
>
> REMD
> &cntrl
> irest=0, ntx=1,
> nstlim=1250, dt=0.002,
> ntt=3, gamma_ln=1.0,
> temp0=300.00000000000000, ig=29600,
> ntc=2, ntf=2, nscm=1000,
> ntb=0, igb=7, saltcon=0.01,
> cut=999.0, rgbmax=999.0,
> ntpr=1250, ntwx=1250, ntwr=1250,
> numexchg=40000,
> /
>
> ==========================================
> HYDROPHOBIC SOLVENT SIMULATION MDIN
> ==========================================
>
> REMD
> &cntrl
> irest=0, ntx=1,
> nstlim=1250, dt=0.002,
> ntt=3, gamma_ln=1.0,
> temp0=300.00000000000000, ig=29600,
> ntc=2, ntf=2, nscm=1000,
> ntb=0, igb=7, saltcon=0.01,
> cut=999.0, rgbmax=999.0,
> ntpr=1250, ntwx=1250, ntwr=1250,
> numexchg=40000,
> intdiel=1.0, extdiel=26.7,
> /
>
> The hydrophobic-solvent simulation seems to read the input file correctly,
> as shown by the mdout file:
>
>
> --------------------------------------------------------------------------------
>    2.  CONTROL  DATA  FOR  THE  RUN
> --------------------------------------------------------------------------------
>
> NMET
>
> General flags:
>      imin    =       0, nmropt  =       0
>
> Replica exchange
>      numexchg=   40000, rem     =       1
>
> Nature and format of input:
>      ntx     =       1, irest   =       0, ntrx    =       1
>
> Nature and format of output:
>      ntxo    =       1, ntpr    =    1250, ntrx    =       1, ntwr    =    1250
>      iwrap   =       0, ntwx    =    1250, ntwv    =       0, ntwe    =       0
>      ioutfm  =       0, ntwprt  =       0, idecomp =       0, rbornstat=      0
>
> Potential function:
>      ntf     =       2, ntb     =       0, igb     =       7, nsnb    =      25
>      ipol    =       0, gbsa    =       0, iesp    =       0
>      dielc   =   1.00000, cut     = 999.00000, intdiel =   1.00000
>      saltcon =   0.01000, offset  =   0.09000, gbalpha =   1.09511
>      gbbeta  =   1.90793, gbgamma =   2.50798, surften =   0.00500
>      rdt     =   0.00000, rgbmax  = 999.00000, extdiel =  26.70000
>      alpb    =       0
>
>
> Following the suggestions in the manual, the Bondi radii were used in both
> simulations:
>
> source leaprc.ff14SB
> set default PBradii bondi
> a=sequence {NMET CYS ILE PRO PRO SER TYR ALA ASP LEU GLY LYS ALA ALA ARG
> ASP ILE PHE ASN LYS GLY PHE GLY PHE CGLY }
> saveamberparm a vdac2.prmtop vdac2.inpcrd
> savepdb a vdac2_xleap.pdb
> quit
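>
> As a quick sanity check (a hypothetical command - adjust the file name), the
> radius set that leap actually wrote into the topology can be confirmed with:
>
> grep -A2 "RADIUS_SET" vdac2.prmtop
> # this should report something like: Bondi radii (bondi)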
>
> The jobs were run with the following script:
>
> #!/bin/bash
> source /etc/modules.sh
> module load intel openmpi amber14
> export CUDA_VISIBLE_DEVICES=0,1,2,3
> mpirun -np 20 pmemd.cuda.MPI -ng 20 -groupfile remd.groupfile
>
>
> Did I make a mistake, or is there a bug in the force field or the Amber
> release I used?
>
> Thank you very much for your help and kind regards,
>
> Carlo Guardiani
>
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber