Re: [AMBER] Fwd: NVE GPU question

From: Joseph Baker <bakerj.tcnj.edu>
Date: Thu, 22 Jan 2015 18:24:04 -0500

Hi all,

Jason, I did run a dt=0.0001 test, and there the drift was 1.8 x 10^-4
kT/ns/dof (compared with 2.9 x 10^-4 and 3.6 x 10^-4 for dt=0.005 and
dt=0.001, respectively). So it is better, but not by leaps and bounds.

I am running the dsum_tol tests now and will see how that goes. I have not
tried changing the PME order, since I have been using pmemd.cuda, as you
mention.
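
For anyone following along, the &ewald settings being discussed would look
roughly like this in the mdin (the values below are illustrative, not the
ones from my actual runs):

  NVE test with tightened Ewald settings (illustrative values)
   &cntrl
     imin=0, ntb=1, ntt=0,
     dt=0.001, nstlim=500000,
     cut=9.0, ntpr=500,
   /
   &ewald
     dsum_tol=1.0d-6,
     order=6,
     nfft1=96, nfft2=96, nfft3=96,
   /

Here dsum_tol tightens the direct-sum tolerance (default 1.0e-5), order
raises the PME interpolation order (default 4; CPU pmemd only, per Jason's
comment), and nfft1-3 set the reciprocal-space grid dimensions, per Thomas's
suggestion below.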

I haven't tried ntwf to check the forces yet, and I don't have CPU data for
this system, so I will run a CPU test to see whether the drift is better
there.

Gerald, I've been computing the drift by using the regression tool in
xmgrace to fit a line to the Etot vs. time data from the AMBER output file.
I then convert that slope to kT/ns/dof by taking
slope/0.593*1000/(3*number of atoms); the factor of 1000 converts from ps
(the time unit in the Etot vs. time data) to ns, 0.593 kcal/mol is kT near
300 K, and 3*number of atoms is the number of degrees of freedom.
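
In script form, the same conversion is roughly the following (this assumes
the Etot vs. time data have been extracted to a two-column file, time in ps
and Etot in kcal/mol; the file name and atom count are placeholders):

  import numpy as np

  # two columns: time (ps), Etot (kcal/mol), extracted from the mdout
  time_ps, etot = np.loadtxt("etot.dat", unpack=True)
  n_atoms = 30000   # placeholder: use the actual atom count

  slope = np.polyfit(time_ps, etot, 1)[0]          # kcal/mol per ps
  drift = slope / 0.593 * 1000.0 / (3 * n_atoms)   # kT/ns/dof
  print(drift)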

Thomas, thanks for the additional input on the grid size.

Joe



On Thu, Jan 22, 2015 at 5:58 PM, Thomas Cheatham <tec3.utah.edu> wrote:

>
> > Ah, ew_type=1 means pure Ewald (no PME). So rsum_tol doesn't do what I
> > thought it did... I guess dsum_tol is the setting to play with, although
> > it's not clear to me that that will improve overall accuracy.
> >
> > You could also try to set the PME order to 5 or so (I don't think that
> will
> > work with pmemd.cuda, but it should work with pmemd, and hopefully give
> > better accuracy in the reciprocal forces).
>
> ...and increase FFT grid size (in addition to order) for the
> reciprocal...
>
> --tec3
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber