[AMBER] Differences in Lennard-Jones interactions in CPU vs GPU pmemd?

From: Hanne Antila <Hanne.Antila.mpikg.mpg.de>
Date: Mon, 30 Mar 2020 16:26:43 +0000

Dear experts,


I have a small NVT simulation of a sugar-water solution which runs fine in CPU Amber (>100 ns) but always blows up in single-GPU Amber (within ~1-10 ns, with both Amber16 and Amber18). Looking at the output, the culprit might be the LJ interactions/an overlap of atoms:


 NSTEP =   690000   TIME(PS) =    6635.000  TEMP(K) =   297.51  PRESS =     0.0
 Etot   =     10904.8650  EKtot   =      2383.7571  EPtot      =      8521.1079
 BOND   =      6040.4102  ANGLE   =      1384.2706  DIHED      =       510.1503
 UB     =         0.0000  IMP     =         0.0000  CMAP       =         0.0000
 1-4 NB =       368.8766  1-4 EEL =     13543.7115  VDWAALS    =      3110.9255
 EELEC  =    -16437.2367  EHBOND  =         0.0000  RESTRAINT  =         0.0000
 ------------------------------------------------------------------------------

wrapping first mol.: -2603377.53130 23007489.83260 3909449.84430

 NSTEP =   695000   TIME(PS) =    6645.000  TEMP(K) =      NaN  PRESS =     0.0
 Etot   =            NaN  EKtot   =            NaN  EPtot      = **************
 BOND   =        -0.0000  ANGLE   =    153608.5525  DIHED      =       824.9250
 UB     =         0.0000  IMP     =         0.0000  CMAP       =         0.0000
 1-4 NB =         0.0000  1-4 EEL =         0.0010  VDWAALS    = **************
 EELEC  =    -30673.5824  EHBOND  =         0.0000  RESTRAINT  =         0.0000
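
To pin down where the two engines diverge, one thing I can try is rerunning a short stretch from the last good restart through both pmemd and pmemd.cuda with ntpr = 1 and comparing the first printed energies. A minimal Python sketch (the names md.in, system.prmtop, and good.rst are placeholders for my actual input files):

import re
import subprocess

# Shared pmemd arguments; md.in should have ntpr = 1 so the step-0
# energies are printed.
INPUTS = ["-O", "-i", "md.in", "-p", "system.prmtop", "-c", "good.rst"]

def first_vdwaals(mdout):
    # Return the first numeric VDWAALS energy printed in an mdout file,
    # or None if it never appears (e.g. only ****** overflow fields).
    with open(mdout) as fh:
        for line in fh:
            m = re.search(r"VDWAALS\s*=\s*(-?\d+\.\d+)", line)
            if m:
                return float(m.group(1))
    return None

# Run the same coordinates and topology through the CPU and single-GPU
# engines, writing separate output files.
for exe, out in (("pmemd", "cpu.out"), ("pmemd.cuda", "gpu.out")):
    subprocess.run([exe, *INPUTS, "-o", out, "-r", out + ".rst"], check=True)

cpu, gpu = first_vdwaals("cpu.out"), first_vdwaals("gpu.out")
print("VDWAALS  cpu:", cpu, " gpu:", gpu, " diff:", gpu - cpu)

My understanding is that pmemd.cuda uses the SPFP precision model by default, so small differences in the last decimals are expected; a large difference already at step 0 would point to a real setup problem rather than accumulated drift.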



The force field defines some special 1-4 interactions. From reading the mailing list archives, it seems these used to be treated differently on CPU vs GPU. Is this still the case? Or are there other possible reasons why the LJ interactions would differ between the two?
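
For the 1-4 part specifically, I can at least verify that the intended scale factors made it into the topology file, since both engines should read SCEE/SCNB from the same prmtop. A short ParmEd check (system.prmtop is again a placeholder name):

import parmed as pmd

parm = pmd.load_file("system.prmtop")

# Collect the distinct (electrostatic, van der Waals) 1-4 scale factor
# pairs stored per dihedral type in the prmtop.
pairs = {(dt.scee, dt.scnb) for dt in parm.dihedral_types}
print("1-4 scale factor pairs (scee, scnb):", sorted(pairs))

# Count dihedrals whose 1-4 term is flagged as excluded
# (end-group / ring corrections).
skipped = sum(1 for d in parm.dihedrals if d.ignore_end)
print("dihedrals with 1-4 interaction excluded:", skipped)

If both engines see the same prmtop and the scale factors look right, I would guess the difference lies elsewhere (precision model, or a marginal starting structure).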


Thank you for the help,


Hanne Antila

Postdoctoral scholar

Max Planck Institute of Colloids and Interfaces
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Mon Mar 30 2020 - 09:30:03 PDT