Re: [AMBER] pmemd.MPI won't work in Amber 18

From: Hai Long <spaceshiptoo.gmail.com>
Date: Wed, 3 Oct 2018 20:57:17 -0600

Hi David and Ross,

Thank you for your kind replies, and sorry I didn't update the status of
this problem. I got it working by compiling against the parallel MKL, but
it only works on one node; running on multiple nodes results in a
segmentation fault. If I choose not to use MKL, I need to recompile FFTW3
with F77=mpiifort, and again it only works on one node. Any ideas on this
problem? Thanks!

Best

On Thu, May 24, 2018 at 4:08 PM Ross Walker <ross.rosswalker.co.uk> wrote:

> Hi Hai,
>
> I've seen this type of behavior when the mpirun command used to run
> pmemd.MPI comes from a different MPI installation than the one used to
> build the executable.
>
> I would check that the output of 'which mpif90' matches that of 'which
> mpirun'. I'd then try rebuilding from scratch in your current environment
> and see if the problem persists.
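>
> For example (a minimal check, assuming a bash-like shell with both the
> compiler wrapper and the launcher on your PATH):
>
>     which mpif90        # compiler wrapper used when pmemd.MPI was built
>     which mpirun        # launcher used at run time
>     mpirun --version    # (or -V) reports which MPI the launcher belongs to
>
> Both should resolve into the same MPI installation; a mismatch between
> them is a common cause of crashes in pmemd.MPI.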
>
> All the best
> Ross
>
> > On May 24, 2018, at 5:44 PM, Hai Long <spaceshiptoo.gmail.com> wrote:
> >
> > Hi David,
> >
> > It fails on every pmemd test in the test suite. The following is from
> > the 4096wat test output, where the same thing happens. It seems to me
> > that pmemd.MPI somehow messes up the electrostatic interactions. The
> > VDWAALS term (6028.9517) is identical to the sander output, though.
> >
> >
> >  NSTEP =        1   TIME(PS) =       1.001  TEMP(K) =     NaN  PRESS =     0.0
> >
> >  Etot   =            NaN  EKtot   =            NaN  EPtot      =            NaN
> >  BOND   =         0.0000  ANGLE   =         0.0000  DIHED      =         0.0000
> >  1-4 NB =         0.0000  1-4 EEL =         0.0000  VDWAALS    =      6028.9517
> >  EELEC  =            NaN  EHBOND  =         0.0000  RESTRAINT  =         0.0000
> >
> >  Ewald error estimate:     NaN
> >
> >
> > On Thu, May 24, 2018 at 5:38 AM, David A Case <david.case.rutgers.edu>
> > wrote:
> >
> >> On Wed, May 23, 2018, Hai Long wrote:
> >>>
> >>> We recently upgraded from Amber14 to Amber18. The compiler, MPI, and
> >>> MKL that I used for Amber 18 are all Intel 2017.0.5. The sander,
> >>> sander.MPI, and serial pmemd binaries all work properly. However,
> >>> pmemd.MPI produces errors. The following output is from a simulation
> >>> of a 5x5x5 nm equilibrated water box, showing something wrong with
> >>> EELEC and VIRIAL:
> >>
> >> Thanks for the report. We would need to see the actual inputs, plus the
> >> command-line options you chose, in order to try to track down the
> >> problem. It looks like there is something wrong with your input
> >> coordinates.
> >>
> >> (I'm assuming that you have run the pmemd test suite in parallel, and
> >> that things look OK there. If you haven't done this, please do so,
> >> since that can remove the possibility of a bad input file.)
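> >>
> >> For example (a minimal sketch of the usual Amber parallel test
> >> procedure; adjust the process count and launcher to your system):
> >>
> >>     export DO_PARALLEL="mpirun -np 4"
> >>     cd $AMBERHOME/test
> >>     make test.parallel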
> >>
> >> ....dac
> >>
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Wed Oct 03 2018 - 20:00:02 PDT