Re: [AMBER] amber18 parallel cpu make test hanging at first test

From: Ravi Abrol <raviabrol.gmail.com>
Date: Tue, 14 May 2019 17:55:22 -0700

Thanks Dave.
Those tests worked.
After that, make test worked except for the following error, so I don't know
what made nab hang before. Anyway, with export DO_PARALLEL="mpirun -np 4",
the only parallel test that fails now is multirem; here is the message:
>>>
Running multisander version of sander Amber18
    Total processors = 4
    Number of groups = 4
 ===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 6527 RUNNING AT boltzmann
= EXIT CODE: 9
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Killed (signal 9)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
./Run.multirem: Program error
>>>
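
[Editor's note, not part of the original message: signal 9 is SIGKILL, which a
process cannot catch or handle; on Linux the most common external source of an
unexpected SIGKILL is the kernel OOM killer. Shells encode a child killed by
signal N as exit status 128+N, so a SIGKILL'd job reports status 137 (the
"EXIT CODE: 9" above is MPI reporting the raw signal number). A minimal
demonstration of that encoding:]

```shell
# SIGKILL (signal 9) cannot be trapped; the shell encodes death by
# signal N as exit status 128+N, so kill -9 yields status 137.
sh -c 'kill -9 $$'        # subshell sends SIGKILL to itself
echo "exit status: $?"    # prints: exit status: 137
```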

No other error messages are reported about this on stdout or in the 4 mdout
files, which all end with:
Replica exchange
     numexchg= 100, rem= -1
Nature and format of input:

What can I do to resolve this? This test fails with the same message even if
I use 8 or 16 processors.
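
[Editor's note, not part of the original thread: a signal-9 kill during a
multi-group run often points at the Linux OOM killer, since each of the four
replica groups is a separate sander instance holding its own copy of the
system, so memory use roughly scales with the group count. A minimal sketch
for checking headroom before rerunning:]

```shell
# Report the memory actually available to new processes (procps-ng
# "free": the last field of the Mem: row is the "available" column).
free -m | awk '/^Mem:/ {print "available MB:", $NF}'
# Also worth inspecting the kernel log for OOM-killer activity, e.g.
#   dmesg | grep -i "out of memory"
# (may require root; output is system-dependent).
```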
Thanks,
Ravi


On Tue, May 14, 2019 at 6:02 AM David A Case <david.case.rutgers.edu> wrote:

> On Mon, May 13, 2019, Ravi Abrol wrote:
> >
> >I recently installed amber18 with ambertools19 and was testing its
> >installation.
> >
> >Serial make test worked with no errors.
> >
> >Parallel make test with DO_PARALLEL="mpirun -np 4" hangs at the first test
> >with the message:
> >
> >make[3]: Entering directory '/usr/local/amber18/AmberTools/test/nab'
> >Running test to do simple minimization
> >(this tests the molecular mechanics interface)
> >
> >How do I diagnose this?
>
> A couple of ideas:
>
> 1. see if the problem is general:
>
> export TESTsander=$AMBERHOME/bin/sander.MPI
> export DO_PARALLEL='mpirun -np 4'
> cd $AMBERHOME/test/dhfr
> ./Run.dhfr
>
> That will help see if the problem is specific to mpinab, or whether it
> is more generally related to MPI.
>
> 2. If the above works,
>
> cd $AMBERHOME/AmberTools/test/nab
> ./Run.sff
>
> Examine any output files (like ltest.out) or any error messages you
> might find.
>
> ...good luck...dac
>
>
> _______________________________________________
> AMBER mailing list
> AMBER.ambermd.org
> http://lists.ambermd.org/mailman/listinfo/amber
>
Received on Tue May 14 2019 - 18:00:02 PDT