Re: [AMBER] MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.

From: Jason Swails <jason.swails.gmail.com>
Date: Mon, 7 Mar 2011 09:07:43 -0800

Hello,

No part of your calculation should be using MPI for anything, so it should be
impossible to get an MPI_ABORT. The only way this can happen is if
MMPBSA.py.MPI was renamed to MMPBSA.py, or if sander.MPI (or sander.LES.MPI)
was renamed to sander. If that is the case, recompile Amber in serial so that
MMPBSA.py uses the correct (serial, non-LES) version of sander.

Good luck!
Jason

On Mon, Mar 7, 2011 at 5:33 AM, hari krishna <haricoolguy111.gmail.com> wrote:

> Could you please suggest how to resolve this problem?
>
>
> I am using the following command to run MMPBSA analysis:
>
> $AMBERHOME/bin/MMPBSA.py -O -i mmpbsa.in -o FINAL_RESULTS_MMPBSA.dat -sp
> complex_sol.prmtop -cp complex.prmtop -rp receptor.prmtop -lp ligand.prmtop
> -y prodn.mdcrd
>
> Following is the output I am getting:
>
> ptraj found! Using /home/ppilab/AMBER10/exe/ptraj
> sander found! Using /home/ppilab/AMBER10/exe/sander
> Assuming /home/ppilab/AMBER10/exe/sander is part of
> amber9 or amber10. Using old PB input file.
> Warning: igb=2 should be used with mbondi2 pbradii set. Yours are modified
> Bondi radii (mbondi)
>
> Preparing trajectories with ptraj...
> 50 frames were read in and processed by ptraj for use in calculation.
>
> Starting calculations...
>
> Starting gb calculation...
>
> calculating ligand contribution...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> calculating receptor contribution...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> calculating complex contribution...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> Starting pb calculation...
>
> calculating ligand contribution...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> calculating receptor contribution...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> calculating complex contribution...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
> Calculations complete. Writing output file(s)...
>
> Error: No potential terms in sander output! Check output files.
> NOTE: All files have been retained for debugging purposes. Type MMPBSA.py
> --clean to erase these files.



-- 
Jason M. Swails
Quantum Theory Project,
University of Florida
Ph.D. Candidate
352-392-4032
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Mon Mar 07 2011 - 09:30:04 PST