On Mon, Jul 26, 2010, Shubhra Gupta wrote:
>
> We have a Sun cluster of Sun Fire X4150 servers, with one master node and
> 4 compute nodes, using MPI for parallel runs with Amber 10. When we run
> sander.MPI we get the following error:
>
>
> Fatal error in MPI_Allgather: Invalid communicator, error stack:
> MPI_Allgather(865): MPI_Allgather(sbuf=0x2a955a5f30, scount=8415,
> MPI_DOUBLE_PRECISION, rbuf=0x5d748f0, rcount=8415, MPI_DOUBLE_PRECISION,
> comm=0x0) failed
> MPI_Allgather(771): Invalid communicator[cli_0]: aborting job:
> Fatal error in MPI_Allgather: Invalid communicator, error stack:
> MPI_Allgather(865): MPI_Allgather(sbuf=0x2a955a5f30, scount=8415,
> MPI_DOUBLE_PRECISION, rbuf=0x5d748f0, rcount=8415, MPI_DOUBLE_PRECISION,
> comm=0x0) failed
> MPI_Allgather(771): Invalid communicator
>
> I am also attaching the input file and the other files associated with
> the error.
Does your setup pass the parallel test suite? Knowing that helps us
distinguish between generic problems and problems specific to your particular
job.
We will also want to know what flags you gave to the configure_amber script,
and which compilers you are using.
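It can also help to check whether MPI itself works outside of Amber, using the
same mpicc/mpirun that built and launched sander.MPI. Below is only a minimal
standalone sketch of an MPI_Allgather call with a valid communicator; the file
name, the process count of 4, and the per-rank count (copied from your error
trace) are arbitrary choices, not anything taken from your setup.

/* allgather_check.c -- standalone MPI_Allgather sanity check.
 * Build and run with the same MPI wrappers used for sander.MPI, e.g.:
 *   mpicc allgather_check.c -o allgather_check
 *   mpirun -np 4 ./allgather_check
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define COUNT 8415   /* same per-rank count as in the error trace */

int main(int argc, char **argv)
{
    int rank, nprocs, i;
    double *sendbuf, *recvbuf;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    sendbuf = malloc(COUNT * sizeof(double));
    recvbuf = malloc((size_t)nprocs * COUNT * sizeof(double));
    for (i = 0; i < COUNT; i++)
        sendbuf[i] = (double)rank;

    /* Each rank contributes COUNT doubles; afterwards every rank holds
     * nprocs*COUNT doubles.  Note the communicator here is the predefined
     * MPI_COMM_WORLD; your trace shows comm=0x0 reaching this call, i.e.
     * an invalid handle rather than a valid communicator. */
    MPI_Allgather(sendbuf, COUNT, MPI_DOUBLE,
                  recvbuf, COUNT, MPI_DOUBLE, MPI_COMM_WORLD);

    if (rank == 0)
        printf("MPI_Allgather across %d ranks completed.\n", nprocs);

    free(sendbuf);
    free(recvbuf);
    MPI_Finalize();
    return 0;
}

If a plain program like this runs but sander.MPI does not, the problem is more
likely in how Amber was configured and linked than in the MPI installation.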
....dac
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Mon Jul 26 2010 - 05:00:04 PDT