Apologies: the -np value was left out of the command in the previous message; it was set to 12.
Setting -np to 4 does not produce the same error; sander.MPI runs fine.
I've also run a minimisation with -np set to 12 without any problem.
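For clarity, the two heating invocations being compared are identical apart from the process count, written out explicitly here:

# fails with the MPI_Reduce_scatter error shown in the quoted message below:
mpirun -np 12 sander.MPI -O -i heat.in -o heat.out -p complex.prmtop -c min.rst -r heat.rst -x heat.mdcrd -ref min.rst

# runs without error:
mpirun -np 4 sander.MPI -O -i heat.in -o heat.out -p complex.prmtop -c min.rst -r heat.rst -x heat.mdcrd -ref min.rst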
I'd appreciate any suggestions on this behaviour.
Best regards
George
On Dec 12, 2010, at 2:59 PM, George Tzotzos wrote:
> Hi everybody,
>
> on running
>
> mpirun -np sander.MPI -O -i heat.in -o heat.out -p complex.prmtop -c min.rst -r heat.rst -x heat.mdcrd -ref min.rst
>
> I get the following error:
>
>
> Fatal error in MPI_Reduce_scatter: Internal MPI error!, error stack:
> MPI_Reduce_scatter(1265)......: MPI_Reduce_scatter(sbuf=0x10994f698, rbuf=0x1099e26f8, rcnts=0x104e09f58, MPI_DOUBLE_PRECISION, MPI_SUM, MPI_COMM_WORLD) failed
> MPIR_Reduce_scatter_impl(1122):
> MPIR_Reduce_scatter_impl(1088):
> MPIR_Reduce_scatter_intra(595):
> MPIR_Localcopy(349)...........: memcpy arguments alias each other, dst=0x1099e26f8 src=0x1099e26f8 len=54936
> APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
>
> My input file is:
>
> heat 3ogn-3og
> &cntrl
> imin=0,irest=0,ntx=1,
> nstlim=25000,dt=0.002,
> ntc=2,ntf=2,
> cut=8.0, ntb=1,
> ntpr=500, ntwx=500,
> ntt=3, gamma_ln=2.0, ig=-1,
> tempi=0.0, temp0=300.0,
> ntr=1, restraintmask=':1-138',
> restraint_wt=2.0,
> nmropt=1
> /
> &wt TYPE='TEMP0', istep1=0, istep2=25000,
> value1=0.1, value2=300.0, /
> &wt TYPE='END' /
>
>
>
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Sun Dec 12 2010 - 06:30:04 PST