Hi Dmitry,
Many thanks for this. Here are my two attempts:
1. sander -O -i smd.in -o smd.out -p 2wc6_bom_solv.prmtop -c prod.rst -r smd.rst -x smd.mdcrd
Gives "segmentation fault"
2. mpirun -np 12 sander.MPI -O -i smd.in -o smd.out -p 2wc6_bom_solv.prmtop -c prod.rst -r smd.rst -x smd.mdcrd
gives:
Fatal error in MPI_Allgatherv: Internal MPI error!, error stack:
MPI_Allgatherv(991).......: MPI_Allgatherv(sbuf=0x109a4b3d0, scount=6618, MPI_DOUBLE_PRECISION, rbuf=0x109a4b3d0, rcounts=0x104e09f58, displs=0x104e09750, MPI_DOUBLE_PRECISION, MPI_COMM_WORLD) failed
MPIR_Allgatherv_impl(825).:
MPIR_Allgatherv(787)......:
MPIR_Allgatherv_intra(591):
MPIR_Localcopy(349).......: memcpy arguments alias each other, dst=0x109a4b3d0 src=0x109a4b3d0 len=52944
[... the same "Fatal error in MPI_Allgatherv: Internal MPI error!" stack, ending in "MPIR_Localcopy(349).......: memcpy arguments alias each other", is then repeated by each of the remaining MPI ranks; only the buffer addresses, counts, and lengths differ ...]
APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
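For context on that second failure: the "memcpy arguments alias each other" line is MPICH refusing an MPI_Allgatherv call in which the send buffer points into the receive buffer; the MPI standard expects MPI_IN_PLACE for that pattern, and newer MPICH releases enforce the check. A minimal C sketch of the two calling conventions (purely illustrative, not AMBER source; the buffer layout is invented for the example):

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, nproc;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nproc);

    /* Invented toy layout: each rank owns "count" doubles of one shared array. */
    int count = 4;
    int *counts = malloc(nproc * sizeof(int));
    int *displs = malloc(nproc * sizeof(int));
    for (int i = 0; i < nproc; i++) { counts[i] = count; displs[i] = i * count; }
    double *buf = calloc((size_t)nproc * count, sizeof(double));
    for (int i = 0; i < count; i++) buf[displs[rank] + i] = (double)rank;

    /* The pattern the error message describes: the send buffer aliases the
     * receive buffer (sbuf points at this rank's slot inside rbuf).  Recent
     * MPICH aborts here with "memcpy arguments alias each other".
     *
     * MPI_Allgatherv(buf + displs[rank], count, MPI_DOUBLE,
     *                buf, counts, displs, MPI_DOUBLE, MPI_COMM_WORLD);
     */

    /* The standard-conforming way to gather into the same buffer:
     * pass MPI_IN_PLACE as the send buffer (send count/type are then ignored). */
    MPI_Allgatherv(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                   buf, counts, displs, MPI_DOUBLE, MPI_COMM_WORLD);

    free(buf); free(counts); free(displs);
    MPI_Finalize();
    return 0;
}

If that is indeed what sander.MPI is hitting here, it points at the sander build / MPI library combination rather than at anything in smd.in or dist.RST.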
On Jun 27, 2011, at 10:06 PM, Dmitry Nilov wrote:
> Hello,
> I am not sure that it is possible to perform SMD with pmemd. Try using
> sander.MPI instead of pmemd.MPI.
>
> On Mon, Jun 27, 2011 at 9:49 PM, George Tzotzos <gtzotzos.me.com> wrote:
>> Hi everybody,
>>
>> I've just completed a 10 ns production run of a protein-ligand complex. I would like to study the dissociation of this complex, and I thought that SMD would be suitable for this purpose. I tried to follow the procedure described in the Amber 11 manual as well as on this website (http://enzyme.fbb.msu.ru/Tutorials/Tutorial_3/).
>>
>> I've started sander from a directory containing the solvated prmtop file, the restart file from the production run, and a distance restraint file.
>>
>> When I run
>>
>> mpirun -np 12 pmemd.MPI -O -i smd.in -o smd.out -p 2wc6_bom_solv.prmtop -c prod.rst -r smd.rst -x smd.mdcrd
>>
>> I get
>>
>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>> APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
>>
>> smd 2wc6-bombykol
>> &cntrl
>> imin=0,irest=1,ntx=5,
>> nstlim=500000,dt=0.002,
>> ntc=2,ntf=2,
>> cut=8.0, ntb=2, ntp=1, taup=2.0,
>> ntpr=1000, ntwx=1000, ntwr=5000,
>> ntt=3, gamma_ln=2.0, ig=-1,
>> temp0=300.0,
>> jar=1,
>> /
>> &wt TYPE='DUMPFREQ', istep1=1, /
>> &wt TYPE='END', /
>> DISANG=dist.RST
>> DUMPAVE=dist_vs_t
>> LISTIN=POUT
>> LISTOUT=POUT
>>
>> and the distance restraint file dist.RST contains:
>>
>> # Change distance restraint between atoms
>> &rst iat=1533,2223, r2=2.0, rk2=1.0725, r2a=8.0 /
>>
>> where 1533 and 2223 are atom numbers derived from the PDB file.
>>
>> Any help would be much appreciated.
>>
>> Regards
>>
>> George
>>
>
>
>
> --
> Dmitry Nilov
> Faculty of Bioengineering and Bioinformatics,
> Lomonosov Moscow State University
> web: http://enzyme.fbb.msu.ru/
>
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber