Hi Bill,
Thank you for your help. I will recompile and then let you know how it goes.
Best regards,
Seungyeul Yoo
-----Original Message-----
From: Bill Miller III [mailto:brmilleriii.gmail.com]
Sent: Saturday, July 23, 2011 8:58 AM
To: AMBER Mailing List
Subject: Re: [AMBER] Problem with MMPBSA.py
Yes, MMPBSA.MPI executes MMPBSA.py.MPI by using the python compiled
alongside Amber to prevent inconsistencies. So you should have an MMPBSA.py
and MMPBSA.py.MPI in the directory $AMBERHOME/AmberTools/src/mmpbsa_py/. If
not, you will need to re-compile.
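A quick way to check (just a sketch; adjust the path if your tree differs):

  ls $AMBERHOME/AmberTools/src/mmpbsa_py/MMPBSA.py \
     $AMBERHOME/AmberTools/src/mmpbsa_py/MMPBSA.py.MPI

If ls reports MMPBSA.py.MPI as missing, the parallel version was never built.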
-Bill
On Sat, Jul 23, 2011 at 2:07 AM, Seungyeul Yoo <seungyeul.yoo.me.com> wrote:
> Hi Jason,
>
> I do have MMPBSA and MMPBSA.MPI in $AMBERHOME/bin folder.
>
> And MMPBSA.MPI contains only the following lines:
>
>
>
> ----------------------------------------------------------------------------
> #!/bin/sh
> if [ -z $AMBERHOME ]; then
>     echo "Error: AMBERHOME must be set for MMPBSA!"
>     exit 1
> fi
>
> if [ ! -f $AMBERHOME/AmberTools/src/mmpbsa_py/MMPBSA.py.MPI ]; then
>     echo "Error: Can't find MMPBSA.py.MPI! Re-install MMPBSA.py in parallel."
>     exit 1
> fi
>
> $AMBERHOME/bin/python $AMBERHOME/AmberTools/src/mmpbsa_py/MMPBSA.py.MPI $*
>
>
> ----------------------------------------------------------------------------
>
> Is this what you think I need to run instead of MMPBSA.py.MPI?
>
> When I used this MMPBSA.MPI, the job again failed with the following error
> messages:
>
>
>
> ----------------------------------------------------------------------------
> Error: Can't find MMPBSA.py.MPI! Re-install MMPBSA.py in parallel.
> Error: Can't find MMPBSA.py.MPI! Re-install MMPBSA.py in parallel.
> Error: Can't find MMPBSA.py.MPI! Re-install MMPBSA.py in parallel.
> Error: Can't find MMPBSA.py.MPI! Re-install MMPBSA.py in parallel.
>
>
> ----------------------------------------------------------------------------
>
> This happens because there is no MMPBSA.py.MPI file in the path
> $AMBERHOME/AmberTools/src/mmpbsa_py/.
>
> Is the MMPBSA.py.MPI called by MMPBSA.MPI different from the MMPBSA.py.MPI
> in $AMBERHOME/bin?
>
> If so, I would ask our computer administrator to reinstall MMPBSA.py in that
> path.
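>
> (I assume that means rebuilding AmberTools with MPI support, perhaps
> something like the following, though I am not sure of the exact procedure:
>
> cd $AMBERHOME/AmberTools/src
> ./configure -mpi gnu
> make install
>
> Please correct me if that is wrong.)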
>
> If there is anything I should be careful about here, please let me know.
>
> Thank you so much for your help.
>
> Seungyeul Yoo
>
>
>
> On Jul 22, 2011, at 9:30 PM, Jason Swails wrote:
>
> > Try using the AmberTools 1.5 version instead of the previous version. In
> > AmberTools 1.5, the correct name of the executable is MMPBSA.MPI, not
> > MMPBSA.py.MPI. This may be causing your issues.
> >
> > Try replacing MMPBSA.py.MPI with MMPBSA.MPI and see if that helps.
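> >
> > With your job script that would be something like this (a sketch using
> > your paths; note that no separate python is needed, since MMPBSA.MPI
> > launches Amber's own):
> >
> > mpirun -np 4 $AMBERHOME/bin/MMPBSA.MPI -O -i decomp_lig.in \
> >     -o decomp_lig.dat -sp ../Analysis/plk2_168_com.top ...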
> >
> > If you do not have MMPBSA and MMPBSA.MPI in $AMBERHOME/bin, then
> > AmberTools 1.5 wasn't installed properly. If you do, then delete
> > MMPBSA.py.MPI and MMPBSA.py from $AMBERHOME/bin to avoid any confusion.
> > I thought this was done automatically, but I must have missed that.
> >
> > HTH,
> > Jason
> >
> > On Fri, Jul 22, 2011 at 6:15 PM, Seungyeul Yoo <Seungyeul.Yoo.mssm.edu> wrote:
> >
> >> Hi all,
> >>
> >> I'm using the MMPBSA.py script to calculate binding affinity.
> >>
> >> I use the same topology file for -sp and -cp since I already stripped all
> >> the water molecules and ions using the trajout command in ptraj.
> >>
> >> The input file for running GB and PB is the following:
> >>
> >>
> >>
> >> ----------------------------------------------------------------------------
> >> &general
> >> verbose=2, entropy=0, strip_mdcrd=1
> >> startframe=60, endframe=6000, interval=60
> >> /
> >> &decomp
> >> idecomp=2, dec_verbose=3
> >> print_res="21-33;43-48;62-83;90-110;138-149;155-161;290-292"
> >> /
> >> &gb
> >> igb=5, saltcon=0.150
> >> /
> >> &pb
> >> istrng=0.150
> >> /
> >>
> >>
> >> ----------------------------------------------------------------------------
> >>
> >> Then I submit this input file to the PBS server using the following
> >> script:
> >>
> >>
> >>
> >> ----------------------------------------------------------------------------
> >> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/share/apps/intel/lib/intel64
> >> export AMBERHOME=/share/apps/amber11
> >> export PYTHONPATH=/share/apps/python2.6/lib/python2.6/site-packages\:$PYTHONPATH
> >> MMPBSA="/share/apps/amber11/bin/MMPBSA.py.MPI"
> >> PYTHON="/share/apps/python27/bin/python2.7"
> >>
> >> cd $PBS_O_WORKDIR
> >>
> >> SUB="mpirun -np 4 $PYTHON $MMPBSA -O -i decomp_lig.in -o decomp_lig.dat
> >> -sp ../Analysis/plk2_168_com.top -cp ../Analysis/plk2_168_com.top -rp
> >> ../Analysis/plk2_168_rec.top -lp ../Analysis/plk2_168_lig.top -y
> >> ../Analysis/plk2_168_dck_now.trj -do plk2_168_decomp.dat"
> >> echo $SUB
> >> $SUB
> >>
> >>
> >> ----------------------------------------------------------------------------
> >>
> >> The command finished with an error; the log reads:
> >>
> >>
> >>
> >> ----------------------------------------------------------------------------
> >> mpirun -np 4 /share/apps/python27/bin/python2.7
> >> /share/apps/amber11/bin/MMPBSA.py.MPI -O -i decomp_lig.in -o
> >> decomp_lig.dat -sp ../Analysis/plk2_168_com.top -cp
> >> ../Analysis/plk2_168_com.top -rp ../Analysis/plk2_168_rec.top -lp
> >> ../Analysis/plk2_168_lig.top -y ../Analysis/plk2_168_dck_now.trj -do
> >> plk2_168_decomp.dat
> >> MMPBSA.py.MPI being run on 4 processors
> >> ptraj found! Using /share/apps/amber11/exe/ptraj
> >> sander found! Using /share/apps/amber11/exe/sander
> >>
> >> Preparing trajectories with ptraj...
> >> 100 frames were read in and processed by ptraj for use in calculation.
> >>
> >> Starting calculations
> >>
> >> Starting gb calculation...
> >>
> >> calculating ligand contribution...
> >> calculating receptor contribution...
> >> calculating complex contribution...
> >> Starting pb calculation...
> >>
> >> calculating ligand contribution...
> >> calculating receptor contribution...
> >> calculating complex contribution...
> >>
> >> Calculations complete. Writing output file(s)...
> >> Traceback (most recent call last):
> >>   File "/share/apps/amber11/bin/MMPBSA.py.MPI", line 1571, in <module>
> >>     decompout, idecomp, dec_verbose, ligstart, decomprun, surften,
> >>     cavity_surften, temp)
> >>   File "/share/apps/amber11/bin/utils.py", line 4647, in PrintFinalResults
> >>     '',finaloutput,debug,numframes,sander_apbs,one_trajectory,verbose)
> >>   File "/share/apps/amber11/bin/utils.py", line 2248, in pboutput
> >>     bonddif[x] = bonddif[x] - bond[x]
> >> IndexError: list index out of range
> >>
> >>
> >> ----------------------------------------------------------------------------
> >>
> >> Therefore, only the GB results are printed in the output file, and no PB
> >> results.
> >>
> >> Strangely, I have another system with a point mutation, and it runs
> >> through MMPBSA.py without error.
> >>
> >> I checked the versions of all packages, such as AmberTools 1.5, and they
> >> were all up to date.
> >>
> >> Could you give me any advice on where I should look to solve this problem?
> >>
> >> Thanks,
> >>
> >> Seungyeul Yoo
> >>
> >>
> >
> >
> >
> > --
> > Jason M. Swails
> > Quantum Theory Project,
> > University of Florida
> > Ph.D. Candidate
> > 352-392-4032
>
>
>
--
Bill Miller III
Quantum Theory Project,
University of Florida
Ph.D. Graduate Student
352-392-6715
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber