[AMBER] MMPBSA.py.MPI query

From: amit dong <dongamit123.gmail.com>
Date: Mon, 10 Jan 2011 16:30:55 -0600

Hello,

I am using MMPBSA.py.MPI to compute the per-residue decomposition of deltaG
for a ligand-protein complex.
I have two questions:

1. Is there an upper limit to the number of residues for which data can be
printed? I have pasted the input script below, but the results show data
only up to residue #189.

The input script is:

Per-residue GB and PB decomposition
&general
   interval=1, endframe=4, verbose=1,
/
&decomp
  idecomp=3, dec_verbose=1,
  print_res="32; 35-39; 111-113; 142; 151-152; 155-157; 189; 205; 207-217;
220; 244-250; 274; 276; 303-306; 635"
/
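
As a sanity check, something like the sketch below can list which residue
numbers actually appear in the decomposition output. The file name and the
row pattern are assumptions (adjust them to whatever the -do flag, or its
default, wrote):

# Hypothetical check: print the highest residue numbers found in the
# per-residue decomposition table (file name and row format assumed).
grep -E '^[A-Z]{3} +[0-9]+' FINAL_DECOMP_MMPBSA.dat \
    | awk '{print $2}' | sort -n | uniq | tail -5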

2. Is there any way to confirm that the snapshots are actually distributed
to different nodes? Even though I submitted the job (4 snapshots) to 2
nodes, the log file says it was submitted to 1 processor. I am copying the
log file and the submission script below.

Log file:

MMPBSA.py.MPI being run on 1 processors
ptraj found! Using /amber11/exe/ptraj
sander found! Using /amber11/exe/sander

Preparing trajectories with ptraj...
MMPBSA.py.MPI being run on 1 processors
ptraj found! Using /amber11/exe/ptraj
sander found! Using /amber11/exe/sander

Preparing trajectories with ptraj...
MMPBSA.py.MPI being run on 1 processors
ptraj found! Using /amber11/exe/ptraj
sander found! Using /amber11/exe/sander

Preparing trajectories with ptraj...
MMPBSA.py.MPI being run on 1 processors
ptraj found! Using /amber11/exe/ptraj
sander found! Using /amber11/exe/sander

Preparing trajectories with ptraj...
MMPBSA.py.MPI being run on 1 processors
ptraj found! Using /amber11/exe/ptraj
sander found! Using /amber11/exe/sander

Preparing trajectories with ptraj...
4 frames were read in and processed by ptraj for use in calculation.

Starting calculations

Starting gb calculation...

  calculating ligand contribution...
  calculating receptor contribution...
MMPBSA.py.MPI being run on 1 processors
ptraj found! Using /amber11/exe/ptraj
sander found! Using /amber11/exe/sander

Preparing trajectories with ptraj...
MMPBSA.py.MPI being run on 1 processors
ptraj found! Using /amber11/exe/ptraj
sander found! Using /amber11/exe/sander

Preparing trajectories with ptraj...
4 frames were read in and processed by ptraj for use in calculation.

Starting calculations

Starting gb calculation...

  calculating ligand contribution...
  calculating receptor contribution...
  calculating complex contribution...
  calculating complex contribution...

Calculations complete. Writing output file(s)...
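
The repeated "MMPBSA.py.MPI being run on 1 processors" banner suggests that
each process started with an MPI communicator of size 1, i.e. the mpiexec
used to launch the job may not match the MPI library that mpi4py (which
MMPBSA.py.MPI relies on) was built against. A quick launcher test,
independent of MMPBSA, assuming python and mpi4py come from the PYTHONPATH
set below, with -n 8 matching nodes=2:ppn=4:

mpiexec -n 8 python -c "from mpi4py import MPI; \
print 'rank %d of %d on %s' % (MPI.COMM_WORLD.Get_rank(), MPI.COMM_WORLD.Get_size(), MPI.Get_processor_name())"

If every line prints "rank 0 of 1", the processes are running serially and a
launcher/library mismatch is the likely cause.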


Submission script:

#!/bin/sh
#PBS -l walltime=24:00:00
#PBS -l nodes=2:ppn=4
#PBS -V
#PBS -N decomp.py

export AMBERHOME=/amber11

export WORK_DIR=/home/abc
export PYTHONPATH=/home/abc/lib/python2.7/site-packages:$PYTHONPATH

cd $WORK_DIR
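# NPROCS = number of lines in $PBS_NODEFILE (one per allocated core; 2x4 = 8)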
export NPROCS=`wc -l $PBS_NODEFILE |gawk '//{print $1}'`
echo $NPROCS
mpdboot -n 5 -f $HOME/mpd.hosts -v
mpdtrace -l
mpiexec -n $NPROCS $AMBERHOME/bin/MMPBSA.py.MPI -O -i input.in \
    -o FINAL_RESULTS_MMPBSA.dat -sp complex_wat.prmtop -cp complex.prmtop \
    -rp receptor.prmtop -lp lig.prmtop -y md.x.gz > output.log
mpdallexit
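
If the rank check above also reports size 1 everywhere, the usual fix is to
launch with the mpiexec that ships with the same MPI installation mpi4py
was compiled against. A sketch with a hypothetical path (substitute your
own MPI's bin directory):

/opt/mpich2/bin/mpiexec -n $NPROCS $AMBERHOME/bin/MMPBSA.py.MPI -O -i input.in \
    -o FINAL_RESULTS_MMPBSA.dat -sp complex_wat.prmtop -cp complex.prmtop \
    -rp receptor.prmtop -lp lig.prmtop -y md.x.gz > output.log

Note also that nodes=2:ppn=4 allocates 8 slots (so $NPROCS is 8), while
mpdboot -n 5 asks for five mpd daemons against a two-host mpd.hosts;
keeping these counts consistent removes one more variable.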


Thanks!!
AD
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber