#!/bin/bash
######################################################################################
# MIO SUBMISSION SCRIPT :: SCR = /tmp
######################################################################################
#PBS -l walltime=120:00:00,nodes=4:ppn=8:compute
#PBS -N 25_FA+TOL_mmpbsa
#PBS -e stderr
#PBS -o stdout
# Added this line: -V exports all environment variables from the submitting shell
# into the batch shell.
#PBS -V

# Echo each command as it runs (bash equivalent of csh's "set echo")
set -x

######################################################################################
# Set environment variables
######################################################################################
export WORKDIR=/panfs/storage/scratch/vbharadw/RUN2/mmpbsa/dielec25
# Added this line so that I did not overwrite your directory
#export WORKDIR=$PBS_O_WORKDIR

# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# Create scratch directory & copy data files there & execute & copy back when done
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
cd "$WORKDIR" || exit 1

# Record the allocated nodes and count the processors (nodes=4:ppn=8 -> 32)
cat "$PBS_NODEFILE" > "$WORKDIR/NODELIST"
NPRCS=$(wc -l < "$WORKDIR/NODELIST")
echo "Number of processors: $NPRCS"

success=0

# mpiexec should be used instead of mpirun

# TI job
#mpiexec -np 8 sander.MPI -ng -groupfile TIgroupfile && success=1

# Standard MM-PBSA job (trajectories passed to -y are space-separated)
mpiexec -np 32 MMPBSA.py.MPI -O -i mmpbsa.in \
    -sp fa.tol.bssa.top \
    -cp complex.prmtop \
    -rp receptor.prmtop \
    -lp ligand.prmtop \
    -y ../../nvt4/fa.tol.bssa.mdcrd \
       ../../nvt5/fa.tol.bssa.mdcrd \
       ../../nvt6/fa.tol.bssa.mdcrd \
       ../../nvt7/fa.tol.bssa.mdcrd \
       ../../nvt8/fa.tol.bssa.mdcrd \
    > progress.log

# This is for a serial AMBER 11 job
#sander -O -i md.in -o md.out -p bssa_model1_solv.top -c bssa_model1_solv.rst \
#    -r bssa_model1_solv_md_cut12.rst -x bssa_model1_solv_md_cut12.mdcrd2

# Stop echoing commands (bash equivalent of csh's "unset echo")
set +x