[AMBER] Regarding the MPI error when running 3D-RISM in parallel

From: PRITI ROY <priitii.roy.gmail.com>
Date: Thu, 28 Jun 2018 17:05:13 +0530

Dear All,

I tried to run 3D-RISM in parallel mode on our cluster, and it shows the
following error:
######################################################################

|rism_polarExcessCharge_dT [e]  Total  polar_ExChg_dT_O  polar_ExChg_dT_H1
|rism_apolarExcessCharge_dT [e]  Total  apolar_ExChg_dT_O  apolar_ExChg_dT_H1
|rism_polarKirkwoodBuff_dT [A^3]  polar_KB_dT_O  polar_KB_dT_H1
|rism_apolarKirkwoodBuff_dT [A^3]  apolar_KB_dT_O  apolar_KB_dT_H1
|rism_polarDCFintegral_dT [A^3]  polar_DCFI_dT_O  polar_DCFI_dT_H1
|rism_apolarDCFintegral_dT [A^3]  apolar_DCFI_dT_O  apolar_DCFI_dT_H1

Processing NetCDF trajectory: dimer_5step_from51ns_cn1.nc

Frame: 1 of 5
[compute-n1:110791] *** An error occurred in MPI_Comm_rank
[compute-n1:110791] *** reported by process [588447745,23]
[compute-n1:110791] *** on communicator MPI_COMM_WORLD
[compute-n1:110791] *** MPI_ERR_COMM: invalid communicator
[compute-n1:110791] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[compute-n1:110791] *** and potentially your MPI job)
#########################################################################

If I am not mistaken, and going by previous AMBER mailing list threads on this
issue, the error comes from an MPI setup problem. However, on the same cluster
I am able to run NAMD and GROMACS in parallel mode without any problem.
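
To check whether the mpirun picked up inside the job matches the MPI that
rism3d.snglpnt.MPI was built against, I can add a quick check to the job
script (a minimal sketch; $AMBERHOME here is just my assumption for where
our AmberTools installation lives):
#########################################################################
# Which MPI launcher does the job environment actually find?
which mpirun
mpirun --version

# Which MPI libraries is the parallel 3D-RISM binary linked against?
ldd $AMBERHOME/bin/rism3d.snglpnt.MPI | grep -i mpi
#########################################################################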

My submission file is as follows:
#########################################################################
#!/bin/bash
#PBS -N test_rism
#PBS -l nodes=1:ppn=24
##PBS -q cpu
## (default queue)
#PBS -e error33.file
#PBS -o output33.file
#PBS -r n
##PBS -V
echo PBS JOB id is $PBS_JOBID

echo PBS_NODEFILE is $PBS_NODEFILE

echo PBS_QUEUE is $PBS_QUEUE

cd $PBS_O_WORKDIR

cat $PBS_NODEFILE > pbsnodes

export DO_PARALLEL='mpirun -np 24'
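# NOTE: DO_PARALLEL is the variable AMBER's test suite uses to launch parallel
# binaries; the rism3d command below calls mpirun directly, so I assume this
# export is not actually used here.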
#####
#cd $TMPDIR
#####
##
## Edit this line to reflect your directory.

cd /tmp
mkdir cn1_solv
cd cn1_solv
cp ../1.nc .
cp ../1.pdb .
cp ../par22.prmtop .
cp ../1drism_tp3.xvv .

output=ds1_test3.r3d

nohup mpirun -np 24 -machinefile $PBS_NODEFILE ../rism3d.snglpnt.MPI \
               --pdb 1.pdb \
               --prmtop par22.prmtop \
               -y 1.nc \
               --xvv 1drism_tp3.xvv --closure kh --tolerance 0.000001 \
               --ng 120,90,90 \
               --solvbox 117.16,75.125,88.29 --centering -1 \
               --polarDecomp \
               --entropicDecomp \
> $output

mv *.* ../../
###########################################################
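
The part I am least sure about is the file staging: after cd /tmp, the relative
cp ../... paths point inside /tmp rather than to the directory the job was
submitted from. Below is a minimal sketch of the staging I intend (assuming the
inputs really sit in $PBS_O_WORKDIR and only the .r3d output needs to come back):
###########################################################
# Hypothetical staging sketch -- paths are my assumptions, not tested.
scratch=/tmp/cn1_solv_$PBS_JOBID
mkdir -p $scratch
cd $scratch

# Copy the inputs from the submission directory rather than from /tmp.
cp $PBS_O_WORKDIR/1.nc .
cp $PBS_O_WORKDIR/1.pdb .
cp $PBS_O_WORKDIR/par22.prmtop .
cp $PBS_O_WORKDIR/1drism_tp3.xvv .

# ... run rism3d.snglpnt.MPI exactly as above ...

# Copy the result back to the submission directory when the run finishes.
cp ds1_test3.r3d $PBS_O_WORKDIR/
###########################################################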
Is my submission file correct or not?

Looking forward to your response.

Thanks,
Priti
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber