Re: [AMBER] MMPBSA Error (Topology I think)

From: Ray Luo via AMBER <amber.ambermd.org>
Date: Sat, 18 Jan 2025 14:36:01 -0800

Vishwaa,

Can you share a full mdout file for one of the failed snapshots?

My gut feeling is that you are using the default fillratio of 4 for a very
large protein dimer, so the memory usage exceeds the physical limit on your
compute nodes.
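
If memory is the culprit, you could try a smaller ratio in the &pb section
of your mmpbsa.in, along these lines (a sketch; 2.0 is only a trial value
to test with, not a general recommendation):

&pb
  istrng=0.100,
  fillratio=2.0,
/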

All the best,
Ray
--
Ray Luo, Ph.D.
Professor of Structural Biology/Biochemistry/Biophysics,
Chemical and Materials Physics, Chemical and Biomolecular Engineering,
Biomedical Engineering, and Materials Science and Engineering
Department of Molecular Biology and Biochemistry
University of California, Irvine, CA 92697-3900
On Sat, Jan 18, 2025 at 1:34 PM Vishwaa Kannan via AMBER <amber.ambermd.org>
wrote:
> My MMPBSA run is failing. After narrowing it down, I suspect the problem
> is that the protein is counted as two separate molecules (probably an
> artifact of it being a dimer), with the ligand recognized as a third,
> which could be causing the errors. I found this with cpptraj's topology
> listing: molecule 1: residues 1-313, molecule 2: residues 314-627,
> molecule 3 (PFOA): residue 628. I tried stating the masks explicitly,
> but that did not work. What can I do to fix this error?
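>
> For reference, the topology check was along these lines in interactive
> cpptraj (a sketch from memory; molinfo lists the molecules in the loaded
> topology):
>
>   $ cpptraj -p R1.prmtop
>   > molinfo *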
>
> Input file (mmpbsa.in):
> &general
>   startframe=601, endframe=2000,
>   strip_mask=":WAT,Na+",
>   receptor_mask=":1-627",
>   ligand_mask=":628",
>   keep_files=2,
> /
> &pb
>   istrng=0.100,
> /
> &decomp
>   idecomp=1,
> /
>
> Terminal:
> vishwaa.BCC-A102281:~/amber/TS/R1$ mpirun -np 20 MMPBSA.py.MPI -O -i mmpbsa.in -o FINAL_RESULTS_MMPBSA.dat -sp R1_solvated.prmtop -cp R1.prmtop -rp R1_rec.prmtop -lp R1_lig.prmtop -y prod.mdcrd
> Loading and checking parameter files for compatibility...
> cpptraj found! Using /home/vishwaa/amber/amber24/bin/cpptraj
> sander found! Using /home/vishwaa/amber/amber24/bin/sander
> Preparing trajectories for simulation...
> 1400 frames were processed by cpptraj for use in calculation.
>
> Running calculations on normal system...
>
> Beginning PB calculations with /home/vishwaa/amber/amber24/bin/sander
>   calculating complex contribution...
>   File "/home/vishwaa/amber/amber24/bin/MMPBSA.py.MPI", line 101, in
> <module>
>     app.run_mmpbsa() # for membrane runs by automatically setting the
> membrane thickness and location.
>     ^^^^^^^^^^^^^^^^
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/main.py",
> line 225, in run_mmpbsa
>     self.calc_list.run(rank, self.stdout)
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/calculation.py",
> line 82, in run
>     calc.run(rank, stdout=stdout, stderr=stderr)
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/calculation.py",
> line 476, in run
>     raise CalcError('%s failed with prmtop %s!\n\t' % (self.program,
> CalcError: /home/vishwaa/amber/amber24/bin/sander failed with prmtop
> R1.prmtop!
>
>
> Error occurred on rank 6.
>
> Fatal Error!
> All files have been retained for your error investigation:
> You should begin by examining the output files of the first failed
> calculation.
> Consult the "Temporary Files" subsection of the MMPBSA.py chapter in the
> manual for file naming conventions.
>   File "/home/vishwaa/amber/amber24/bin/MMPBSA.py.MPI", line 101, in
> <module>
>     app.run_mmpbsa() # for membrane runs by automatically setting the
> membrane thickness and location.
>     ^^^^^^^^^^^^^^^^
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/main.py",
> line 225, in run_mmpbsa
>     self.calc_list.run(rank, self.stdout)
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/calculation.py",
> line 82, in run
>     calc.run(rank, stdout=stdout, stderr=stderr)
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/calculation.py",
> line 476, in run
>     raise CalcError('%s failed with prmtop %s!\n\t' % (self.program,
> CalcError: /home/vishwaa/amber/amber24/bin/sander failed with prmtop
> R1.prmtop!
>
>
> Error occurred on rank 11.
>
> Fatal Error!
> All files have been retained for your error investigation:
> You should begin by examining the output files of the first failed
> calculation.
> Consult the "Temporary Files" subsection of the MMPBSA.py chapter in the
> manual for file naming conventions.
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 6 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>   File "/home/vishwaa/amber/amber24/bin/MMPBSA.py.MPI", line 101, in
> <module>
>     app.run_mmpbsa() # for membrane runs by automatically setting the
> membrane thickness and location.
>     ^^^^^^^^^^^^^^^^
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/main.py",
> line 225, in run_mmpbsa
>     self.calc_list.run(rank, self.stdout)
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/calculation.py",
> line 82, in run
>     calc.run(rank, stdout=stdout, stderr=stderr)
>   File
> "/home/vishwaa/amber/amber24/lib/python3.12/site-packages/MMPBSA_mods/calculation.py",
> line 476, in run
>     raise CalcError('%s failed with prmtop %s!\n\t' % (self.program,
> [BCC-A102281:10455] 1 more process has sent help message help-mpi-api.txt / mpi-abort
> [BCC-A102281:10455] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
> vishwaa.BCC-A102281:~/amber/TS/R1$
>
> Output of rank 6 ("MMPBSA_complex_pb.mdout.6"):
>           -------------------------------------------------------
>           Amber 24 SANDER                              2024
>           -------------------------------------------------------
>
> | Run on 01/18/2025 at 16:10:12
>
> |   Executable path: /home/vishwaa/amber/amber24/bin/sander
> | Working directory: /home/vishwaa/amber/TS/R1
> |          Hostname: Unknown
>   [-O]verwriting output
>
> File Assignments:
> |  MDIN: _MMPBSA_pb_decomp_com.mdin
> | MDOUT: _MMPBSA_complex_pb.mdout.6
> |INPCRD: _MMPBSA_dummycomplex.inpcrd
> |  PARM: R1.prmtop
> |RESTRT: _MMPBSA_restrt.6
> |  REFC: refc
> | MDVEL: mdvel
> | MDFRC: mdfrc
> |  MDEN: mden
> | MDCRD: mdcrd
> |MDINFO: mdinfo
> |  MTMD: mtmd
> |INPDIP: inpdip
> |RSTDIP: rstdip
> |INPTRA: _MMPBSA_complex.mdcrd.6
>
>  Here is the input file:
>
> File generated by MMPBSA.py
> &cntrl
>  ntb=0, nsnb=99999, cut=999.0, imin=5,
>  igb=10, idecomp=1, ipb=2,
>  dec_verbose=0,
> /
> &pb
>  istrng=100.0, radiopt=0, maxitn=1000,
>  fillratio=4.0,
> /
> Residues considered as REC
> RRES 1 627
> END
> Residues considered as LIG
> LRES 628 628
> END
> Residues to print
> RES 1 628
> END
> END
>
>
>
> --------------------------------------------------------------------------------
>    1.  RESOURCE   USE:
> --------------------------------------------------------------------------------
>
> | Flags:
>
> | New format PARM file being parsed.
> | Version =    1.000 Date = 01/17/25 Time = 01:12:41
>  NATOM  =   10051 NTYPES =      19 NBONH =    4986 MBONA  =    5196
>  NTHETH =   11380 MTHETA =    7050 NPHIH =   23035 MPHIA  =   22015
>  NHPARM =       0 NPARM  =       0 NNB   =   55658 NRES   =     628
>  NBONA  =    5196 NTHETA =    7050 NPHIA =   22015 NUMBND =      74
>  NUMANG =     174 NPTRA  =     199 NATYP =      38 NPHB   =       0
>  IFBOX  =       0 NMXRS  =      25 IFCAP =       0 NEXTRA =       0
>  NCOPY  =       0
>
>  Implicit solvent radii are H(N)-modified Bondi radii (mbondi2)
>
> | CMAP information read from topology file:
>
> |     Memory Use     Allocated
> |     Real             1557822
> |     Hollerith          30783
> |     Integer           583328
> |     Max Pairs              1
> |     nblistReal             0
> |     nblist Int             0
> |       Total            14569 kbytes
>
> | Note: 1-4 EEL scale factors are being read from the topology file.
>
> | Note: 1-4 VDW scale factors are being read from the topology file.
> | Duplicated    0 dihedrals
> | Duplicated    0 dihedrals
> |CMAP: Reticulating splines.
>
>
> --------------------------------------------------------------------------------
>    2.  CONTROL  DATA  FOR  THE  RUN
> --------------------------------------------------------------------------------
>
> default_name
>
>
> General flags:
>      imin    =       5, nmropt  =       0
>
> Nature and format of input:
>      ntx     =       1, irest   =       0, ntrx    =       1
>
> Nature and format of output:
>      ntxo    =       2, ntpr    =      50, ntrx    =       1, ntwr    =       1
>      iwrap   =       0, ntwx    =       0, ntwv    =       0, ntwe    =       0
>      ioutfm  =       1, ntwprt  =       0, idecomp =       1, rbornstat=      0
>
> Potential function:
>      ntf     =       1, ntb     =       0, igb     =      10, nsnb    =  99999
>      ipol    =       0, gbsa    =       0, iesp    =       0
>      dielc   =   1.00000, cut     = 999.00000, intdiel =   1.00000
>
> Frozen or restrained atoms:
>      ibelly  =       0, ntr     =       0
>
> Energy minimization:
>      maxcyc  =       1, ncyc    =      10, ntmin   =       1
>      dx0     =   0.01000, drms    =   0.00010
>
>     LOADING THE DECOMP ATOMS AS GROUPS
>
>     ----- READING GROUP     1; TITLE:
>  Residues considered as REC
>
>       Number of atoms in this group  =     0
>     ----- READING GROUP     2; TITLE:
>  Residues considered as LIG
>
>       Number of atoms in this group  =     0
>     ----- READING GROUP     3; TITLE:
>  Residues to print
>
>  GRP    3 RES    1 TO   628
>       Number of atoms in this group  = 10051
>     ----- END OF GROUP READ -----
> There was more output after this, but it is just the per-atom listing, so I cut it here.
> --
> Best,
> Vishwaa Kannan
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Sat Jan 18 2025 - 15:00:03 PST