[AMBER] Multiple trajectory MMPBSA

From: Yang Wei via AMBER <amber.ambermd.org>
Date: Fri, 15 Mar 2024 16:37:06 -0400

Hi,

I am running multiple-trajectory MMPBSA.py.MPI calculations for several
different systems; for one of them I got the error below (a sketch of my
input file follows the pasted output):

Running calculations on normal system...

Beginning GB calculations with /usr/public/amber/22/bin/mmpbsa_py_energy
  calculating complex contribution...
  calculating receptor contribution...
  calculating ligand contribution...

Beginning PB calculations with /usr/public/amber/22/bin/mmpbsa_py_energy
  calculating complex contribution...
  calculating receptor contribution...
  calculating ligand contribution...

Beginning quasi-harmonic calculations with /usr/public/amber/22/bin/cpptraj
Internal Error: from dspev: The algorithm failed to converge.
395 off-diagonal elements of an intermediate tridiagonal form
did not converge to zero.
Error: In Analysis [matrix]
Internal Error: from dspev: The algorithm failed to converge.
6140 off-diagonal elements of an intermediate tridiagonal form
did not converge to zero.
Error: In Analysis [matrix]
Error: Error(s) occurred during execution.
  File "/usr/public/amber/22//bin/MMPBSA.py.MPI", line 100, in <module>
    app.run_mmpbsa()
  File
"/usr/public/amber/22/lib/python3.11/site-packages/MMPBSA_mods/main.py",
line 224, in run_mmpbsa
    self.calc_list.run(rank, self.stdout)
  File
"/usr/public/amber/22/lib/python3.11/site-packages/MMPBSA_mods/calculation.py",
line 82, in run
    calc.run(rank, stdout=stdout, stderr=stderr)
  File
"/usr/public/amber/22/lib/python3.11/site-packages/MMPBSA_mods/calculation.py",
line 419, in run
    Calculation.run(self, rank, stdout=self.output)
  File
"/usr/public/amber/22/lib/python3.11/site-packages/MMPBSA_mods/calculation.py",
line 156, in run
    raise CalcError('%s failed with prmtop %s!' % (self.program,
CalcError: /usr/public/amber/22/bin/cpptraj failed with prmtop
SMARCA2_Degrader_dry.prmtop!
Error occurred on rank 3.

Fatal Error!
All files have been retained for your error investigation:
You should begin by examining the output files of the first failed
calculation.
Consult the "Temporary Files" subsection of the MMPBSA.py chapter in the
manual for file naming conventions.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Job Finished
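
For reference, the relevant part of my MMPBSA.py input looks roughly like
the sketch below. The frame range and salt settings shown here are
placeholders rather than my exact values; entropy=1 in &general is the
setting that requests the quasi-harmonic calculation.

&general
   startframe=1, endframe=500, interval=1,
   entropy=1,
/
&gb
   igb=5, saltcon=0.150,
/
&pb
   istrng=0.150,
/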


The GB and PB calculations completed without any problems, but the run
failed during the quasi-harmonic entropy calculation in cpptraj. Any
insights or suggestions you can provide would be greatly appreciated.
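
In case it helps, my understanding is that the quasi-harmonic step amounts
to cpptraj building and diagonalizing a mass-weighted covariance matrix,
roughly along the lines of the sketch below (the trajectory name is a
placeholder for the retained intermediate file, and the exact script
MMPBSA.py writes may differ):

parm SMARCA2_Degrader_dry.prmtop
trajin complex_frames.nc      # placeholder for the retained complex trajectory
matrix mwcovar name qh_mat
diagmatrix qh_mat thermo
run

If it is useful, I can rerun something like this by hand on the retained
files to check whether the dspev convergence failure reproduces outside
MMPBSA.py.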

Best,

Yang
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber