Thanks, Tyler and Jason, for your replies.
I've followed your advice. First I ran with a standard prmtop and the
problem persisted; the parallel and serial MMPBSA.py scripts gave the
same error. The last lines of _MMPBSA_complex_rism.mdout.0 do not show
anything abnormal. Finally I checked the memory usage, and that was
indeed the origin of the problem: polardecomp=1 requires a significant
amount of memory. So, problem solved!
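For anyone hitting the same wall, here is a minimal sketch for watching the
resident memory of the 3D-RISM step while MMPBSA.py runs (the process name
and the polling interval are assumptions; adapt them to your system):

------------------------------------------
# Print the resident set size (in KB) of rism3d.snglpnt every 5 seconds
# for as long as the process is alive.
while pid=$(pgrep -n rism3d.snglpnt); do
    ps -o rss= -p "$pid"
    sleep 5
done
------------------------------------------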
Thank you very much,
Campa,
2015-10-22 21:33 GMT+02:00 Josep Maria Campanera Alsina <campaxic.gmail.com>:
> Dear all,
> I get the following error when trying to decompose (polardecomp=1) the
> solvation free energy from 3D-RISM in MMPBSA.py. The calculation actually
> completes a couple of frames and then crashes.
>
>
> ----------------------------------------------------------------------------------------
> Beginning 3D-RISM calculations with
> /Users/campa/Amber/amber14/bin/rism3d.snglpnt
> calculating complex contribution...
> File "/Users/campa/Amber/amber14/bin/MMPBSA.py.MPI", line 96, in <module>
> app.run_mmpbsa()
> File "/Users/campa/Amber/amber14/bin/MMPBSA_mods/main.py", line 218, in
> run_mmpbsa
> self.calc_list.run(rank, self.stdout)
> File "/Users/campa/Amber/amber14/bin/MMPBSA_mods/calculation.py", line
> 79, in run
> calc.run(rank, stdout=stdout, stderr=stderr)
> File "/Users/campa/Amber/amber14/bin/MMPBSA_mods/calculation.py", line
> 269, in run
> Calculation.run(self, rank, stdout=self.output % rank)
> File "/Users/campa/Amber/amber14/bin/MMPBSA_mods/calculation.py", line
> 148, in run
> self.prmtop))
> CalcError: /Users/campa/Amber/amber14/bin/rism3d.snglpnt failed with
> prmtop AB40.WT-WT.nowat.ildn.pdb.charmm.pdb.psfgen.pdb.amber.top!
> Error occured on rank 0.
> Exiting. All files have been retained.
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>
> ----------------------------------------------------------------------------------------
>
> with the following simple input:
>
> ------------------------------------------
> &general
> startframe=1,endframe=5000,interval=500,verbose=2, keep_files=2, netcdf=1,
> full_traj=1, use_sander=0,
> /
> &rism
> thermo="gf",polardecomp=1,rism_verbose=2,
> /
> ------------------------------------------
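>
> For reference, the run is launched along these lines (a hedged sketch;
> the file names below are placeholders, not my actual files):
>
> ------------------------------------------
> MMPBSA.py -O -i mmpbsa.in -o results.dat \
>           -cp complex.prmtop -rp receptor.prmtop -lp ligand.prmtop \
>           -y prod.nc
> ------------------------------------------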
>
> The error appears in both serial and parallel MMPBSA.py calculations in
> AMBER14. I use the ff99SB-ILDN force field; the topology file was
> originally created with chamber and then converted to a "normal" Amber
> topology file with the cpptraj option "parmwrite out ... nochamber". The
> same calculation without the option "polardecomp=1" finishes correctly.
> As it seems to be a problem related to the radii settings (CalcError),
> I've tried changing the radii with ParmEd, but the problem persists.
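>
> In case it helps, the conversion and the radii change were done along
> these lines (a sketch with placeholder file names; mbondi2 is only an
> example radii set):
>
> ------------------------------------------
> # Strip the CHAMBER information from the chamber-built topology:
> cpptraj <<EOF
> parm complex.chamber.prmtop
> parmwrite out complex.amber.prmtop nochamber
> EOF
>
> # Change the intrinsic radii set with ParmEd:
> parmed complex.amber.prmtop <<EOF
> changeRadii mbondi2
> parmout complex.newradii.prmtop
> go
> EOF
> ------------------------------------------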
>
> All comments are welcome, and thank you very much for the help,
>
> Campa,
>
--
---------------------------------------------
Josep Maria Campanera Alsina
Interim Associate Professor
Departament de Fisicoquímica
Facultat de Farmàcia
Universitat de Barcelona
Avgda Joan XXIII, s/n
08028 Barcelona
Tel: +34 93 4035985
Fax: +34 93 4035987
campanera.ub.edu
http://campanerablog.wordpress.com
http://science.lsi.upc.edu
--------------------------------------------