On 06/06/2023 13:32, Dominik Brandstetter via AMBER wrote:
> Dear Amber community,
> I am new to NAB, and I am running some implicit solvent simulations with it on a cluster with 2x 64-core nodes and ~940 GiB of RAM per node.
> I start my simulations with the attached sim.nab and submit.sh files. They parallelize nicely and run fine, but I notice very high memory consumption, which often causes the simulations to fail, as you can see in the following error message for the run:
> [...]
> The system I am trying to simulate has 3696 atoms. Do you think this high memory consumption is normal for a system of this size? Or is there a way I can modify, e.g., my sim.nab or submit.sh files to reduce the RAM usage so that my run completes successfully?
> Thanks in advance.
> Best regards,
> Dominik
Hi Dominik,
What version of Amber/AmberTools are you using? If I remember correctly, NAB used to have a memory leak, because the memory allocated for vectors and matrices was not properly freed, but that was something like five years ago or more...
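In the meantime, one quick check that is independent of the Amber version: sample the resident set size of the running NAB process over time. A steadily climbing VmRSS during the run points to a leak, while a large but flat value is just the working set. A minimal sketch (Linux only; the PID here is a placeholder for your actual simulation process):

```shell
#!/bin/sh
# Log the resident set size (VmRSS) of a running process once a minute.
# Replace 12345 with the PID of your NAB simulation (hypothetical value).
pid=12345

# kill -0 only tests that the process still exists; it sends no signal.
while kill -0 "$pid" 2>/dev/null; do
    # /proc/<pid>/status reports VmRSS in kB on Linux.
    grep VmRSS "/proc/$pid/status"
    sleep 60
done
```

If the logged values grow without bound, that would be consistent with the old leak, and upgrading to a current AmberTools would be the first thing to try.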
Cheers,
Charo
--
Dr. Charo I. del Genio
Senior Lecturer in Statistical Physics
Applied Mathematics Research Centre (AMRC)
Design Hub
Coventry University Technology Park
Coventry CV1 5FB
UK
https://charodelgenio.weebly.com
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Tue Jun 06 2023 - 10:00:02 PDT