Re: [AMBER] huge memory consumption when running nab simulations

From: Charo del Genio via AMBER <>
Date: Tue, 6 Jun 2023 17:42:02 +0100

On 06/06/2023 13:32, Dominik Brandstetter via AMBER wrote:
> Dear Amber community,
> I am new to NAB, and I am running some implicit solvent simulations with it on a cluster having 2x 64-core nodes and with ~940 GiB of RAM memory per node.
> I am using the and files attached to start my simulations, which are nicely parallelized and run fine, but I notice a huge memory consumption, which often leads to a failure of the simulations, as you can see in the following error message for the run:
> [...]
> The system I am trying to simulate has 3696 atoms. Do you think this high memory consumption is normal for a system of this size? Or is there a way I can modify, e.g., my or file to reduce the RAM usage and have my run complete successfully?
> Thanks in advance.
> Best regards,
> Dominik

Hi Dominik,
        what version of Amber/AmberTools are you using? If I remember correctly, there was a memory leak in NAB, caused by the memory allocated for vectors and matrices not being properly freed, but that was something like five years ago or more...



Dr. Charo I. del Genio
Senior Lecturer in Statistical Physics
Applied Mathematics Research Centre (AMRC)
Design Hub
Coventry University Technology Park
Coventry CV1 5FB
AMBER mailing list
Received on Tue Jun 06 2023 - 10:00:02 PDT