Re: [AMBER] estimate memory requirements for

From: Vlad Cojocaru <>
Date: Tue, 12 Nov 2013 09:17:44 +0100

Thanks a lot Jason for this detailed explanation ...

It seems quite complicated ... Are the numbers you mention at the end
per core or per job?

I am asking because I am trying to run a job on a cluster on 128 cores,
and my job exited after trying to use 623 GB of memory (only 4 GB/core =
512 GB available) ... It's true that it is a job with non-linear Poisson
Boltzmann and 0.25 grid spacing, on a system that is not large but not
compact either, so the grid would be quite big ... But 623 GB still
sounds quite outrageous ... So, I am not sure what actually happens ...
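A rough back-of-envelope sketch of where such a number could come from. This is not AMBER's actual allocation scheme; the box size (100 Å) and the number of double-precision grid arrays (6) are illustrative assumptions:

```python
def pb_grid_memory_gb(box_angstrom, spacing=0.25, arrays=6):
    """Rough memory estimate for `arrays` double-precision 3D grids
    spanning a cubic box (illustrative only, not AMBER's internals)."""
    points_per_dim = int(box_angstrom / spacing) + 1
    total_points = points_per_dim ** 3
    return total_points * 8 * arrays / 1024 ** 3  # 8 bytes per double

# A hypothetical 100 A cubic box at 0.25 A spacing:
mem_per_copy = pb_grid_memory_gb(100.0)  # a few GB per grid copy
```

Note that if each of the 128 MPI ranks allocated its own copy of a grid of this size, the aggregate would run into the hundreds of GB, which is one plausible way to reach a 623 GB footprint.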

Best wishes

On 11/11/2013 05:50 PM, Jason Swails wrote:
> On Mon, 2013-11-11 at 15:52 +0100, Vlad Cojocaru wrote:
>> Dear all,
>> Is there a way to a priori estimate the memory requirements for a
>> job given the number of atoms in the system and the number
>> of frames to process?
> Depends. For GB and normal modes, you can estimate it. For PB, not
> really/not sure.
> Given N atoms...
> For GB, you need to store 3N doubles for the particle positions, another
> 2N doubles for the atom parameters, and then all of the data structures
> necessary for evaluating the full energy. I believe GB memory
> requirements should scale more or less linearly with the number of atoms
> in the system (you can always plot memory usage vs. system size to see).
> For normal modes, you need slightly more than 3N*3N/2 doubles to store
> the upper-triangular part of the symmetric Hessian matrix in addition to
> the data structures needed to evaluate the total energy (which will
> scale linearly with N). The major expense here is the N^2 scaling of
> the Hessian storage. (The BLAS/LAPACK routines will also need working
> space in RAM to store temporary data).
> For PB, on the other hand, the memory is dominated primarily by the
> grid, I believe. The grid size depends as much on the shape and
> compactness of the molecule as it does on its size. I am less certain
> of how to estimate RAM usage here. (It also depends strongly on the grid
> spacing, obviously.)
> 3D-RISM is similar in concept to PB in that it requires a grid. If I
> recall conversations I've had with RISM experts, the 3D-RISM grid needs
> to be denser than the corresponding grid for PB, so RAM requirements for
> 3D-RISM are typically a bit higher than PB, although I think the scaling
> behavior is similar.
> For a moderately large system, normal mode calculations can use between
> 2 and 4 GB (even more for very large systems). GB calculations I would
> peg in the 500 MB range, whereas PB calculations are more in the 1-2 GB
> range for large systems.
> HTH,
> Jason
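The scaling Jason describes can be sketched numerically: GB storage is linear in N, while the upper-triangular Hessian for normal modes grows as (3N)^2/2. The per-atom double counts below are taken from his description and ignore the additional linear-scaling work arrays he mentions:

```python
def gb_memory_mb(n_atoms):
    """Linear GB estimate: ~3N doubles for positions plus ~2N for
    atom parameters (overhead and energy data structures not counted)."""
    return n_atoms * 5 * 8 / 1024 ** 2

def nmode_memory_gb(n_atoms):
    """Normal-mode estimate: upper triangle of the symmetric 3N x 3N
    Hessian, i.e. slightly more than 3N*3N/2 doubles."""
    dim = 3 * n_atoms
    return dim * (dim + 1) / 2 * 8 / 1024 ** 3

# For a hypothetical 10,000-atom system, the Hessian alone needs ~3.35 GB,
# consistent with the 2-4 GB range quoted above.
hessian_gb = nmode_memory_gb(10_000)
```

The quadratic term dominates quickly: doubling the atom count roughly quadruples the normal-mode memory, while GB memory merely doubles.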

Dr. Vlad Cojocaru
Max Planck Institute for Molecular Biomedicine
Department of Cell and Developmental Biology
Röntgenstrasse 20, 48149 Münster, Germany
Tel: +49-251-70365-324; Fax: +49-251-70365-399
Email: vlad.cojocaru[at]
AMBER mailing list
Received on Tue Nov 12 2013 - 00:30:16 PST