Re: [AMBER] NAB nmode memory requirements

From: David A Case <case.biomaps.rutgers.edu>
Date: Mon, 10 Dec 2012 14:14:15 -0500

On Mon, Dec 10, 2012, Ryan Pavlovicz wrote:

> I am trying to do some nmode calculations with nab (mpinab) on a system of
> 11,795 atoms.
>
> From what I understand, the nmode calculation builds a 3N x 3N Hessian, and
> each element takes 8 bytes.
>
> Therefore, my system of 11,795 atoms, 35,385 coordinates, and ~1.25 billion
> matrix elements should need ~10 GB of memory. However, my testing on
> supercomputer nodes with 48 GB of physical memory has been confusing. Some
> of these jobs fail once they reach the Newton-Raphson part of the
> calculation and memory requirements exceed 48 GB. However, some of my jobs
> do complete a couple of iterations of NR minimization before the walltime
> runs out. When monitoring these successful jobs, I see that they are using
> ~211 GB of memory to complete these calculations.
>
> Is my estimate of the required memory off by ~200 GB, or is there a
> potential problem with the mpinab executable I am using?
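
For what it's worth, the back-of-the-envelope number for a single dense
3N x 3N matrix of 8-byte doubles does come out near 10 GB:

    3N       = 3 * 11,795        = 35,385
    elements = 35,385^2          ~ 1.25e9
    memory   = 1.25e9 * 8 bytes  ~ 10 GB

Diagonalization workspace and any extra copies of the matrix come on top of
that, which is one reason the observed footprint can be several times larger
than the single-matrix estimate.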

I'd suggest avoiding the NR step: you can generally minimize to an acceptably
low gradient with xmin alone; just be patient (i.e. expect to use lots of
minimization steps). This will be faster and avoids potentially needing
space for both NR and normal modes in the same job.
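
In nab, that part of the script might look something like the following (a
rough sketch from memory, not tested; the xmin option names and the exact
xmin() argument list should be checked against the NAB/sff section of the
manual, and "start.pdb"/"prmtop" are just placeholder file names):

  molecule m;
  float x[ dynamic ], g[ dynamic ], ene, grms;
  int natm;
  struct xmin_opt xo;

  m = getpdb( "start.pdb" );
  readparm( m, "prmtop" );
  natm = m.natoms;
  allocate x[ 3*natm ];
  allocate g[ 3*natm ];
  setxyz_from_mol( m, NULL, x );

  mm_options( "cut=999.0, diel=C, gb=1" );      // illustrative options only
  mme_init( m, NULL, "::Z", x, NULL );

  xmin_opt_init( xo );
  xo.maxiter  = 50000;        // be patient: expect many steps
  xo.grms_tol = 0.0001;       // push the gradient acceptably low

  ene = mme( x, g, 1 );
  ene = xmin( mme, natm, x, g, ene, grms, xo );   // no newton() call here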

If I remember correctly, you can use putxyz() and getxyz() to save coordinates
in high precision, so you can minimize in one job, then read those coordinates
back in to continue with a vibrational calculation in a separate job.
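
A rough sketch of that split (again from memory; the putxyz()/getxyz()
argument order and the file name are illustrative, so check the manual):

  // end of the minimization job:
  putxyz( "minimized.xyz", natm, x );

  // separate normal-mode job, after the same mm_options()/mme_init() setup:
  getxyz( "minimized.xyz", natm, x );
  nmode( x, 3*natm, mme2, 0, 0, 0.0, 0.0, 0 );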

...dac

p.s.: you have *lots* of atoms. I personally have never done a normal mode
calculation this big.


_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Mon Dec 10 2012 - 11:30:02 PST