Re: memory shortage with mm_pbsa - some more thoughts

From: Joffre Heredia <joffre.yogi.uab.es>
Date: Sat, 12 Jul 2003 12:26:39 +0200

Dear Mr. Anderson,

We are having the same trouble as you with the Nmode memory limits. It
seems that, like other Amber modules, Nmode does not allocate memory
dynamically at run time but statically at compile time. Hence you have
to define the amount of memory you want to allocate in the 'sizes.h'
file and recompile, as Dr. Case already said.
I consider this process a bit 'outdated' nowadays, but that's the way
it is (imagine having to recompile MS Access every time you needed a
bigger database).
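To make the distinction concrete, here is a minimal Fortran 90 sketch
(MAXATM, xstat and xdyn are invented names for illustration, not the
real variables from sizes.h):

      program alloc_demo
      ! Illustration only; the names below are made up.
      implicit none
      integer, parameter :: MAXATM = 2000
      double precision :: xstat(3*MAXATM)        ! static: fixed at compile time
      double precision, allocatable :: xdyn(:)   ! dynamic: sized at run time
      integer :: natom
      natom = 100                      ! pretend this was read from the input
      allocate(xdyn(3*natom))
      print *, size(xstat), size(xdyn) ! 6000 vs. 300
      deallocate(xdyn)
      end program alloc_demo

The static array is always its full declared size; the allocatable one
follows the actual system, which is exactly what sizes.h prevents.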

Also, it seems that Nmode allocates a huge amount of memory for the
creation of an intermediate matrix: about 350 MB are needed for a
3,000-atom system and about 900 MB for 5,000 atoms.
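If I had to guess, that intermediate matrix is the 3N x 3N matrix of
second derivatives (the Hessian) stored in packed triangular form,
i.e. 3N*(3N+1)/2 double-precision words; the arithmetic matches the
figures above:

      N = 3,000: 3N =  9,000 ->  9,000* 9,001/2 =  40.5e6 words * 8 B = ~324 MB
      N = 5,000: 3N = 15,000 -> 15,000*15,001/2 = 112.5e6 words * 8 B = ~900 MB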

This should not be a problem given today's cheap memory prices, but
the main problem we are having is that our compiler (the SGI Fortran 90
compiler) does not allow us to STATICALLY allocate more than 258 MB of
memory. So, even with plenty of memory available, we cannot compile
Nmode to handle more than about 2,000 or 2,500 atoms.
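In principle, since the compiler already speaks Fortran 90, the static
limit could be sidestepped by making the big arrays ALLOCATABLE and
sizing them from the input. A minimal sketch of the idea (not Nmode's
actual code; the variable names are mine):

      program hess_alloc
      ! Sketch only: size the big matrix at run time
      ! instead of at compile time.
      implicit none
      double precision, allocatable :: hess(:)
      integer :: natom, n3, ierr
      natom = 5000                   ! would really come from the topology
      n3 = 3*natom
      allocate(hess(n3*(n3+1)/2), stat=ierr)  ! ~900 MB when natom=5000
      if (ierr /= 0) stop 'not enough memory for the Hessian'
      print *, 'allocated ', size(hess), ' real words'
      deallocate(hess)
      end program hess_alloc

With stat= the program can also fail gracefully when the machine really
does not have the memory, instead of refusing to compile at all.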

I don't know whether the Nmode programmers are already aware of this
issue, but I suspect the limit is common to almost all compilers,
since statically allocating that many megabytes is not a recommended
practice.

Now, suppose I modify 'sizes.h' to allow big systems of 5,000 atoms,
so that the program allocates about 900 MB. If someone then uses this
Nmode binary on a small system of 100 atoms, will the computer allocate
900 MB again? If the answer is NO, why don't we all set this number to
the maximum and solve the sizes.h problem forever? If the answer is
YES, let's hope they change this in the near future.
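For what it's worth, the static footprint of a given build can be
inspected with the standard Unix 'size' utility, since fixed arrays
end up in the executable's data/bss segments whatever the input
(path assuming the usual $AMBERHOME layout):

      size $AMBERHOME/exe/nmode

My understanding is that the answer is therefore YES at the
address-space level, although an operating system with demand paging
may not commit physical pages for parts of the array a small run never
touches.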

Regards

-------------------------------------------------------------
Joffre Heredia Rodrigo                 Tel: (34)-93-5813812
Laboratory of Computational Medicine   Fax: (34)-93-5812344
Biostatistics Dept.
UAB School of Medicine. Bellaterra     Joffre.Heredia.uab.es
08193-Barcelona (SPAIN)
-------------------------------------------------------------

On Tue, 8 Jul 2003, Peter Anderson wrote:

> Dear Amber Users,
>
> I am trying to perform an entropy calculation using
> the mm_pbsa command and I'm encountering memory
> problems. I enter the mm_pbsa.pl command, some
> calculations with sander work fine for about a minute,
> and then an nmode calculation begins and quickly
> fails. The bottom of the resulting "nmode_lig.1.out"
> file reads:
>
> Total memory required :  15425318 real words
> Total memory avail    :   3500000 real words
>
> Total memory required :     71522 integer words
> Total memory avail    :   4000000 integer words
>
> Maximum nonbond pairs 3928477
> increase the real memory by 11925318 words
>
> I assume that mm_pbsa's limitation to a single CPU
> (am I correct?) is the root of this memory shortage.
> Does anyone know how I can get around this memory
> problem? Can I spread the job out onto multiple
> processors?
>
> Thank you very much,
> Peter Anderson