Re: AMBER: nmode/nab entropy calculations memory issues

From: Andreas Svrcek-Seiler <svrci.tbi.univie.ac.at>
Date: Tue, 16 Sep 2008 17:50:11 +0200 (CEST)

Hi,
> I have a question relating to entropy calculations using nmode/nab. I am
> trying to calculate entropy of a protein which has around 9000 atoms. I
> encountered memory problems both with nmode and nab.

> Any estimate of how much memory I might need to run this kind of system ?
Your system has 27000 coordinates, so the Hessian has 27000**2 = 729 million
entries of 8 bytes each, i.e. roughly 5.8 GB for the full matrix (the
Hessian is symmetric, and exploiting that can save memory - I'm not sure
about the implementation details).

Anyway: For 6370 atoms, I see 5.9 GB of memory being used (so this
wouldn't work within reasonable time on a 4GB machine).
For 9515 atoms I see 13 GB of memory in use. So you might like a 16 GB
machine for this (at least).
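That back-of-envelope estimate can be sketched in a few lines of Python
(a minimal sketch, assuming a dense 3N x 3N double-precision Hessian; the
function name is mine, not part of nmode/nab):

```python
def hessian_memory_gb(n_atoms: int, packed: bool = False) -> float:
    """Rough memory needed to store the Hessian of an n_atoms system.

    Assumes 8-byte doubles; `packed=True` models symmetric-packed
    storage (upper triangle only), which roughly halves the size.
    """
    n = 3 * n_atoms                           # 3 coordinates per atom
    entries = n * (n + 1) // 2 if packed else n * n
    return entries * 8 / 1e9                  # bytes -> GB (decimal)

# 9000 atoms -> 27000 coordinates -> ~5.8 GB full, ~2.9 GB packed
print(f"{hessian_memory_gb(9000):.1f} GB full")
print(f"{hessian_memory_gb(9000, packed=True):.1f} GB packed")
```

Note that the observed figures above are roughly twice this bare-Hessian
estimate, presumably because the diagonalization needs additional work
arrays; treat the sketch as a lower bound.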

...good luck
Andreas
-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber.scripps.edu
To unsubscribe, send "unsubscribe amber" (in the *body* of the email)
      to majordomo.scripps.edu
Received on Wed Sep 17 2008 - 03:10:30 PDT