AMBER: PMEMD 3.1 Release - High Scalability Update to PMEMD

From: Robert Duke <rduke.email.unc.edu>
Date: Fri, 31 Oct 2003 10:40:37 -0500

We are proud to announce the release of version 3.1 (the first major
performance update) of PMEMD (Particle Mesh Ewald Molecular Dynamics).

PMEMD is a new version of the Amber module "Sander", and has been written
with the major goal of improving performance in Particle Mesh Ewald
molecular dynamics simulations and minimizations. The code has been totally
rewritten in Fortran 90, and is capable of running in either an Amber 6 or
Amber 7 mode. Functionality is more complete in Amber 6 mode; the Amber 7
mode is designed mostly to do the same sorts of things Amber 6 does, but
with output comparable to Amber 7 Sander. The calculations done in PMEMD
are intended to replicate either Sander 6 or Sander 7 calculations within
the limits of roundoff error; they simply run faster, in about half the
memory, and scale efficiently to significantly larger numbers of
processors.

The primary site for high scalability work on PMEMD 3.1 has been the
Edinburgh Parallel Computing Centre (IBM P690 Regatta, 1.3 GHz Power4 CPUs,
1280 total processors), and we would like to thank EPCC for making their
facilities available for this work. At EPCC, we have obtained maximum
throughputs of 3.65 nsec/day (constant volume, 320 processors) and 3.48
nsec/day (constant pressure, 320 processors) for a 90906 atom PME solvated
protein simulation. This compares to 0.41 nsec/day (constant pressure, 128
processors) for Sander 7 on the same simulation problem and 3.43 nsec/day
(1024 processors) for NAMD on a similar simulation problem (92,224 atoms).
More significant is performance at the "50% scalability point", the largest
processor count at which parallel efficiency remains at or above 50%.
PMEMD 3.1 runs the above simulation with at least 50% scalability on 128
processors, producing 2.85 nsec/day throughput. For Sander 7, only 16
processors may be used without going below 50% scalability, and throughput
is 0.28 nsec/day. For NAMD, 256 processors may be used at 50% scalability,
but throughput is only 1.3 nsec/day. Additional benchmark data is
presented in the Update Note available at the Amber website.
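As a rough illustration of the 50% scalability point, parallel efficiency can
be computed directly from throughput figures. The sketch below uses the 2.85
nsec/day on 128 processors quoted above, but the single-processor throughput
it assumes is hypothetical, chosen only to make the arithmetic land at 50%:

```python
def parallel_efficiency(throughput_p, procs, throughput_1):
    """Parallel efficiency relative to a single-processor run:
    speedup divided by processor count."""
    speedup = throughput_p / throughput_1
    return speedup / procs

# Illustrative only: 2.85 nsec/day on 128 CPUs, with an assumed
# (hypothetical) single-CPU throughput of 0.0445 nsec/day.
eff = parallel_efficiency(2.85, 128, 0.0445)
print(f"{eff:.0%}")  # prints "50%" -- right at the 50% scalability point
```

A run sitting at this point still delivers half the per-processor throughput
of an ideal linear speedup; adding further processors drops efficiency below
that threshold.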

PMEMD was developed by Dr. Robert Duke in Prof. Lee Pedersen's Lab at
UNC-Chapel Hill, starting from the version of Sander in Amber 6. Funding
support was provided by NIH grant HL-06350 (PPG) and NSF grant 2001-0759-02
(ITR/AP). When citing PMEMD in the literature, please use both the Amber
Version 7 citation given in the Amber 7 manual and the following citation:

Robert E. Duke and Lee G. Pedersen (2003) PMEMD 3.1, University of North
Carolina-Chapel Hill

PMEMD is available without charge to users who have an existing license for
Amber (version 6 or 7). For more information, and to download the code,
please go to:

                http://amber.scripps.edu/pmemd-get.html


- Robert Duke (UNC-Chapel Hill) and David Case (The Scripps Research
Institute)



-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber.scripps.edu
To unsubscribe, send "unsubscribe amber" to majordomo.scripps.edu
Received on Fri Oct 31 2003 - 15:53:01 PST