Kathleen:
Oh, this is a different story. I've not tried PMEMD in a parallel
environment. I did use sander (Amber 6 or 7, a few years ago) on both a
cluster and at PSC, and beyond about 8 CPUs speed did not improve.
Pete
>>> rduke.email.unc.edu 09/22 12:57 PM >>>
Kathleen -
Don't know who you were talking to on this one. PMEMD 8, which offers
a subset of sander functionality (primarily pme explicit solvent calcs)
scales well on good hardware out to about 128 procs, and has been
competitive with namd on various platforms for top-end scaling.
Versions under current development scale well out past 128 procs,
depending once again on the h/w, though going over 200 procs is
impractical with the current parallelization infrastructure. There has
been an emphasis on getting good per-processor performance, followed by
scaling well up to reasonable numbers of processors, to yield good
simulation throughput. We can achieve on the order of 6-7 nsec/day
simulating large (~100K atoms) explicit solvent systems with pme
electrostatics; this is nothing to sneeze at. I don't actually know of
many places where folks can get 256, 512, 1024 procs for days on end
without obliterating their compute allocation.
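For anyone trying to get a feel for where scaling tails off, a
back-of-the-envelope Amdahl's-law estimate is usually enough. The sketch
below is purely illustrative; the serial fraction and single-processor
throughput are assumed values for a ~100K atom PME run, not measured
pmemd numbers:

    # Illustrative Amdahl's-law scaling estimate; the serial fraction and
    # single-processor throughput are assumptions, not measured pmemd data.

    def speedup(nprocs, serial_fraction):
        """Amdahl's law: speedup on nprocs processors."""
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / nprocs)

    base_nsday = 0.08        # assumed single-processor throughput, nsec/day
    serial_fraction = 0.005  # assumed non-parallelizable fraction of the work

    for n in (8, 32, 64, 128, 256):
        s = speedup(n, serial_fraction)
        print(f"{n:4d} procs: speedup {s:6.1f}, "
              f"efficiency {s / n:5.1%}, ~{base_nsday * s:.1f} nsec/day")

With those assumed numbers the efficiency is still decent at 128 procs but
drops sharply by 256, which is the same qualitative behavior described
above.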
Regards - Bob Duke
----- Original Message -----
From: Kathleen Erickson
To: amber.scripps.edu
Sent: Thursday, September 22, 2005 12:44 PM
Subject: Re: AMBER: amber8 parallel sander
Peter: So are you saying that scaling efficiencies beyond 8 CPUs are
better achieved with NAMD? In other words, is 8 CPUs the practical limit
for efficient scaling on AMBER? I'd like to know at which point AMBER
users find they need to switch to NAMD or some other MD package.
-Kathleen
-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber.scripps.edu
To unsubscribe, send "unsubscribe amber" to majordomo.scripps.edu
Received on Thu Sep 22 2005 - 22:53:00 PDT