On Fri, Mar 20, 2015 at 3:03 PM, Edwin Helbert Aponte Angarita <
helbert2a.gmail.com> wrote:
> Hello, thank you for your help.
>
> I will just try it without counterions, then, and see if I can get the
> information I want that way.
>
> I will follow Carlos's advice and use igb=8 with mbondi3 radii and
> ff14SB.
>
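For reference, a minimal tleap input for that combination might look like the
sketch below; the structure file peptide.pdb and the output names are
placeholders, and the leaprc name is the one from Amber14-era AmberTools:

    # Load the ff14SB force field (Amber14-era leaprc name)
    source leaprc.ff14SB
    # mbondi3 radii, recommended together with igb=8
    set default PBradii mbondi3
    # Load the starting structure (placeholder file name)
    pep = loadpdb peptide.pdb
    # Write topology and coordinate files
    saveamberparm pep prmtop inpcrd
    quit
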
> I am using sander.MPI and I am running in this way:
>
> mpirun -np X sander.MPI -O -i mdin -o mdout -p prmtop -c inpcrd -r rst -x
> mdcrd
>
> where X has been either 8 or 16.
>
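If it is useful, a bare-bones GB production mdin along those lines could look
roughly like this (the step count, temperature, and output intervals are only
example values):

    Generalized Born MD, igb=8 (example values)
     &cntrl
      imin=0, irest=0, ntx=1,        ! new MD run from the input coordinates
      ntb=0, igb=8, cut=999.0,       ! no periodic box, GB-Neck2, no cutoff
      ntc=2, ntf=2, dt=0.002,        ! SHAKE on bonds to H, 2 fs time step
      ntt=3, gamma_ln=1.0, ig=-1,    ! Langevin thermostat, random seed
      tempi=300.0, temp0=300.0,
      nstlim=500000,                 ! 1 ns at 2 fs per step
      ntpr=1000, ntwx=1000, ntwr=10000,
     /
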
> I am new to Amber, so I haven't tried pmemd yet, but I will try it soon.
>
> Could I also be limited to 8 processors in MD?
>
> I am also simulating the same peptide in a TIP3P water box (TIP3PBOX). So
> far I haven't had problems running it on 16 processors. Could I run into
> the same problems in that explicit-solvent simulation?
>
No: explicit solvent involves many more atoms, and PME and GB simulations
parallelize differently. The limitation here is probably that there is not
enough work in your 34-residue system to distribute across 16 processors.
And even if there were, each processor would get so little work that it
would spend most of its time communicating with the others rather than
computing energies and forces.
Before you run simulations, it is always a good idea to run short tests
with different numbers of processors and compare their efficiencies. This
will help you make the best possible use of your resources (for instance,
by not assigning 32 processors to a job that runs just as fast, if not
faster, with only 16).
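As a concrete sketch (file names here are placeholders), such a scaling test
can be scripted with a simple loop, reusing a copy of the same input with a
short nstlim and reading the ns/day estimate from the timing summary that
sander writes to the mdinfo file (and at the end of mdout):

    #!/bin/sh
    # Quick benchmark of the same GB job at several processor counts.
    # mdin.bench should be a copy of the production mdin with a small nstlim.
    for np in 1 2 4 8 16; do
        mpirun -np $np sander.MPI -O -i mdin.bench -o mdout.$np \
               -p prmtop -c inpcrd -r rst.$np -x mdcrd.$np -inf mdinfo.$np
        echo "== $np processors =="
        grep "ns/day" mdinfo.$np    # throughput estimate for this run
    done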
HTH,
Jason
--
Jason M. Swails
Postdoctoral Researcher
BioMaPS, Rutgers University