Thank you all for the replies. They helped me understand parallel
computing a lot better.
Best regards,
Changwoo
On Fri, Dec 16, 2011 at 12:48 PM, Jason Swails <jason.swails.gmail.com> wrote:
> I think the most detrimental bottleneck to parallel scaling in Amber is the
> interconnect between the nodes. The slower that interconnect is, the
> longer off-node communications take. For collective operations, which
> involve communication between all threads, the interconnect gets clogged
> up very quickly. You effectively have every thread running Amber trying
> to use the same interface at the same time, so the total bandwidth is
> divided up among all of the threads on that node.
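>
> If it helps to picture what a collective looks like, here is a rough
> mpi4py sketch (not Amber's actual code; Amber's MPI layer lives in the
> Fortran source) of an allreduce, where every rank has to exchange data
> with every other rank at the same time:
>
>   # Toy allreduce: each rank contributes a local array (think partial
>   # forces or energies) and every rank ends up with the global sum.
>   # Assumes mpi4py and NumPy are installed; run with something like
>   # "mpirun -np 32 python allreduce_demo.py".
>   import numpy as np
>   from mpi4py import MPI
>
>   comm = MPI.COMM_WORLD
>   rank = comm.Get_rank()
>
>   local = np.full(1000, float(rank))   # this rank's contribution
>   total = np.empty_like(local)         # will hold the sum over all ranks
>
>   # All ranks on a node push this traffic through the same network
>   # interface at the same time.
>   comm.Allreduce(local, total, op=MPI.SUM)
>
>   if rank == 0:
>       print("sum of first element over all ranks:", total[0])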
>
> Therefore, the fewer cores you have trying to use the same network card
> (such as 8 nodes x 4 cpu/node), the more bandwidth can be allocated to
> each thread. Also, the fewer threads you have running on a single node,
> the less each thread constricts the resources available to the others
> (e.g. memory bandwidth), which also makes each one run faster.
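>
> To put some (made-up) numbers on that, here is a quick Python sketch of
> the per-rank share of a node's link for the layouts you tried. The
> 1.0 GB/s link speed is just a placeholder, not a measurement of your
> hardware, and on-node memory-bandwidth contention isn't captured at all:
>
>   # Hypothetical per-node link speed; replace with your actual interconnect.
>   nic_bandwidth_gb_s = 1.0
>
>   layouts = [(8, 4), (4, 8),              # first test: 32 cores total
>              (3, 8), (4, 6), (6, 4)]      # second test: 24 cores total
>
>   for nodes, per_node in layouts:
>       share = nic_bandwidth_gb_s / per_node
>       print("%d nodes x %d cpu/node (%2d cores): ~%.2f GB/s per rank"
>             % (nodes, per_node, nodes * per_node, share))
>
> The fewer ranks that share a node, the bigger each rank's slice of the
> link during those collectives, which matches the ordering you saw.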
>
> HTH,
> Jason
>
> On Fri, Dec 16, 2011 at 11:58 AM, Changwoo Do <dokira.gmail.com> wrote:
>
>> Dear Amber team,
>>
>> I have been using Amber 10 for MD simulations of polymer systems. One
>> thing I found recently is that the calculation speed has a strange
>> dependence on the number of CPUs per node.
>>
>> For the same MD simulation I tested
>>
>> 1) 8 nodes x 4 cpu/node
>> 2) 4 nodes x 8 cpu/node
>>
>> Case #1 was faster by about 30%, a huge difference.
>>
>> Another test run was
>> 1) 3 nodes x 8 cpu/node
>> 2) 4 nodes x 6 cpu/node
>> 3) 6 nodes x 4 cpu/node
>>
>> #3 was the fastest, #2 was next, and #1 was the slowest.
>>
>> Can anyone suggest what this result indicates? Does it mean that the
>> OpenMPI build I'm using is not optimized or not very efficient? Or is
>> this normal?
>>
>> Thank you,
>>
>> Changwoo
>>
>
>
>
> --
> Jason M. Swails
> Quantum Theory Project,
> University of Florida
> Ph.D. Candidate
> 352-392-4032
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Fri Dec 16 2011 - 13:00:02 PST