[AMBER] cluster configuration help

From: Sangeetha B <sangeetha.bicpu.edu.in>
Date: Fri, 9 Mar 2012 22:46:42 +0530

Dear Amber users,

I have been running simulations with Amber 11 on a Tesla C2050 GPU for
about a year. I will be running simulations with systems as large as
100,000 atoms.

Recently, the need has arisen to set up a new computational facility
based on a cluster.

I have received the following configuration in a quotation, which also
suits my budget. Since I am new to clusters and to parallel Sander, I
have some queries.

1. If I set up a cluster with the configuration below, what performance
can I expect for parallel runs using Sander? (A rough sketch of how I
plan to launch such runs follows these questions.)

2. How many jobs can I run in parallel at a time without degrading
performance?

Is there anything else I should take care of while setting up a cluster?
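
For context, this is roughly how I plan to launch the parallel runs once
the cluster is up. It is only a sketch: it assumes an Open MPI style
mpirun, the parallel sander.MPI binary from Amber 11, and placeholder
hostnames and file names, so please correct me if the approach itself is
wrong.

    #!/usr/bin/env python3
    # Sketch of launching a parallel run across the two compute nodes.
    # Hostnames, core counts and file names are placeholders.
    import subprocess

    nodes = ["node01", "node02"]   # assumed hostnames of the compute nodes
    cores_per_node = 12            # 2 x X5650 = 12 cores per node

    # Write an Open MPI style hostfile listing each node and its slots.
    with open("hostfile", "w") as fh:
        for host in nodes:
            fh.write(f"{host} slots={cores_per_node}\n")

    # Run sander.MPI (the parallel sander binary) with one MPI rank per
    # core, overwriting old outputs (-O). The input, topology and
    # coordinate file names stand in for my actual files.
    cmd = [
        "mpirun", "-np", str(len(nodes) * cores_per_node),
        "--hostfile", "hostfile",
        "sander.MPI", "-O",
        "-i", "md.in", "-p", "prmtop", "-c", "inpcrd",
        "-o", "md.out", "-r", "md.rst", "-x", "md.nc",
    ]
    subprocess.run(cmd, check=True)

If jobs are normally submitted through a batch scheduler instead of a
plain mpirun like this, advice on that would also be welcome.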

It would be very helpful to get any suggestions regarding this as soon
as possible.


Configuration received in the quotation:

1. Master Node (Qty 1)
Processor(s) 2 x Intel® Xeon® X5650 processors (6-Core/2.66GHz/12M L3 Cache)
RAM 48GB DDR3-1333 ECC RDIMM (Max. 192GB supported, 12 DIMMs)
RAID SAS 6Gbps RAID controller with 512MB cache, supports RAID 0, 1, 10, 5 & 6
HDD(s) 3 x 1000GB, 7200 RPM, SAS 6Gbps, hot-plug HDDs (Max 8 HDDs)
Optical Slim DVD-ROM Drive, (Internal)
NIC Dual Gigabit (10/100/1000Mbps) Ethernet, with 1 x Cat6 patch cable
Infiniband 4X QDR (20Gbps) InfiniBand, Single Port (QSFP), with suitable cable
Graphics Matrox G200eW controller with 8MB
Management IPMI 2.0 compliant with dedicated port. Supports KVM over LAN
Exp. Slots 2 x PCI-Express x8
Ports 1 Serial, 2 USB, 2 x Network, 1 x InfiniBand, 1 x Management, 1 x Video
Chassis 2U rack-mountable with sliding rails
Pwr Supply 1+1 redundant, hot-plug power supplies. 80Plus Platinum certified


2. Compute Node (Qty 2)
Processor 2 x Intel® Xeon® X5650 processors (6-Core/2.66GHz/12M L3 Cache)
RAM 48GB DDR3-1333 ECC RDIMM (Max. 192GB supported, 12 DIMMs)
HDD(s) 1 x 500GB, 7200 RPM, Enterprise SATA HDD.
NIC Dual Gigabit (10/100/1000Mbps) Ethernet, with 1 x Cat6 patch cable
Infiniband 4X QDR (20Gbps) InfiniBand, Single Port (QSFP), with suitable cable
Graphics Matrox G200eW controller with 8MB
Management IPMI 2.0 compliant with dedicated port. Supports KVM over LAN
Exp. Slots 2 x PCI-Express x8 (one used, one free)
Ports 1 Serial, 2 USB, 2 x Network, 1 x InfiniBand, 1 x Management, 1 x Video
Chassis 1U rack-mountable with sliding rails


-- 
Thank you,
With regards,
B. Sangeetha
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber