Re: [AMBER] MMPBSA.MPI running issue

From: Ye Fan <yefan.ncsa.uiuc.edu>
Date: Tue, 19 Jul 2011 11:39:59 -0600

Hi Jason,

Thanks for the suggestion.

I got two failures in the serial MMPBSA tests before the test process was suspended:
==================================================================
cd 06_NAB_Nmode && ./Run.nmode
diffing FINAL_RESULTS_MMPBSA.dat.save with FINAL_RESULTS_MMPBSA.dat
possible FAILURE: check FINAL_RESULTS_MMPBSA.dat.dif

cd 08_Stability && ./Run.stability
./Run.stability: Program error
make[1]: *** [STABILITY] Error 1
make[1]: Leaving directory `/gpfs1/u/ncsa/yefan/apps/amber11-gnu/AmberTools/test/mmpbsa_py'
make: *** [test.mmpbsa] Error 2
==================================================================

For the parallel version test:
I initially set DO_PARALLEL to "mpirun -np 4", which produced an error saying "threads number cannot be larger than frames number", so I set it to "mpirun -np 1".
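(I assume that error shows up because MMPBSA.MPI hands each thread a roughly even share of the frames, so a run cannot use more threads than frames. A small Python sketch of such a split -- my guess at the logic, not MMPBSA's actual code -- also reproduces the 13/13/12/12 per-thread counts quoted further down in this thread:

    def split_frames(nframes, nthreads):
        # the first (nframes % nthreads) threads get one extra frame
        base, extra = divmod(nframes, nthreads)
        return [base + (1 if i < extra else 0) for i in range(nthreads)]

    print(split_frames(50, 4))   # -> [13, 13, 12, 12]
)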
==================================================================
cd 06_NAB_Nmode && ./Run.nmode
diffing FINAL_RESULTS_MMPBSA.dat.save with FINAL_RESULTS_MMPBSA.dat
possible FAILURE: check FINAL_RESULTS_MMPBSA.dat.dif

diffing FINAL_RESULTS_MMPBSA2.dat.save with FINAL_RESULTS_MMPBSA2.dat
possible FAILURE: check FINAL_RESULTS_MMPBSA2.dat.dif

diffing FINAL_DECOMP_MMPBSA2.dat.save with FINAL_DECOMP_MMPBSA2.dat
possible FAILURE: check FINAL_DECOMP_MMPBSA2.dat.dif

cd 11_3D-RISM && ./Run.rism3d
diffing FINAL_RESULTS_MMPBSA.dat.save with FINAL_RESULTS_MMPBSA.dat
possible FAILURE: check FINAL_RESULTS_MMPBSA.dat.dif
==================================================================

Regarding mpi4py: I did build it manually, against openmpi-1.4.3 instead of the MPT that ships with the SGI system (which causes problems). It passed a simple MPI hello-world Python script, but I haven't tested it thoroughly.
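(The smoke test was essentially the following -- reconstructed from memory, so the exact script may have differed slightly:

    # hello_mpi.py -- run with: mpirun -np 4 python hello_mpi.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    print("Hello from rank %d of %d" % (comm.Get_rank(), comm.Get_size()))
)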

Thanks
Ye


On Jul 15, 2011, at 4:20 PM, Jason Swails wrote:

> Do the MMPBSA tests pass in parallel on Ember? It could be that mpi4py doesn't install properly on that system. There are a few TeraGrid systems where that's true (e.g. Kraken), since the mpi4py build script expects a standard MPI installation (a functional mpicc). If that's not the case on your system, you'll need to build mpi4py by hand before running MMPBSA.MPI.
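>
> (Roughly, that means pointing mpi4py's build at a working compiler wrapper; something along these lines, where the paths are placeholders -- check the mpi4py installation docs for the exact options for your version:
>
>     python setup.py build --mpicc=/path/to/openmpi/bin/mpicc
>     python setup.py install
> )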
>
> HTH,
> Jason
>
> --
> Jason M. Swails
> Quantum Theory Project,
> University of Florida
> Ph.D. Candidate
> 352-392-4032
>
> On Jul 15, 2011, at 2:27 PM, Ye Fan <yefan.ncsa.uiuc.edu> wrote:
>
>> Sure, I've pasted them here.
>>
>> I have _MMPBSA_ptraj1.out along with _MMPBSA_ptraj1.out.[0-3], _MMPBSA_ptraj2.out along with _MMPBSA_ptraj2.out.[0-3], and _MMPBSA_ptraj3.out along with _MMPBSA_ptraj3.out.[0-3]. There are also _MMPBSA_ptraj4.out, _MMPBSA_ptraj5.out, and _MMPBSA_ptraj6.out.
>>
>> Well, I have no idea which trajectory approach I'm using, since I'm actually helping one of our users get his MMPBSA job running on Ember (an SGI Altix UV system) at NCSA.
>>
>> Thanks
>> Ye
>>
>> _MMPBSA_ptraj1.out:
>> --------------------------------------------------------------------------------------------------------------------------------------------
>> \-/
>> -/- PTRAJ: a utility for processing trajectory files
>> /-\
>> \-/ Version: "AMBER 11.0 integrated" (4/2010)
>> -/- Executable is: "/u/ncsa/yefan/apps/amber11-gnu/bin/ptraj"
>> /-\ Running on 1 processor(s)
>> \-/ Residue labels:
>>
>> LYS ILE ALA ALA LEU LYS GLN LYS ILE ALA
>> SER LEU LYS GLN GLU ILE ASP ALA LEU GLU
>> TYR GLU ASN ASP ALA LEU GLU GLN LYS ILE
>> ALA ALA LEU LYS GLN LYS ILE ALA SER LEU
>> LYS GLN GLU ILE ASP ALA LEU GLU TYR GLU
>> ASN ASP ALA LEU GLU GLN LYS ILE ALA ALA
>> LEU LYS GLN LYS ILE ALA SER LEU LYS GLN
>> GLU ILE ASP ALA LEU GLU TYR GLU ASN ASP
>> ALA LEU GLU GLN LYS ILE ARG ALA LEU LYS
>> ALA LYS ASN ALA HIE LEU LYS GLN GLU ILE
>> ALA ALA LEU GLU GLN GLU ILE ALA ALA LEU
>> GLU GLN LYS ILE ARG ALA LEU LYS ALA LYS
>> ASN ALA HIE LEU LYS GLN GLU ILE ALA ALA
>> LEU GLU GLN GLU ILE ALA ALA LEU GLU GLN
>> LYS ILE ARG ALA LEU LYS ALA LYS ASN ALA
>> HIE LEU LYS GLN GLU ILE ALA ALA LEU GLU
>> GLN GLU ILE ALA ALA LEU GLU GLN Na+ Na+
>> Na+ WAT WAT WAT WAT WAT WAT WAT WAT WAT
>> WAT WAT WAT WAT WAT WAT WAT WAT WAT WAT
>> ...
>> WAT
>>
>>
>> PTRAJ: Processing input from file _MMPBSA_cenptraj.in
>>
>> PTRAJ: trajin g1hexa_prod_comb 5251 5300 1
>> Checking coordinates: g1hexa_prod_comb
>> Rank: 0 Atoms: 41418 FrameSize: 1006483 TitleSize: 30 NumBox: 3 Seekable 1
>>
>>
>> PTRAJ: strip :WAT:Cl-:CIO:Cs+:IB:K+:Li+:MG2:Na+:Rb+
>> Mask [:WAT:Cl-:CIO:Cs+:IB:K+:Li+:MG2:Na+:Rb+] represents 38673 atoms
>>
>> PTRAJ: center :1-56:223-278 mass origin
>> Mask [:1-56:223-278] represents 910 atoms
>>
>> PTRAJ: image origin center
>> Mask [*] represents 2745 atoms
>>
>> PTRAJ: center :1-168 mass origin
>> Mask [:1-168] represents 2745 atoms
>>
>> PTRAJ: image origin center
>> Mask [*] represents 2745 atoms
>>
>> PTRAJ: rms first mass :1-168
>> Mask [:1-168] represents 2745 atoms
>>
>> PTRAJ: average _MMPBSA_avgcomplex.pdb pdb
>> Mask [*] represents 2745 atoms
>>
>> PTRAJ: trajout _MMPBSA_complex.mdcrd nobox
>> g1hexa_prod_comb: 7420 frames.
>>
>> PTRAJ: Successfully read the input file.
>> Coordinate processing will occur on 50 frames.
>> Summary of I/O and actions follows:
>>
>> INPUT COORDINATE FILES
>> File (g1hexa_prod_comb) is an AMBER trajectory (with box info) with 5300 sets (processing only 50)
>>
>> OUTPUT COORDINATE FILE
>> File (_MMPBSA_complex.mdcrd) is an AMBER trajectory
>> ACTIONS
>> 1> STRIP: 38673 atoms will be removed from trajectory: :169-13061
>> 2> CENTER to origin via center of mass, atom selection follows :1-56
>> 3> IMAGE by molecule to origin using the center of mass, atom selection * (All atoms are selected)
>> 4> CENTER to origin via center of mass, atom selection follows * (All atoms are selected)
>> 5> IMAGE by molecule to origin using the center of mass, atom selection * (All atoms are selected)
>> 6> RMS to first frame using mass weighting
>> Atom selection follows * (All atoms are selected)
>> 7> AVERAGE: dumping the average of the coordinates to file _MMPBSA_avgcomplex.pdb
>> start: 1 Stop [at final frame] Offset: 1
>> Atom selection * (All atoms are selected)
>> Output file information: File (_MMPBSA_avgcomplex.pdb) is a PDB file
>>
>>
>> Processing AMBER trajectory file g1hexa_prod_comb
>>
>> ........................ 75% 100%
>>
>>
>> PTRAJ: Successfully read in 50 sets and processed 50 sets.
>>
>> Dumping accumulated results (if any)
>> --------------------------------------------------------------------------------------------------------------------------------------------
>>
>> _MMPBSA_ptraj2.out:
>> --------------------------------------------------------------------------------------------------------------------------------------------
>> \-/
>> -/- PTRAJ: a utility for processing trajectory files
>> /-\
>> \-/ Version: "AMBER 11.0 integrated" (4/2010)
>> -/- Executable is: "/u/ncsa/yefan/apps/amber11-gnu/bin/ptraj"
>> /-\ Running on 1 processor(s)
>> \-/ Residue labels:
>>
>> LYS ILE ALA ALA LEU LYS GLN LYS ILE ALA
>> SER LEU LYS GLN GLU ILE ASP ALA LEU GLU
>> TYR GLU ASN ASP ALA LEU GLU GLN LYS ILE
>> ALA ALA LEU LYS GLN LYS ILE ALA SER LEU
>> LYS GLN GLU ILE ASP ALA LEU GLU TYR GLU
>> ASN ASP ALA LEU GLU GLN LYS ILE ALA ALA
>> LEU LYS GLN LYS ILE ALA SER LEU LYS GLN
>> GLU ILE ASP ALA LEU GLU TYR GLU ASN ASP
>> ALA LEU GLU GLN LYS ILE ARG ALA LEU LYS
>> ALA LYS ASN ALA HIE LEU LYS GLN GLU ILE
>> ALA ALA LEU GLU GLN GLU ILE ALA ALA LEU
>> GLU GLN LYS ILE ARG ALA LEU LYS ALA LYS
>> ASN ALA HIE LEU LYS GLN GLU ILE ALA ALA
>> LEU GLU GLN GLU ILE ALA ALA LEU GLU GLN
>> LYS ILE ARG ALA LEU LYS ALA LYS ASN ALA
>> HIE LEU LYS GLN GLU ILE ALA ALA LEU GLU
>> GLN GLU ILE ALA ALA LEU GLU GLN
>>
>>
>> PTRAJ: Processing input from file _MMPBSA_ligtraj.in
>>
>> PTRAJ: trajin _MMPBSA_complex.mdcrd
>> Checking coordinates: _MMPBSA_complex.mdcrd
>> Rank: 0 Atoms: 2745 FrameSize: 66704 TitleSize: 30 NumBox: 0 Seekable 1
>>
>>
>> PTRAJ: strip :1-56:223-278
>> Mask [:1-56:223-278] represents 910 atoms
>>
>> PTRAJ: trajout _MMPBSA_ligand.mdcrd nobox
>> _MMPBSA_complex.mdcrd: 50 frames.
>>
>> PTRAJ: Successfully read the input file.
>> Coordinate processing will occur on 50 frames.
>> Summary of I/O and actions follows:
>>
>> INPUT COORDINATE FILES
>> File (_MMPBSA_complex.mdcrd) is an AMBER trajectory with 50 sets
>>
>> OUTPUT COORDINATE FILE
>> File (_MMPBSA_ligand.mdcrd) is an AMBER trajectory
>> ACTIONS
>> 1> STRIP: 910 atoms will be removed from trajectory: :1-56
>>
>>
>> Processing AMBER trajectory file _MMPBSA_complex.mdcrd
>>
>> ........................ 75% 100%
>>
>>
>> PTRAJ: Successfully read in 50 sets and processed 50 sets.
>>
>> Dumping accumulated results (if any)
>> --------------------------------------------------------------------------------------------------------------------------------------------
>>
>> _MMPBSA_ptraj3.out:
>> --------------------------------------------------------------------------------------------------------------------------------------------
>> \-/
>> -/- PTRAJ: a utility for processing trajectory files
>> /-\
>> \-/ Version: "AMBER 11.0 integrated" (4/2010)
>> -/- Executable is: "/u/ncsa/yefan/apps/amber11-gnu/bin/ptraj"
>> /-\ Running on 1 processor(s)
>> \-/ Residue labels:
>>
>> LYS ILE ALA ALA LEU LYS GLN LYS ILE ALA
>> SER LEU LYS GLN GLU ILE ASP ALA LEU GLU
>> TYR GLU ASN ASP ALA LEU GLU GLN LYS ILE
>> ALA ALA LEU LYS GLN LYS ILE ALA SER LEU
>> LYS GLN GLU ILE ASP ALA LEU GLU TYR GLU
>> ASN ASP ALA LEU GLU GLN LYS ILE ALA ALA
>> LEU LYS GLN LYS ILE ALA SER LEU LYS GLN
>> GLU ILE ASP ALA LEU GLU TYR GLU ASN ASP
>> ALA LEU GLU GLN LYS ILE ARG ALA LEU LYS
>> ALA LYS ASN ALA HIE LEU LYS GLN GLU ILE
>> ALA ALA LEU GLU GLN GLU ILE ALA ALA LEU
>> GLU GLN LYS ILE ARG ALA LEU LYS ALA LYS
>> ASN ALA HIE LEU LYS GLN GLU ILE ALA ALA
>> LEU GLU GLN GLU ILE ALA ALA LEU GLU GLN
>> LYS ILE ARG ALA LEU LYS ALA LYS ASN ALA
>> HIE LEU LYS GLN GLU ILE ALA ALA LEU GLU
>> GLN GLU ILE ALA ALA LEU GLU GLN
>>
>>
>> PTRAJ: Processing input from file _MMPBSA_rectraj.in
>>
>> PTRAJ: trajin _MMPBSA_complex.mdcrd
>> Checking coordinates: _MMPBSA_complex.mdcrd
>> Rank: 0 Atoms: 2745 FrameSize: 66704 TitleSize: 30 NumBox: 0 Seekable 1
>>
>>
>> PTRAJ: strip :57-84:279-306
>> Mask [:57-84:279-306] represents 455 atoms
>>
>> PTRAJ: trajout _MMPBSA_receptor.mdcrd nobox
>> _MMPBSA_complex.mdcrd: 50 frames.
>>
>> PTRAJ: Successfully read the input file.
>> Coordinate processing will occur on 50 frames.
>> Summary of I/O and actions follows:
>>
>> INPUT COORDINATE FILES
>> File (_MMPBSA_complex.mdcrd) is an AMBER trajectory with 50 sets
>>
>> OUTPUT COORDINATE FILE
>> File (_MMPBSA_receptor.mdcrd) is an AMBER trajectory
>> ACTIONS
>> 1> STRIP: 455 atoms will be removed from trajectory: :57-84
>>
>>
>> Processing AMBER trajectory file _MMPBSA_complex.mdcrd
>>
>> ........................ 75% 100%
>>
>>
>> PTRAJ: Successfully read in 50 sets and processed 50 sets.
>>
>> Dumping accumulated results (if any)
>> --------------------------------------------------------------------------------------------------------------------------------------------
>>
>> On Jul 15, 2011, at 11:47 AM, Dwight McGee wrote:
>>
>>> Hi,
>>>
>>> Can you attach or paste the following files: _MMPBSA_ptraj1.out,
>>> _MMPBSA_ptraj2.out, _MMPBSA_ptraj3.out? Are you using a single-trajectory
>>> or triple-trajectory approach?
>>>
>>> On Fri, Jul 15, 2011 at 12:33 PM, Ye Fan <yefan.ncsa.uiuc.edu> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am still trying to debug the issue with MMPBSA.MPI. I can run the same
>>>> input files through the serial MMPBSA successfully, but they always fail
>>>> with the parallel version, MMPBSA.MPI.
>>>>
>>>> Now I have ruled out my earlier suspicion that an IOError was causing the
>>>> calculation to crash. The real cause is a segmentation fault thrown during
>>>> "calculating ligand contribution..." in the GB calculation. Specifically,
>>>> "mmpbsa_py_energy -O -i _MMPBSA_gb.mdin -o _MMPBSA_ligand_gb.mdout.0
>>>> -p g1dimer_4hexa_nosolv.top -c _MMPBSA_dummyligand.inpcrd.1
>>>> -y _MMPBSA_ligand.mdcrd.0 -r _MMPBSA_.restrt_ligand.0
>>>> -pdb _MMPBSA_ligand.pdb" gives a segmentation fault. There is only 1 frame
>>>> in _MMPBSA_ligand_gb.mdout.0. The same thing happens in the other 3 processes.
>>>>
>>>> Another odd thing I've noticed: the message "50 frames were read in and
>>>> processed by ptraj for use in calculation" always appears at the beginning
>>>> of the calculation, for both the parallel and the serial runs.
>>>>
>>>> For the parallel run, I checked:
>>>> _MMPBSA_complex_gb.mdout.0 & _MMPBSA_receptor_gb.mdout.0: 13 frames processed
>>>> _MMPBSA_complex_gb.mdout.1 & _MMPBSA_receptor_gb.mdout.1: 13 frames processed
>>>> _MMPBSA_complex_gb.mdout.2 & _MMPBSA_receptor_gb.mdout.2: 12 frames processed
>>>> _MMPBSA_complex_gb.mdout.3 & _MMPBSA_receptor_gb.mdout.3: 12 frames processed
>>>> 13+13+12+12 = 50, so that is fine.
>>>>
>>>> _MMPBSA_ligand_gb.mdout.0: 1 frame processed, then crashed
>>>> _MMPBSA_ligand_gb.mdout.1: 1 frame processed, then crashed
>>>> _MMPBSA_ligand_gb.mdout.2: 0 frames processed, then crashed
>>>> _MMPBSA_ligand_gb.mdout.3: 1 frame processed, then crashed
>>>>
>>>> For the serial run, I checked:
>>>> _MMPBSA_complex_gb.mdout: 50 frames processed
>>>> _MMPBSA_receptor_gb.mdout: 62 frames processed
>>>> _MMPBSA_ligand_gb.mdout: 100 frames processed
>>>>
>>>> Could anyone explain what is going on?
>>>>
>>>> Thanks
>>>> Ye
>>>>
>>>>
>>>> On Jul 6, 2011, at 9:51 AM, Ye Fan wrote:
>>>>
>>>>> Hi Bill,
>>>>>
>>>>> Yes, I downloaded bugfix.all from "http://ambermd.org/bugfixesat.html" for
>>>>> version 1.5 and patched the source code before compiling. If you tell me
>>>>> which particular bug you are looking for, I can check for it manually.
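>>>>> (I applied it the standard way described on that page, roughly:
>>>>>
>>>>>     cd $AMBERHOME
>>>>>     patch -p0 -N < bugfix.all
>>>>>
>>>>> then recompiled; the exact flags above are from memory.)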
>>>>>
>>>>> Thanks
>>>>> Ye
>>>>>
>>>>> On Jul 5, 2011, at 5:14 PM, Bill Miller III wrote:
>>>>>
>>>>>> Have you applied all the AmberTools 1.5 bugfixes? This was a bug that I
>>>>>> found in the code a while back, and I thought it had been added to the
>>>>>> known patches.
>>>>>>
>>>>>> -Bill
>>>>>>
>>>>>> On Tue, Jul 5, 2011 at 3:53 PM, Ye Fan <yefan.ncsa.uiuc.edu> wrote:
>>>>>>
>>>>>>> Hi Bill,
>>>>>>>
>>>>>>> This is the last part of _MMPBSA_receptor_pb.mdout file:
>>>>>>>
>>>>>>>
>>>>>>> --------------------------------------------------------------------------------
>>>>>>>    4.  RESULTS
>>>>>>> --------------------------------------------------------------------------------
>>>>>>>
>>>>>>> POST-PROCESSING OF TRAJECTORY ENERGIES
>>>>>>> trajectory generated by ptraj
>>>>>>> minimizing coord set #       1
>>>>>>>  Total surface charge          1.9723
>>>>>>>  Reaction field energy     -3570.1196
>>>>>>>  Cavity solvation energy      54.6777
>>>>>>>
>>>>>>> Maximum number of minimization cycles reached.
>>>>>>>
>>>>>>>                    FINAL RESULTS
>>>>>>>
>>>>>>>   NSTEP       ENERGY          RMS            GMAX         NAME    NUMBER
>>>>>>>       1      -3.1384E+03     1.7292E+01     1.0289E+02     N         680
>>>>>>>
>>>>>>>  BOND    =      329.2644  ANGLE   =      939.0290  DIHED      =     1182.0229
>>>>>>>  VDWAALS =     -670.5602  EEL     =    -7995.5664  EPB        =    -3570.1813
>>>>>>>  1-4 VDW =      394.1857  1-4 EEL =     6198.7732  RESTRAINT  =        0.0000
>>>>>>>  ECAVITY =       54.6777  EDISPER =        0.0000
>>>>>>> minimization completed, ENE= -.31383551E+04 RMS= 0.172922E+02
>>>>>>> minimizing coord set #       2
>>>>>>>    67.611999999999995        5.7789999999999999       0.63300000000000001
>>>>>>> pb_fdfrc(): Atom out of focusing box         251          59          44
>>>>>>>
>>>>>>> ------------------------------------------------------------------------------------------------------------------------
>>>>>>>
>>>>>>> I hope it helps.
>>>>>>>
>>>>>>> Thanks
>>>>>>> Ye
>>>>>>>
>>>>>>> On Jul 5, 2011, at 11:57 AM, Bill Miller III wrote:
>>>>>>>
>>>>>>>> What is at the end of the _MMPBSA_receptor_pb.mdout file? There should
>>>>>>>> be an error message or warning in that file that helps explain the
>>>>>>>> error, based on where the calculation ended.
>>>>>>>>
>>>>>>>> -Bill
>>>>>>>>
>>>>>>>> On Tue, Jul 5, 2011 at 12:50 PM, Ye Fan <yefan.ncsa.uiuc.edu> wrote:
>>>>>>>>
>>>>>>>>> Hi Jason,
>>>>>>>>>
>>>>>>>>> I have added "use_sander=1" to the &general section of the input file.
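>>>>>>>>> (The relevant part of the input file looks roughly like this; apart from
>>>>>>>>> use_sander=1, the variable values shown are illustrative:
>>>>>>>>>
>>>>>>>>>     &general
>>>>>>>>>        startframe=5251, endframe=5300, interval=1,
>>>>>>>>>        use_sander=1,
>>>>>>>>>     /
>>>>>>>>>
>>>>>>>>> plus the &gb and &pb sections.)
>>>>>>>>>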
>>>>>>>>> However, the computation failed with a sander error:
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> ================================================================================
>>>>>>>>> Reading command-line arguments and input files...
>>>>>>>>> Loading and checking parameter files for compatibility...
>>>>>>>>> ptraj found! Using /u/ncsa/yefan/apps/amber11-gnu/bin/ptraj
>>>>>>>>> sander found! Using /u/ncsa/yefan/apps/amber11-gnu/bin/sander for GB
>>>>>>>>> calculations
>>>>>>>>> sander found! Using /u/ncsa/yefan/apps/amber11-gnu/bin/sander for PB
>>>>>>>>> calculations
>>>>>>>>> Preparing trajectories for simulation...
>>>>>>>>> 50 frames were read in and processed by ptraj for use in calculation.
>>>>>>>>>
>>>>>>>>> Beginning GB calculations with sander...
>>>>>>>>> calculating complex contribution...
>>>>>>>>> calculating receptor contribution...
>>>>>>>>> calculating ligand contribution...
>>>>>>>>>
>>>>>>>>> Beginning PB calculations with sander...
>>>>>>>>> calculating complex contribution...
>>>>>>>>> calculating receptor contribution...
>>>>>>>>> Error: sander error during PB calculations!
>>>>>>>>> NOTE: All files have been retained for debugging purposes. Type MMPBSA.py --clean to erase these files.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> ================================================================================
>>>>>>>>>
>>>>>>>>> I have a serial sander built with the GNU compilers.
>>>>>>>>>
>>>>>>>>> So, I went into the Amber11 test folder and ran `make test.serial.MM`
>>>>>>>>> (I wanted to test the sander build). It failed at:
>>>>>>>>> ========================================
>>>>>>>>> CALCULATING TEST: 02_MMPBSA_Stability
>>>>>>>>> ./Run.mmpbsa.test: Program error
>>>>>>>>> make: *** [test.mm_pbsa] Error 1
>>>>>>>>> ========================================
>>>>>>>>>
>>>>>>>>> I also ran the tests under $AMBERHOME/AmberTools/test/mmpbsa_py.
>>>>>>>>>
>>>>>>>>> It failed at:
>>>>>>>>> ===========================
>>>>>>>>> cd 06_NAB_Nmode && ./Run.nmode
>>>>>>>>> diffing FINAL_RESULTS_MMPBSA.dat.save with FINAL_RESULTS_MMPBSA.dat
>
> _______________________________________________
> AMBER mailing list
> AMBER.ambermd.org
> http://lists.ambermd.org/mailman/listinfo/amber
>


_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Tue Jul 19 2011 - 11:00:04 PDT