Re: [AMBER] CUDA and driver version for Nvidia Titan Black and Titan Z GPUs cluster

From: Ross Walker <ross.rosswalker.co.uk>
Date: Mon, 10 Sep 2018 18:12:49 -0400

Agreed, modules is the way to go, but if you don't have the time or expertise to set up a modules environment you can conceivably run AMBER 14 and AMBER 18 side by side by doing the following:

1) Install latest NVIDIA Driver
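
A quick way to confirm the driver installed cleanly and to see which version you ended up with (just a sanity check; the exact version depends on what NVIDIA currently ships for these Kepler cards):

nvidia-smi                                                    # lists the GPUs and the driver version
nvidia-smi --query-gpu=driver_version --format=csv,noheader   # driver version string only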

2) Install CUDA 7.5, needed for AMBER 14

3) Install CUDA 9.2, recommended for AMBER 18
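
For reference, a rough sketch of how the two toolkits can sit side by side using the .run installers. The file names and flags below are from memory and differ between CUDA releases, so check ./cuda_*.run --help before running anything:

# Install only the toolkit component (not the bundled driver) into its own versioned directory
sudo sh cuda_7.5.18_linux.run --silent --toolkit --toolkitpath=/usr/local/cuda-7.5
sudo sh cuda_9.2.148_396.37_linux.run --silent --toolkit --toolkitpath=/usr/local/cuda-9.2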

4) Set your LD_LIBRARY_PATH as follows, assuming CUDA was installed under /usr/local:

export CUDA_HOME=/usr/local/cuda
export CUDA75_HOME=/usr/local/cuda-7.5
export LD_LIBRARY_PATH=$CUDA_HOME/lib:$CUDA_HOME/lib64:$CUDA75_HOME/lib:$CUDA75_HOME/lib64:$LD_LIBRARY_PATH
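
If you want those variables set for every login shell rather than just the current session, one option (a sketch; adapt to however you manage shell startup on your cluster) is a profile.d script:

# Make the CUDA environment available to all login shells
sudo tee /etc/profile.d/cuda.sh > /dev/null << 'EOF'
export CUDA_HOME=/usr/local/cuda
export CUDA75_HOME=/usr/local/cuda-7.5
export LD_LIBRARY_PATH=$CUDA_HOME/lib:$CUDA_HOME/lib64:$CUDA75_HOME/lib:$CUDA75_HOME/lib64:$LD_LIBRARY_PATH
EOF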

5) Point the /usr/local/cuda link to /usr/local/cuda-7.5

rm -f /usr/local/cuda
ln -s /usr/local/cuda-7.5 /usr/local/cuda
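
(These need root, or sudo.) Before building it's worth confirming the link points where you expect:

readlink -f /usr/local/cuda          # should print /usr/local/cuda-7.5
/usr/local/cuda/bin/nvcc --version   # should report release 7.5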

6) Compile AMBER 14
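
For anyone who hasn't done the GPU build before, it looks roughly like this. The install path is just a placeholder for wherever you untarred AMBER 14, and bear in mind Dave's suggestion below about using the older Ubuntu 14.04 compilers for the AMBER 14 / CUDA 7.5 build:

export AMBERHOME=/opt/amber14        # placeholder - wherever your AMBER 14 tree lives
cd $AMBERHOME
./configure -cuda gnu                # GPU (pmemd.cuda) build with the GNU compilers
make install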

7) Point the /usr/local/cuda link to /usr/local/cuda-9.2

rm -f /usr/local/cuda
ln -s /usr/local/cuda-9.2 /usr/local/cuda

8) Compile AMBER 18
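
Same pattern as step 6, now picking up 9.2 through the repointed link (the path is again a placeholder):

export AMBERHOME=/opt/amber18        # placeholder - wherever your AMBER 18 tree lives
cd $AMBERHOME
/usr/local/cuda/bin/nvcc --version   # sanity check: should now report release 9.2
./configure -cuda gnu
make install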

You should then be able to use both AMBER 14 and AMBER 18, setting AMBERHOME correctly each time, even though your default CUDA is now 9.2.
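
In practice that just means sourcing the right amber.sh before each job, something like the following (paths and input files are placeholders):

# Run with AMBER 14 - the CUDA 7.5 libraries are already on LD_LIBRARY_PATH from step 4
source /opt/amber14/amber.sh
$AMBERHOME/bin/pmemd.cuda -O -i md.in -p prmtop -c inpcrd -o md14.out

# Run with AMBER 18
source /opt/amber18/amber.sh
$AMBERHOME/bin/pmemd.cuda -O -i md.in -p prmtop -c inpcrd -o md18.out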

Note this is way easier to do if you have CUDA defined as modules, since then you just load the right CUDA module before compiling / running AMBER. If you build an AMBER module that depends on the right CUDA module, it's easier still.
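
With modules in place the whole dance above collapses to something like the following. The module names here are made up; they'll be whatever your site defines:

module purge
module load cuda/7.5 amber/14   # older stack

module purge
module load amber/18            # if the amber/18 modulefile depends on cuda/9.2, it gets loaded for you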

Ultimately though, why does one need both AMBER 14 and AMBER 18 available? I'd suggest just sticking with AMBER 18 for everything. The only reason for needing AMBER 14 is if you want to run on GPUs that are no longer supported by AMBER 18 (a C2070, for example), but at this point the power efficiency of those older GPUs is such that you are probably better off simply not using them.

All the best
Ross


> On Sep 10, 2018, at 4:33 PM, Ryan Novosielski <novosirj.rutgers.edu> wrote:
>
> On 09/10/2018 11:20 AM, David A Case wrote:
>> On Fri, Sep 07, 2018, Karolina MitusiƄska (Markowska) wrote:
>>>
>>> I would like to ask you about cluster configuration. We are using
>>> Amber14 on our Nvidia Titan Black and Titan Z clusters. We are
>>> now running on CUDA 6.5 and 343.19 driver. Because we are also
>>> running on Ubuntu 14.04 and it's a bit old fashioned now, we want
>>> to upgrade to Ubuntu 18.04 and also (for other reasons) to
>>> Amber18.
>>>
>>> Can you suggest the driver and CUDA version that will suit us?
>>> And will that configuration run smoothly with Amber14 also?
>>
>> It may (or may not) be involved to have a single configuration that
>> works well with both Amber14 and Amber18. The GPU code in Amber18
>> is *much* better than that in Amber14; are you sure you need the
>> latter?
>>
>> Amber18 will work fine with Ubuntu 18.04, using gcc7 and CUDA
>> version 9.2, and current NVIDIA drivers (not sure what the number
>> is).
>>
>> To support Amber14 (roughly): use the "alternatives" apt-get
>> capability to install whatever compilers you currently have on
>> Ubuntu 14.04. Then, you can try to use those compilers to compile
>> Amber14's pmemd.cuda, and see if that executable will work with the
>> current CUDA and NVIDIA drivers. If it works, (and, as far as I
>> can tell, there is a good chance it will) you are set to go.
>>
>> (You *could* try compiling Amber14 pmemd.cuda with gcc7/gfortran7,
>> perhaps fixing things that are broken.)
>>
>> If not, I don't know how to proceed without having a dual-boot or
>> pair of virtual machines, one with your current configuration and
>> one with Ubuntu18.04. Maybe someone on the list has a better
>> idea.
>
> This is the sort of situation we handle via modules; currently, I
> think most people would recommend Lmod, but there's the Tcl based
> environment package as well. You can have one copy compiled with one
> and one with the other. I think the real thing to look out for is that
> you have a driver version that will work with both. So far, I've not
> had a problem with backward compatibility and NVIDIA drivers (we use
> M2070 cards all the way up to P100 cards with the same driver -- I've
> not tried very old CUDA with it though).
>
> --
> ____
> || \\UTGERS, |----------------------*O*------------------------
> ||_// the State | Ryan Novosielski - novosirj.rutgers.edu
> || \\ University | Sr. Technologist - 973/972.0922 ~*~ RBHS Campus
> || \\ of NJ | Office of Advanced Res. Comp. - MSB C630, Newark
> `'


_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Mon Sep 10 2018 - 15:30:01 PDT