Re: [AMBER] Fluctuation in GPU usage

From: Ross Walker <>
Date: Tue, 15 Jun 2021 13:52:10 -0400

Hi Aravind,

GPUs are not designed to be shared. More to the point, unlike AMBER, Gromacs makes heavy use of all the resources on a node, including all CPU cores, and it also heavily loads the PCI-E bus because of how much data it shuttles back and forth between CPU and GPU. As such, I would highly recommend never trying to run Gromacs on a shared node. It can be done, but you need to configure the queuing system and cgroups properly so that each job is isolated to only the resources the queuing system allocated to it.
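As an illustration of that kind of isolation, here is a minimal batch-script sketch assuming the scheduler is Slurm with device-constraining cgroups enabled (the original post does not name the scheduler, so the directives and file names here are assumptions):

```shell
#!/bin/bash
#SBATCH --gres=gpu:1          # request exactly one GPU from the scheduler
#SBATCH --cpus-per-task=1     # and exactly one CPU core
#
# With ConstrainDevices=yes set in Slurm's cgroup.conf, the job's cgroup
# hides every GPU that was not allocated to this job, so a Gromacs job on
# the same node cannot touch this job's device.
srun pmemd.cuda -O -i md.in -p prmtop -c inpcrd -o md.out
```

Without the cgroup device constraint, the `--gres` request only does bookkeeping; a misconfigured job can still open any GPU on the node.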

The simple solution is not to run Gromacs at the same time as other calculations, and never to have two jobs sharing a GPU. You can use the CUDA_VISIBLE_DEVICES environment variable to control which GPUs jobs run on, but even then Gromacs will load the CPU and PCI-E bus so heavily that it will likely still impact other jobs running on the same node.
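For concreteness, a minimal sketch of pinning a job to a single GPU (the pmemd.cuda invocation is illustrative; input file names are assumptions):

```shell
# Restrict this shell, and any CUDA program launched from it, to GPU 0.
# Devices are renumbered inside the process, so the job sees one GPU, id 0.
export CUDA_VISIBLE_DEVICES=0

# A job launched now would only ever touch physical GPU 0, e.g.:
# pmemd.cuda -O -i md.in -p prmtop -c inpcrd -o md.out
# A second job's script would instead set CUDA_VISIBLE_DEVICES=1.

echo "Job restricted to GPU(s): ${CUDA_VISIBLE_DEVICES}"
```

Note this only controls device visibility; it does nothing to stop the other job from saturating the CPU cores or the PCI-E bus.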

All the best

> On Jun 15, 2021, at 13:34, Aravind R <> wrote:
> Dear Amber Experts,
> I am using Amber (pmemd.cuda) on a cluster with 64 nodes, each containing 2
> Nvidia Tesla K40m GPUs (not peer-to-peer connected). This is a shared cluster
> across the institute. When Amber runs alone on a single GPU with a single
> core, it reaches 99% GPU utilisation and runs properly (attached
> Only_amber.png).
> But when a Gromacs job is submitted to the same node, the GPU utilisation
> decreases drastically (attached Amber_&_gromacs.png), and with it the speed
> at which the job runs (attached Speed.png), whereas the Gromacs job utilises
> the GPU completely.
> The script I use to submit the job is attached (Input_script.png),
> along with the top command results (Top_results.png).
> Also, the scheduler allows Gromacs to run on a GPU when Amber is already
> running, but not vice versa.
> What is the reason for the decrease in GPU utilisation?
> Thanks in advance,
> Aravind R
> <Amber___gromacs.png><Input_script.png><Speed.png><Only_amber.png><Top_results.png>

AMBER mailing list
Received on Tue Jun 15 2021 - 11:00:03 PDT