Re: [AMBER] Nvidia RTX 3080/3090 Servers

From: mish <smncbr.gmail.com>
Date: Fri, 22 Jan 2021 08:16:34 -0600

Thanks a million Ross.

On Fri, Jan 22, 2021, 08:04 Ross Walker <ross.rosswalker.co.uk> wrote:

> Hi Mish,
>
> It works fine as long as you are using a properly ducted server that is
> designed for 8 cards, and you set the case fans to full speed. It
> will sound like a jet engine, so you do not want to install this in an
> office. It also requires 2 x 220V 20A power feeds (4 if you want
> redundancy), so you can't run it off a standard US office circuit, but
> most data centers support this fine.
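>
> For a rough sense of why two 220V feeds are needed, here is a back-of-envelope
> power-budget sketch in Python (the per-card and host-overhead figures below are
> assumptions, not measurements):
>
>     # Rough power budget for an 8 x RTX 3090 node (assumed figures).
>     GPU_POWER_W = 350        # approximate stock power limit of a 3090
>     NUM_GPUS = 8
>     HOST_OVERHEAD_W = 1000   # CPUs, memory, fans, storage (rough guess)
>     PSU_EFFICIENCY = 0.92    # typical high-efficiency PSU
>
>     load_w = NUM_GPUS * GPU_POWER_W + HOST_OVERHEAD_W   # ~3800 W at the rails
>     wall_w = load_w / PSU_EFFICIENCY                    # ~4130 W from the wall
>
>     # A 220 V, 20 A circuit derated to 80% for continuous load:
>     feed_w = 220 * 20 * 0.8                             # ~3520 W per feed
>     print(f"Estimated draw {wall_w:.0f} W vs {feed_w:.0f} W per feed")
>     # The estimated draw exceeds one feed, so the load must be split across two.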
>
> I have tested this extensively with 8 x 3090 GPUs. I have not tried
> 10 x 3090s, and I suspect that would not work due to power constraints,
> even though there is physical space.
>
> You need to use the new Gigabyte 2-slot-wide 3090s that also have the
> power connectors going into the back of the card, just like the Tesla cards.
> Pictures attached.
>
> The NVIDIA EULA argument has been debunked multiple times. Honestly, I
> would just hang up on any vendor that spouts that nonsense, since it just
> tells you they don't really know what they are talking about. If you are
> worried about the EULA, just use the Tesla drivers anyway, which don't have
> the clause people refer to in the actual license agreement. E.g.:
>
>
> https://us.download.nvidia.com/tesla/460.32.03/NVIDIA-Linux-x86_64-460.32.03.run
>
> And the license in question:
> https://us.download.nvidia.com/tesla/460.32.03/NVIDIA-Linux-x86_64-460.32.03.run
> - note that section 2.1.3 (Limitations) does not contain the data-center
> terms that are in the GeForce drivers.
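>
> After installing that .run file, a quick sanity check that the Tesla-branch
> driver is the one in use and that all eight cards are visible could look like
> this (a minimal Python sketch; the version string shown is just an example):
>
>     # List each GPU with its index, name, and driver version via nvidia-smi.
>     import subprocess
>
>     out = subprocess.run(
>         ["nvidia-smi",
>          "--query-gpu=index,name,driver_version",
>          "--format=csv,noheader"],
>         capture_output=True, text=True, check=True,
>     ).stdout
>     print(out)
>     # Expect one line per card, e.g. "0, GeForce RTX 3090, 460.32.03"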
>
> Hope that helps, I'll ping you directly with some suggestions of companies
> you can talk to.
>
> All the best
> Ross
>
>
>
> > On Jan 21, 2021, at 21:48, mish <smncbr.gmail.com> wrote:
> >
> > Dear all,
> >
> >
> > Sorry for this slightly off-topic question.
> >
> > Has anyone been using 8-10x Nvidia RTX 3080/3090 servers, and could you
> > give us some input about heating issues? We are getting very different and
> > contradictory advice from a few vendors. We are thinking of getting
> > one 8x GPU server that will be used only within our team so that it
> > doesn't violate Nvidia's EULA. Any suggestions for a $30-35K
> > GPU server for running Amber would be appreciated.
> >
> > Best,
> > Mish
> > _______________________________________________
> > AMBER mailing list
> > AMBER.ambermd.org
> > http://lists.ambermd.org/mailman/listinfo/amber
>
> _______________________________________________
> AMBER mailing list
> AMBER.ambermd.org
> http://lists.ambermd.org/mailman/listinfo/amber
>
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Fri Jan 22 2021 - 06:30:05 PST