Re: [AMBER] Provisional RTX2080 AMBER 18 Performance Numbers

From: Marek Maly <marek.maly.ujep.cz>
Date: Mon, 08 Oct 2018 20:44:13 +0200

Hi David,

thank you for the web page update. The RTX 2080 is already listed,
but the "RTX 2080 Ti" is still missing - why?

Is it because only the "RTX 2080" has been successfully tested so far,
which, in your opinion, does not guarantee that the "RTX 2080 Ti" will
also be OK?

    Best wishes

     Marek




On Mon, 08 Oct 2018 13:28:08 +0200, David Cerutti <dscerutti.gmail.com>
wrote:

> I've queued the update for the web pages.
>
> DSC
>
>
> On Mon, Oct 8, 2018 at 6:15 AM Marek Maly <marek.maly.ujep.cz> wrote:
>
>> Hi Ross,
>>
>> thanks for info !
>>
>> 1) So in your opinion, two "RTX2080 Ti" cards in a 2U or 4U server
>> should be OK with regard to cooling, am I right? Presumably, even with
>> just 2 GPUs per server, at least one empty PCI-E slot between the GPUs
>> will still be necessary (or at least advisable), am I right?
>>
>>
>> 2) Would it be possible to add the "RTX2080" and "RTX2080 Ti" GPUs to
>> the list of supported Amber18 GPUs soon - e.g. here:
>> http://ambermd.org/GPUHardware.php ?
>> We use this web page in our hardware specifications when buying new
>> CPU/GPU nodes.
>>
>> Thanks,
>>
>> Best wishes,
>>
>> Marek
>>
>>
>>
>>
>>
>> On Fri, 05 Oct 2018 15:39:30 +0200, Ross Walker <ross.rosswalker.co.uk>
>> wrote:
>>
>> > Hi Marek,
>> >
>> > My expectation for the 2080TI is ~25 to 30% over the 1080TI. I should
>> > have some firm numbers in about 2 to 3 weeks. Note the cooling design
>> > is the same, so the founders/reference design cards will have the same
>> > issues with multiple cards in a box as the RTX2080 and will need
>> > custom cooling solutions.
>> >
>> > In terms of performance per dollar, things do not look good due to
>> > NVIDIA's price inflation. You are looking at a 70+% increase in price
>> > for a ~30% increase in performance, an increase that historically came
>> > for free with each new generation of hardware, so I would not exactly
>> > call the RTX2080TI a technological improvement over the 1080TI. In
>> > real terms it is a step backwards by around 4 years.
>> >
>> > All the best
>> > Ross
>> >
>> >> On Oct 5, 2018, at 09:17, Marek Maly <marek.maly.ujep.cz> wrote:
>> >>
>> >> Hi Ross,
>> >>
>> >> thanks a lot for this first RTX2080/Amber18 tests and important
>> >> technical comments.
>> >>
>> >> I think your "1080Ti vs 2080/Amber18" results are in fairly good
>> >> agreement with the comparison of these two GPUs at
>> >>
>> >> https://www.videocardbenchmark.net/high_end_gpus.html
>> >>
>> >> But what I am really looking forward to (also because we are planning
>> >> some HW updates ...) are Amber18 benchmarks with the "RTX 2080 Ti",
>> >> where the difference relative to the 1080 Ti seems to be larger, and
>> >> let's hope the cooling of this current top model will also be a bit
>> >> more satisfactory...
>> >>
>> >> When do you think the provisional "2080 Ti/Amber18" benchmarks might
>> >> be available?
>> >>
>> >> Best wishes,
>> >>
>> >> Marek
>> >>
>> >>
>> >>
>> >>
>> >>
>> >> On Fri, 05 Oct 2018 14:32:07 +0200, Ross Walker <ross.rosswalker.co.uk>
>> >> wrote:
>> >>
>> >>> TLDNR: NVIDIA RTX2080 works with AMBER 18, gives the correct answers
>> >>> in provisional tests and gets performance equivalent to a 1080TI as
>> >>> long as you don't put more than 2 in a box without some kind of
>> >>> custom cooling solution. NVIDIA price inflation is alive and kicking,
>> >>> so perf per dollar is down ~15%.
>> >>>
>> >>>
>> >>> Dear Amberites
>> >>>
>> >>> I have finally managed to get my hands on some reference design
>> >>> RTX2080 GPUs (another month or so for the 2080TI) and had a chance
>> >>> to test them with AMBER 18. First impressions are that the reference
>> >>> design cooler for the RTX series is crap.
>> >>>
>> >>>
>> >>>
>> >>> Unless there is a big space (at least 1 PCI-E unit, but ideally 2)
>> >>> between cards, the cards massively overheat even when running their
>> >>> fans at 100%, which causes them to significantly downclock. The
>> >>> following is an example with 4 cards running at once:
>> >>>
>> >>> Thu Oct 4 21:08:08 2018
>> >>> NVIDIA-SMI 410.57                Driver Version: 410.57
>> >>>
>> >>> GPU  Name              Bus-Id             Fan   Temp  Perf  Pwr:Usage/Cap  Memory-Usage       GPU-Util
>> >>>  0   GeForce RTX 2080  00000000:19:00.0    42%   76C   P2   178W / 215W    281MiB / 7952MiB     95%
>> >>>  1   GeForce RTX 2080  00000000:1A:00.0    90%   87C   P2   112W / 215W    281MiB / 7952MiB     96%
>> >>>  2   GeForce RTX 2080  00000000:67:00.0    93%   87C   P2   100W / 215W    281MiB / 7952MiB     96%
>> >>>  3   GeForce RTX 2080  00000000:68:00.0   100%   87C   P2    74W / 215W    281MiB / 7951MiB     97%
>> >>> Note the top card runs well, maintaining 76C at 178W with the fan at
>> >>> just 42%. The remaining cards are close to 100% fan speed, thermally
>> >>> limited at 87C and only drawing ~100W. That means they have clocked
>> >>> down significantly and are still overheating. I am working with
>> >>> Exxact to engineer a solution and I am confident we can get these
>> >>> working in 4 and 8xGPU configs, but for the time being, if you are
>> >>> building your own stock machines, do not put more than 2 of these in
>> >>> a box and make sure you space them out. PNY are going to make more
>> >>> traditional blower design versions of the RTX GPUs, which hopefully
>> >>> will not have this cooling issue. I should have a chance to test some
>> >>> of those in a few weeks.
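>> >>>
>> >>> If you want to keep an eye on this downclocking yourself during long
>> >>> runs, a minimal polling sketch along the following lines will do it
>> >>> (it only assumes nvidia-smi is on the PATH; the 85C alert threshold
>> >>> is an arbitrary choice, not an NVIDIA limit):
>> >>>
>> >>> # watch_gpus.py - print temperature, SM clock, power draw and fan
>> >>> # speed for every GPU every 10 seconds; flag cards near the limit.
>> >>> import subprocess, time
>> >>>
>> >>> FIELDS = "index,temperature.gpu,clocks.sm,power.draw,fan.speed"
>> >>>
>> >>> while True:
>> >>>     out = subprocess.run(
>> >>>         ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
>> >>>         capture_output=True, text=True, check=True).stdout
>> >>>     for line in out.strip().splitlines():
>> >>>         idx, temp, clock, power, fan = [f.strip() for f in line.split(",")]
>> >>>         flag = "  <-- near thermal limit" if int(temp) >= 85 else ""
>> >>>         print("GPU %s: %sC, %s, %s, fan %s%s" % (idx, temp, clock, power, fan, flag))
>> >>>     time.sleep(10)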
>> >>>
>> >>> The good news is that the AMBER 18 test cases, and the validation
>> >>> suites I have, all pass. Performance of the RTX2080 is on par with
>> >>> the 1080TI (as long as you have the cards spaced out or have an
>> >>> auxiliary cooling solution), which is what history, and ratioing the
>> >>> flop counts, would have us expect.
>> >>>
>> >>> Note these are provisional numbers for the 2080. Performance may
>> >>> improve some once optimizations for the SM7.5 hardware have been
>> >>> made, although I wouldn't expect any miracles.
>> >>>
>> >>> JAC_PRODUCTION_NVE - 23,558 atoms PME 4fs
>> >>> -----------------------------------------
>> >>> 2080   1 x GPU: | ns/day =  761.01   seconds/ns =  113.53
>> >>> 1080TI 1 x GPU: | ns/day =  776.83   seconds/ns =  111.22
>> >>>
>> >>> JAC_PRODUCTION_NPT - 23,558 atoms PME 4fs
>> >>> -----------------------------------------
>> >>> 2080   1 x GPU: | ns/day =  713.93   seconds/ns =  121.02
>> >>> 1080TI 1 x GPU: | ns/day =  733.55   seconds/ns =  117.78
>> >>>
>> >>> JAC_PRODUCTION_NVE - 23,558 atoms PME 2fs
>> >>> -----------------------------------------
>> >>> 2080   1 x GPU: | ns/day =  399.97   seconds/ns =  216.02
>> >>> 1080TI 1 x GPU: | ns/day =  409.26   seconds/ns =  211.11
>> >>>
>> >>> JAC_PRODUCTION_NPT - 23,558 atoms PME 2fs
>> >>> -----------------------------------------
>> >>> 2080   1 x GPU: | ns/day =  367.69   seconds/ns =  234.98
>> >>> 1080TI 1 x GPU: | ns/day =  377.10   seconds/ns =  229.12
>> >>>
>> >>> FACTOR_IX_PRODUCTION_NVE - 90,906 atoms PME
>> >>> -------------------------------------------
>> >>> 2080   1 x GPU: | ns/day =  130.48   seconds/ns =  662.19
>> >>> 1080TI 1 x GPU: | ns/day =  121.95   seconds/ns =  708.48
>> >>>
>> >>> FACTOR_IX_PRODUCTION_NPT - 90,906 atoms PME
>> >>> -------------------------------------------
>> >>> 2080   1 x GPU: | ns/day =  123.86   seconds/ns =  697.55
>> >>> 1080TI 1 x GPU: | ns/day =  113.46   seconds/ns =  761.48
>> >>>
>> >>> CELLULOSE_PRODUCTION_NVE - 408,609 atoms PME
>> >>> --------------------------------------------
>> >>> 2080   1 x GPU: | ns/day =   26.73   seconds/ns = 3232.72
>> >>> 1080TI 1 x GPU: | ns/day =   26.14   seconds/ns = 3305.84
>> >>>
>> >>> CELLULOSE_PRODUCTION_NPT - 408,609 atoms PME
>> >>> --------------------------------------------
>> >>> 2080   1 x GPU: | ns/day =   25.30   seconds/ns = 3414.99
>> >>> 1080TI 1 x GPU: | ns/day =   24.72   seconds/ns = 3495.08
>> >>>
>> >>> STMV_PRODUCTION_NPT - 1,067,095 atoms PME
>> >>> -----------------------------------------
>> >>> 2080   1 x GPU: | ns/day =   16.17   seconds/ns = 5344.36
>> >>> 1080TI 1 x GPU: | ns/day =   15.09   seconds/ns = 5727.20
>> >>>
>> >>> TRPCAGE_PRODUCTION - 304 atoms GB
>> >>> ---------------------------------
>> >>> 2080   1 x GPU: | ns/day = 1191.46   seconds/ns =   72.52
>> >>> 1080TI 1 x GPU: | ns/day = 1301.05   seconds/ns =   66.41
>> >>>
>> >>> MYOGLOBIN_PRODUCTION - 2,492 atoms GB
>> >>> -------------------------------------
>> >>> 2080   1 x GPU: | ns/day =  490.54   seconds/ns =  176.13
>> >>> 1080TI 1 x GPU: | ns/day =  448.95   seconds/ns =  192.45
>> >>>
>> >>> NUCLEOSOME_PRODUCTION - 25,095 atoms GB
>> >>> ---------------------------------------
>> >>> 2080   1 x GPU: | ns/day =   10.89   seconds/ns = 7935.83
>> >>> 1080TI 1 x GPU: | ns/day =   10.14   seconds/ns = 8520.32
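>> >>>
>> >>> Aside: the two columns above are the same measurement in different
>> >>> units, since seconds/ns = 86400 / (ns/day). For example, checking the
>> >>> JAC 4fs NVE row in Python:
>> >>>
>> >>> print(86400 / 761.01)   # ~113.53, matching the seconds/ns column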
>> >>>
>> >>> Note that if you compare against all 4 GPUs running at once with no
>> >>> additional cooling, the clocking down of the 2080s is obvious.
>> >>>
>> >>> JAC_PRODUCTION_NVE - 23,558 atoms PME 4fs - Running 4 independent
>> >>> calculations at once.
>> >>>
>> >>> 2080
>> >>> [0] 1 x GPU: | ns/day =  761.27   seconds/ns =  113.49
>> >>> [1] 1 x GPU: | ns/day =  676.54   seconds/ns =  127.71
>> >>> [2] 1 x GPU: | ns/day =  649.64   seconds/ns =  133.00
>> >>> [3] 1 x GPU: | ns/day =  441.83   seconds/ns =  195.55
>> >>>
>> >>> 1080TI
>> >>> [0] 1 x GPU: | ns/day =  776.57   seconds/ns =  111.26
>> >>> [1] 1 x GPU: | ns/day =  779.75   seconds/ns =  110.81
>> >>> [2] 1 x GPU: | ns/day =  773.42   seconds/ns =  111.71
>> >>> [3] 1 x GPU: | ns/day =  747.29   seconds/ns =  115.62
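>> >>>
>> >>> For anyone reproducing the 4-independent-runs numbers above, the
>> >>> usual approach is one pmemd.cuda process per card, each pinned to its
>> >>> own device with CUDA_VISIBLE_DEVICES. A rough launcher sketch (the
>> >>> mdin/prmtop/inpcrd file names are placeholders for your own
>> >>> benchmark inputs):
>> >>>
>> >>> # run_per_gpu.py - one independent pmemd.cuda job per GPU
>> >>> import os, subprocess
>> >>>
>> >>> NUM_GPUS = 4
>> >>> procs = []
>> >>> for gpu in range(NUM_GPUS):
>> >>>     env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu))
>> >>>     cmd = ["pmemd.cuda", "-O",
>> >>>            "-i", "mdin", "-p", "prmtop", "-c", "inpcrd",
>> >>>            "-o", "mdout.gpu%d" % gpu, "-r", "restrt.gpu%d" % gpu,
>> >>>            "-x", "mdcrd.gpu%d" % gpu]
>> >>>     procs.append(subprocess.Popen(cmd, env=env))
>> >>>
>> >>> for p in procs:
>> >>>     p.wait()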
>> >>>
>> >>> So in summary:
>> >>>
>> >>> The 2080 works with AMBER 18, gives the correct answers in
>> >>> provisional tests, and gets performance equivalent to a 1080TI as
>> >>> long as you don't put more than 2 in a box without some kind of
>> >>> custom cooling solution.
>> >>>
>> >>> Of course this is 'modern' NVIDIA, so price inflation is the name of
>> >>> the game: while the performance matches, the performance per dollar
>> >>> is significantly worse. The 1080TI Founders MSRP was $699 and the
>> >>> 2080 Founders MSRP is $799, so performance per dollar has decreased
>> >>> by approximately 15%.
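>> >>>
>> >>> For the record, that figure follows directly from the MSRPs above,
>> >>> assuming roughly equal ns/day on the two cards:
>> >>>
>> >>> print(699.0 / 799.0)   # ~0.87: perf/$ of the 2080 relative to the 1080TI at equal performance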
>> >>>
>> >>> All the best
>> >>> Ross
>> >>>
>> >>>
>> >>>
>> >>>
>> >>> _______________________________________________
>> >>> AMBER mailing list
>> >>> AMBER.ambermd.org
>> >>> http://lists.ambermd.org/mailman/listinfo/amber
>> >>
>> >>
>> >> --
>> >> Created with Opera's mail application: http://www.opera.com/mail/
>> >
>> >
>> > _______________________________________________
>> > AMBER mailing list
>> > AMBER.ambermd.org
>> > http://lists.ambermd.org/mailman/listinfo/amber
>>
>>
>> --
>> Created with Opera's mail application: http://www.opera.com/mail/
>>
>> _______________________________________________
>> AMBER mailing list
>> AMBER.ambermd.org
>> http://lists.ambermd.org/mailman/listinfo/amber
>>
> _______________________________________________
> AMBER mailing list
> AMBER.ambermd.org
> http://lists.ambermd.org/mailman/listinfo/amber


-- 
Created with Opera's mail application: http://www.opera.com/mail/
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Mon Oct 08 2018 - 12:00:01 PDT