
Review: Gigabyte GeForce RTX 2080 GAMING OC 8G

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 20, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    33,927
    Likes Received:
    2,929
    GPU:
    AMD | NVIDIA
    Onwards to the next one. Gigabyte has released their GeForce RTX series graphics cards as well; in this review we look at their brand-new GeForce RTX 2080, specifically the GAMING OC 8G edit...

    Review: Gigabyte GeForce RTX 2080 GAMING OC 8G
     
    XenthorX likes this.
  2. dmity84

    dmity84 New Member

    Messages:
    8
    Likes Received:
    0
    GPU:
    RTX 2080 FE
    If it's of any use to anyone, my EVGA 1080 Ti SC2 at stock settings (and running in an x8 slot) does the V-Ray benchmark in 66 s.
     
  3. SniperX

    SniperX Member

    Messages:
    38
    Likes Received:
    14
    GPU:
    MSI RTX2080 Duke OC
    Considering the following, I'm pretty impressed that the 2080 can trade blows with the Titan Xp...

    Titan Xp vs RTX 2080
    Transistor count: 12 billion vs 13.6 billion
    CUDA cores: 3,840 vs 2,944
    ROPs: 96 vs 64
    Memory size: 12 GB vs 8 GB
    Memory bus: 384-bit vs 256-bit
    Memory bandwidth: 547 GB/s vs 448 GB/s
    FP32 performance: 12.0 TFLOPS vs 10.0 TFLOPS
    TDP: 250 W vs 215 W
    Launch MSRP: $1,200 vs $799
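    For what it's worth, a quick back-of-the-envelope efficiency comparison derived from the specs above (FP32 TFLOPS per watt and per launch dollar; my own arithmetic, not a measured result):

    ```python
    # Specs quoted above: FP32 TFLOPS, TDP in watts, launch MSRP in USD
    cards = {
        "Titan Xp": {"tflops": 12.0, "tdp": 250, "msrp": 1200},
        "RTX 2080": {"tflops": 10.0, "tdp": 215, "msrp": 799},
    }

    for name, c in cards.items():
        gflops_per_watt = c["tflops"] / c["tdp"] * 1000
        gflops_per_usd = c["tflops"] / c["msrp"] * 1000
        print(f"{name}: {gflops_per_watt:.1f} GFLOPS/W, {gflops_per_usd:.2f} GFLOPS/$")
    ```

    On launch pricing the 2080 comes out clearly ahead per dollar (~12.5 vs 10.0 GFLOPS/$) and roughly even per watt (~46.5 vs 48.0 GFLOPS/W).
    
    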
     
    XenthorX likes this.
  4. Caesar

    Caesar Master Guru

    Messages:
    317
    Likes Received:
    70
    GPU:
    GTX 1070Ti Titanium
    The real question is: why buy it!?

    Please do not answer with: ray tracing... o_O

     

  5. buhehe

    buhehe Member Guru

    Messages:
    140
    Likes Received:
    14
    GPU:
    R9 290 Tri-X Vapor-X
    Except that nobody in their right mind compares this card to the Titan Xp - which was hilariously bad value at that price - but to the 1080ti.
     
    alanm likes this.
  6. fr33k

    fr33k Ancient Guru

    Messages:
    1,916
    Likes Received:
    27
    GPU:
    ASUS STRIX RTX2080
    I think it's been pretty clear so far; most people agree that there is no reason to buy if you're already on a 10xx card, unless you want ray tracing.
     
  7. SniperX

    SniperX Member

    Messages:
    38
    Likes Received:
    14
    GPU:
    MSI RTX2080 Duke OC
    You're missing the point
     
  8. XenthorX

    XenthorX Ancient Guru

    Messages:
    2,127
    Likes Received:
    230
    GPU:
    EVGA XCUltra 2080Ti
    @Hilbert Hagedoorn Can you run/record the Star Wars and Futuremark ray-tracing demos with an RTX 2080 as a comparison, please? I'd like some idea of how the lower RT core count performs and how viable cards other than the Ti are.

    Glad to see Gigabyte's three-fan cooler performing well at full load; I bought the Windforce 2080 Ti.

    Nvidia is really trading blows with its AIB partners... It won't end well for them at this pace.
     
  9. jura11

    jura11 Ancient Guru

    Messages:
    1,755
    Likes Received:
    144
    GPU:
    GTX1080 Ti 1080 FE
    Hi there

    My EVGA GTX 1080 Ti FE at 2,113 MHz does this V-Ray benchmark in 62 seconds. The difference between OC and stock clocks is 4-6 seconds: at stock clocks of 1,911 MHz the time is 66-68 s, and with the 2,113 MHz OC it is 62 s.

    The difference between x8 and x16 is not that big in rendering. I have run the same 3x GPU setup with a 5820K and now with a 5960X, and the difference in GPU-based renderers is minimal, or in some cases zero.

    Your V-Ray result is pretty good; what clocks are you running at stock?

    My 1080 Ti at stock clocks runs at 1,911 MHz.
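    As a rough sanity check on how those clocks map to render time (using only the numbers quoted in this thread):

    ```python
    # Times and clocks quoted above for the EVGA GTX 1080 Ti FE
    stock_mhz, oc_mhz = 1911, 2113
    stock_s, oc_s = 66.0, 62.0

    clock_gain = oc_mhz / stock_mhz - 1   # relative core-clock increase
    speedup = stock_s / oc_s - 1          # relative render-time improvement

    print(f"clock: +{clock_gain:.1%}, render speedup: +{speedup:.1%}")
    ```

    So a ~10.6% core overclock buys only ~6.5% on this benchmark; render time does not scale linearly with core clock, presumably because memory speed and other limits get in the way.
    
    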

    Hope this helps

    Thanks, Jura
     
    Killian38 likes this.
  10. dmity84

    dmity84 New Member

    Messages:
    8
    Likes Received:
    0
    GPU:
    RTX 2080 FE
    Hi,

    I see. I thought x8 would have some effect here; it does a little in 3DMark and gaming, but it's not that bad.

    The stock core boosts to 1,898 or maybe 1,911 MHz max at the beginning; memory is at 5,005 MHz, I think, according to Precision XOC.
    With +55 core and +530 memory it was 62 seconds. I probably lost the silicon lottery, as it won't go over +50 MHz core in 3DMark; the max boost I get is 1,974 MHz.
     

  11. DSC2037

    DSC2037 New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    Nvidia GTX 1070 / 8
    Thanks for the review.

    Can you please explain your choice of the V-Ray benchmark over, let's say, the Blender GPU benchmark?
    The V-Ray benchmark is known not to work on AMD GPUs. When you go to their benchmark results list, you will not find any Radeon or Vega dedicated graphics cards, only Nvidia GPUs, AMD CPUs/APUs, or Intel CPUs.
    Sounds very fishy to me.

    Blender is open source, and both the Nvidia and AMD teams are working on it to optimize for their products as best they can, so it's fair ground for comparison.

    If you go to http://download.blender.org/institute/benchmark/latest_snapshot.html you can see their latest benchmark results.
     
  12. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    33,927
    Likes Received:
    2,929
    GPU:
    AMD | NVIDIA
    Hi DSC2037,

    It was added based on user requests here in the forums. One person wants this, another wants that; I cannot do it all, so I simply picked V-Ray based on an audience request. Also, GPGPU is of limited interest to this reader base in general, ergo I do not want to spend heaps of time on testing it.

    I had a discussion yesterday about the software, as I was uncertain which card uses which render path in V-Ray. As it turns out, NV cards will use CUDA whenever it is available, and otherwise drop back to OpenCL.

    I learned that OpenCL is the standard code path for AMD Radeon cards (which are still missing at this time due to time constraints). So including measurements (even bad ones) actually would be a good thing, so that AMD can put more work into that; trust me, when we post it, they will notice and look into it.
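    The selection logic described above, sketched in Python (the function name and return values are mine for illustration, not V-Ray's actual API):

    ```python
    # Hypothetical sketch of the render-path selection described above:
    # prefer CUDA when the driver exposes it, otherwise fall back to OpenCL,
    # and as a last resort render on the CPU. Not V-Ray's real API.

    def pick_render_path(available_apis):
        for api in ("CUDA", "OpenCL"):
            if api in available_apis:
                return api
        return "CPU"

    print(pick_render_path({"CUDA", "OpenCL"}))  # NVIDIA card -> CUDA
    print(pick_render_path({"OpenCL"}))          # AMD Radeon  -> OpenCL
    print(pick_render_path(set()))               # no GPU API  -> CPU
    ```
    
    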

    I've been made aware of Blender, but still need to look into it.
     
    yasamoka and Killian38 like this.
  13. jura11

    jura11 Ancient Guru

    Messages:
    1,755
    Likes Received:
    144
    GPU:
    GTX1080 Ti 1080 FE
    Hi there

    I'm not sure if the V-Ray benchmark uses OpenCL. I know their V-Ray RT render engine does use OpenCL, and therefore you can use AMD or any GPU which supports OpenCL; I don't have any AMD GPU at home that I could test in V-Ray.

    The V-Ray benchmark is great for testing multi-core/multi-threaded CPUs like Threadripper or its Intel counterparts, as it can use all available cores/threads. Plus, with CPU rendering you are not limited by VRAM and CUDA cores but by your RAM and CPU.

    V-Ray is still the most popular renderer for archviz when it comes to CPU rendering; Corona and AMD ProRender are catching up quite quickly in terms of render speed and quality, plus they're cheaper or free.

    Regarding Blender: OpenCL is usually slower than Nvidia CUDA, and there are a few limitations on the use of OpenCL, so whether Blender is fair ground to compare AMD vs Nvidia is hard to say. I use Blender, but mostly with Cycles and CUDA; in the past I have used AMD with OpenCL, and Nvidia with OpenCL renderers as well.

    LuxMark has mostly been used for testing or benchmarking GPU rendering performance, but CUDA-based renderers like Octane, Maxwell, Iray, SuperFly, etc. are now used more and more.

    Plus, Nvidia's OpenCL performance is not the best if you compare a GTX 1080 vs a Vega 64 in LuxMark or other OpenCL renderers.

    A few years back I used only AMD GPUs for OpenCL rendering and work, as their OpenCL performance was, and still is, the best. The newer Turing generation of GPUs has somewhat better OpenCL performance, but AMD still hasn't released any new GPU this year, and the Vega 64 is a bit old for comparison with the Nvidia RTX range.

    Hope this helps

    Thanks, Jura
     
  14. jura11

    jura11 Ancient Guru

    Messages:
    1,755
    Likes Received:
    144
    GPU:
    GTX1080 Ti 1080 FE
    Hi @Hilbert Hagedoorn

    Corona and V-Ray are standards in archviz rendering and are used more and more in movies and other work. I agree with you that Guru3D readers have limited interest in GPGPU benchmarks, but they are still good to have included, as both can be run on a CPU and both can utilise all cores/threads of Threadripper or its Intel counterparts.

    In theory you could try disabling CUDA in the Nvidia Control Panel and test whether the V-Ray benchmark would then start with the OpenCL fallback.

    Yup, AMD uses OpenCL, as there is no CUDA cross-compiler which would allow AMD cards to run CUDA apps. Otoy promised to bring such an AMD CUDA cross-compiler, but it looks like they will not be delivering it anytime soon.

    Regarding OpenCL performance, you can try LuxMark for fun with a GTX 1080 vs a Vega 64, and you will find the Vega 64 is a bit faster. I'm not sure how fast RTX is there, but I would suspect it is a lot faster than the older Pascal or Maxwell generations.

    I would include Blender, as it, like V-Ray or Corona, will use all available cores/threads in rendering or benchmarking. Plus, you can test both CUDA and OpenCL performance there, or you can use AMD ProRender, which is an OpenCL renderer, so you don't need to switch anything.

    Blender is used more and more in game development, archviz, and general rendering, simply because it's free and offers great features you would otherwise find in 3DS Max or Cinema4D, which cost crazy money.

    Hope this helps

    Thanks, Jura
     
  15. devastator

    devastator Active Member

    Messages:
    78
    Likes Received:
    18
    GPU:
    rtx 2070
    That's funny: RTX = Rushed To eXpensive :)
     
