Article: GPU Compute render performance benchmarked with 20 graphics cards

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 27, 2020.

  1. JarJarBinks

    JarJarBinks New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    PNY Quadro P4000
Why no Quadros?
     
  2. sykozis

    sykozis Ancient Guru

    Messages:
    21,799
    Likes Received:
    1,056
    GPU:
    MSI RX5700
    HBM isn't owned by AMD. It was developed with SK Hynix. AMD has no control over who uses it.

FreeSync is just the name AMD gave to their implementation of Variable Refresh Rate. VESA calls it Adaptive-Sync, which was adopted by VESA and added to the DP1.2a standard. If you want to get technical, NVidia doesn't support AMD's FreeSync, nor any FreeSync monitor. NVidia is supporting VESA's DP1.2a interface standard.

    AMD doesn't benefit directly from Adaptive-Sync. Conversely, NVidia does benefit directly from CUDA.
     
  3. Kaarme

    Kaarme Ancient Guru

    Messages:
    2,356
    Likes Received:
    984
    GPU:
    Sapphire 390
AMD isn't a memory manufacturer. They needed one to actually get things done outside of a laboratory. I never said they have control. If they did, would Nvidia be using it? However, Nvidia seems to have no compunctions about using something AMD developed.

Reread what I wrote. Of course they both benefit from adaptive sync. It's a recognised technology valued by lots of gamers, and gamers need good GPUs. The huge pool of adaptive sync screens would be a small one without AMD, because Nvidia wants you to pay 100 bucks extra for the small module inside the screen, on top of the other technology the screen must contain (like a sufficiently good panel), while AMD doesn't. Consequently, if a gamer wanted adaptive sync, they had to either pay significantly more for an Nvidia video card plus an expensive G-Sync screen, or less for an AMD GPU plus any random adaptive sync screen. Since AMD GPUs have been less desirable for a while now, the lower price of the whole adaptive sync package would have helped compensate. Now that Nvidia finally allows a non-G-Sync screen to be used for adaptive sync as well, people can go the Nvidia way without paying as much.
     
  4. xg-ei8ht

    xg-ei8ht Ancient Guru

    Messages:
    1,803
    Likes Received:
    20
    GPU:
    1gb 6870 Sapphire
    Hi again.

I'm using an RX470 and the setting is there (the Compute option under GPU Workload in AMD's settings).

    Not sure about Vega though.

If you go to the cog on the right side, click it, then click Graphics, scroll down to Advanced and click that; it should be there under GPU Workload.

    Worth a look.
     

  5. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,700
    Likes Received:
    1,292
    GPU:
    Aorus 3090 Xtreme
    There is less chance he will miss your post if you reply to him.
     
  6. sykozis

    sykozis Ancient Guru

    Messages:
    21,799
    Likes Received:
    1,056
    GPU:
    MSI RX5700
    That setting isn't available to everyone.

    I have an RX5700 and the setting doesn't exist for me.

    It's quite possible that the setting only exists for certain GPUs....
     
  7. Denial

    Denial Ancient Guru

    Messages:
    13,326
    Likes Received:
    2,827
    GPU:
    EVGA RTX 3080
There are dozens of technologies Nvidia shared and contributed that AMD utilizes. There are also examples of technologies that AMD/ATi developed that Nvidia couldn't use. I don't think pointing the finger at one and saying it's worse means much when you start stacking everything up.
     
  8. jura11

    jura11 Ancient Guru

    Messages:
    2,519
    Likes Received:
    584
    GPU:
    GTX1080 Ti 1080 FE

    Hi there

Unity targets a different audience than Blender, V-Ray, Indigo Renderer and other renderers. As you know, Unity is mostly used for game development, although it can also be used for archviz or product visualisation, etc.

There is no standard test scene for Unity that most reviewers could use, and the same applies to Unreal.

Not sure about the GTX 1080 test; I can only comment on my own tests with the Blender Classroom scene.

In Blender Cycles, my Asus RTX 2080Ti Strix renders it in 75 seconds with OptiX and in 2 minutes 19 seconds without OptiX, while 4 GPUs (RTX 2080Ti Strix, GTX1080 Ti and two GTX1080s) finish it in 42-45 seconds without OptiX.

In E-Cycles, the same scene renders in 1 minute 44 seconds on the RTX 2080Ti Strix without OptiX, in 32-36 seconds with the 4 GPUs, and in 1 minute 29 seconds with just the two GTX1080s.
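If anyone wants to reproduce this kind of timing themselves, a rough Python sketch like the one below runs the Classroom scene headless and times it per Cycles device. It's only a sketch: the classroom.blend path is a placeholder, and it assumes a Blender build with OptiX support is on the PATH.

Code:
# Rough timing harness for Blender Cycles renders of the Classroom demo scene.
# Assumes "blender" (a build with OptiX support) is on the PATH; the .blend
# path below is a placeholder.
import subprocess
import time

BLEND_FILE = "classroom.blend"  # placeholder path to the demo scene

def time_render(device):
    """Render frame 1 headless on the given Cycles device, return seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["blender", "--background", BLEND_FILE,
         "--engine", "CYCLES",
         "--render-output", "/tmp/classroom_",
         "--render-frame", "1",
         "--", "--cycles-device", device],  # e.g. CUDA or OPTIX
        check=True,
    )
    return time.perf_counter() - start

for device in ("CUDA", "OPTIX"):
    print(device, round(time_render(device), 1), "seconds")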

    Hope this helps

    Thanks, Jura
     
  9. haste

    haste Maha Guru

    Messages:
    1,204
    Likes Received:
    368
    GPU:
    GTX 1080 @ 2.1GHz
    A lot of indie developers, who use Blender for example, also use Unity. All of these are just dev tools with GPU accelerated lighting. There is no reason to exclude Unity from the comparison.

Benchmarking Unity's GPU lightmapper is not hard. Just download a free scene (or create a simple one from boxes) and make sure the lighting bake takes at least 30 seconds. In my experience the internal lighting statistics are pretty good for benchmarking. My GTX 1080's peak performance is always at 280 Mrays/s ± 2 Mrays/s.
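If you want to take the Editor UI out of the loop and just compare wall-clock bake times between cards, a rough wrapper like the sketch below works. The -batchmode, -quit, -projectPath and -executeMethod switches are Unity's standard command-line arguments, but BenchmarkTools.BakeLighting is a hypothetical editor method you would have to add to the project yourself (e.g. a static method that calls Lightmapping.Bake() and exits), and both paths are placeholders.

Code:
# Rough wall-clock timing of a Unity lightmap bake in batch mode.
# -batchmode, -quit, -projectPath and -executeMethod are standard Unity CLI
# arguments; BenchmarkTools.BakeLighting is a HYPOTHETICAL editor method you
# would add to the project (e.g. a static method calling Lightmapping.Bake()).
import subprocess
import time

UNITY = r"C:\Program Files\Unity\Hub\Editor\2019.4.40f1\Editor\Unity.exe"  # placeholder
PROJECT = r"C:\Projects\LightmapBenchmark"  # placeholder project path

start = time.perf_counter()
subprocess.run(
    [UNITY, "-batchmode", "-quit",
     "-projectPath", PROJECT,
     "-executeMethod", "BenchmarkTools.BakeLighting"],  # hypothetical helper
    check=True,
)
print("Bake finished in", round(time.perf_counter() - start, 1), "seconds")

This measures the whole run including Editor startup, so the Mrays/s figure from the internal lighting statistics is still the better number, but it's handy for quick A/B comparisons between GPUs.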
     
  10. Gomez Addams

    Gomez Addams Member Guru

    Messages:
    178
    Likes Received:
    97
    GPU:
    Titan RTX, 24GB
    I am not so sure it makes sense for Nvidia to allow anyone else to use the CUDA API because I do not know where they would see a benefit from doing so. The only thing that would happen is they could potentially lose market share. As I wrote, if Google wins their case against Oracle that becomes a moot point. Then the question becomes does AMD (or a third party) want to support CUDA on their GPUs? Conversion tools are not the same as supporting it because the conversion is not 100%. It is a one-time thing and then you have to tweak things here and there. This means, in the end, a developer has to choose a direction. There is a LOT of infrastructure involved for full support and it will be very interesting to see what AMD does.
     

  11. Kaarme

    Kaarme Ancient Guru

    Messages:
    2,356
    Likes Received:
    984
    GPU:
    Sapphire 390
Sure. I wasn't really writing the comment from such a rational point of view. AMD could have kept HBM under a heavy protection and licensing scheme, but it's possible it would never have gone anywhere like that, becoming another RDRAM. Hynix and others probably wouldn't have started to produce it if there wasn't enough customer potential (which AMD alone doesn't provide). Nvidia could only go for its closed technologies because of its highly dominant position, and even so there seemed to be a limit with adaptive sync.
     
  12. Gomez Addams

    Gomez Addams Member Guru

    Messages:
    178
    Likes Received:
    97
    GPU:
    Titan RTX, 24GB
The really big difference between the scenarios here is that HBM is a hardware thing, and those require even more infrastructure support, especially if you want affordable parts. What is rather ironic is that software is probably the key to Nvidia's success in the HPC and AI arenas at the moment, and they give it all away for free. Several reports say AMD's GPUs are superior in OpenCL and OpenACC performance, so obviously there is a little more to it.
     
  13. pharma

    pharma Ancient Guru

    Messages:
    1,690
    Likes Received:
    498
    GPU:
    Asus Strix GTX 1080
HBM was introduced by AMD and Hynix, so I doubt any heavy protection or licensing scheme could have materialized, since these companies belong to JEDEC. Though Hynix did agree to let AMD produce the first GPU with HBM in 2015. Samsung introduced HBM2 in 2016, and it was first used in Nvidia's Tesla GPUs.
     
  14. Athlonite

    Athlonite Maha Guru

    Messages:
    1,315
    Likes Received:
    36
    GPU:
    Pulse RX5700 8GB
Ah, OK. I'm using Polaris 20 (RX580); maybe Navi doesn't need this switch and just does it automatically.
     
