High DX11 CPU overhead, very low performance.

Discussion in 'Videocards - AMD Radeon Drivers Section' started by PrMinisterGR, May 4, 2015.

  1. haz_mat

    haz_mat Guest

    Messages:
    243
    Likes Received:
    1
    GPU:
    1070 FE
    I agree with the sentiment that AMD's DX11 driver needs work. That being said, I am still pretty happy with my 290x but I definitely think it could be better.

    You can see it in a lot of performance graphs compared against similar nVidia cards. A given AMD GPU will be churning along, keeping relative parity with the comparable nVidia GPU, then some change in the scene occurs and the AMD card takes a hit to its framerate while the nVidia card remains mostly unaffected. Sometimes it's the opposite: a scene change occurs and the nVidia card sees a noticeable gain in framerate while the AMD card does not.

    The following graphs are all pulled from HardOCP's recent performance review on GTAV.

    Exhibit A - Note the plot for the 280x from time index 190-210 and 360:
    Link to HardOCP graph

    Exhibit B - Note the plot for the 290x around times 185 and 360:
    Link to HardOCP graph

    Exhibit C - Note the plots for both 290x and 290 around times 180-200 and 370-420:
    Link to HardOCP graph

    In the last graph, I find the disparity towards the end of the benchmark interesting. The nVidia cards were roughly keeping parity with the AMD cards from around time 320-350 and then snapped back to the 45-50 fps range, while the AMD cards continued to limp along at around 35 fps until they started to catch up again at the end of the benchmark. I believe something happens in the scene during those times that nVidia handles better than AMD.

    Granted, GTAV is still new and will likely see performance improvements from Rockstar in the future. However, this title was quite polished at launch by most standards these days, so I think these benches still have some credibility. This sort of pattern can be seen in other games as well; these were just fresh on my mind, so I posted them. I would encourage anyone to spot this pattern in other benches and share their findings.
     
    Last edited: May 4, 2015
  2. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    Command lists weren't supported because they were and still are useless for the majority of use cases. Civilization V was the exception, not the rule.

    It would indeed be a surprise if it turns out that Nvidia's lower overhead has to do with the drivers supporting a feature that almost no game uses.

    DX12 by itself is no magic pill, but many engines have already announced support for it. Unity supporting DX12 will make it relatively easy for a lot of indie games to support it.
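    For context, "command lists" here means the D3D11 deferred-context path, where worker threads record draw calls that the main thread replays later. A minimal sketch of that API, assuming the device and pipeline state already exist elsewhere:

    ```cpp
    // Sketch of D3D11 command-list recording (the optional driver feature
    // being discussed). Assumes `device` and `immediateCtx` were created
    // elsewhere; error handling is trimmed for brevity.
    #include <d3d11.h>

    void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediateCtx)
    {
        // A deferred context records commands on a worker thread
        // instead of submitting them to the GPU right away.
        ID3D11DeviceContext* deferredCtx = nullptr;
        device->CreateDeferredContext(0, &deferredCtx);

        // Record some work (state setup + draws) into the deferred context.
        deferredCtx->Draw(3, 0);

        // Close the recording into an immutable command list.
        ID3D11CommandList* cmdList = nullptr;
        deferredCtx->FinishCommandList(FALSE, &cmdList);

        // The main thread replays it on the immediate context. Whether this
        // actually runs faster depends on the driver, which is the point of
        // the debate: NVIDIA accelerated this path, AMD found it didn't pay off.
        immediateCtx->ExecuteCommandList(cmdList, FALSE);

        cmdList->Release();
        deferredCtx->Release();
    }
    ```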
     
  3. macmac9

    macmac9 Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    Asus 280X
    It doesn't start if I rename the exe; I just get a Steam notification about a missing executable. (Yes, I'm launching the exe directly, not from Steam.)
    I got a similar boost in the 3DMark overhead test, though: DX11 single-threaded went from around 20k draw calls per frame to around 30k per frame (roughly +50%) by swapping the 15.4 drivers for the WDDM 2.0 ones.
    Any other games known to suffer a lot from high draw call cost?
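    For anyone unfamiliar with what that overhead test counts: it issues trivial draws until frame times blow out, so the score is almost pure driver CPU cost. A rough sketch of the submission loop (not Futuremark's actual code; a valid D3D11 context is assumed):

    ```cpp
    // Roughly the kind of loop an API overhead test runs: many tiny draws,
    // negligible GPU work, so throughput is bounded by driver CPU cost.
    // Not Futuremark's actual implementation; `ctx` is assumed valid.
    #include <d3d11.h>

    void SubmitOverheadFrame(ID3D11DeviceContext* ctx, UINT drawsPerFrame)
    {
        for (UINT i = 0; i < drawsPerFrame; ++i)
        {
            // Each Draw() makes the driver validate state and emit GPU
            // commands; a 3-vertex draw keeps the GPU side near-free.
            ctx->Draw(3, 0);
        }
        // The test raises drawsPerFrame until the frame rate drops below a
        // target, and reports the highest sustained draw calls per frame.
    }
    ```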
     
  4. Garwinski

    Garwinski Member Guru

    Messages:
    185
    Likes Received:
    4
    GPU:
    XFX Fury X
    Assassin's Creed Unity? Probably Dying Light as well.
     

  5. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    What are you doing? You cannot compare two cards of different architectures that way; it's completely wrong. Look at Maxwell: the GTX 980 has 2048 shader processors while the 780 Ti has 2880, yet clock-for-clock they're pretty much equal. The 750 Ti is Maxwell too, and is subject to the same analysis.

    Also, please forget about GPUBoss. It's just about completely useless.

    You can compare the 960, 970 and 980 together, or the 680, 780 and 780 Ti, but you cannot compare GCN and Maxwell like that...
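    To put numbers on that: peak throughput scales with units × ops per unit per clock × clock speed, so unit counts alone say little across architectures. A back-of-the-envelope sketch (reference base clocks, FMA counted as 2 ops):

    ```cpp
    // Back-of-the-envelope peak-FLOPs math showing why raw shader counts
    // don't compare across architectures. Clocks are reference base clocks.
    #include <cstdio>

    int main()
    {
        // peak GFLOPs ~= shader units * 2 (fused multiply-add) * clock in GHz
        double gtx980   = 2048 * 2 * 1.126; // ~4.6 TFLOPs
        double gtx780ti = 2880 * 2 * 0.875; // ~5.0 TFLOPs
        std::printf("GTX 980:    ~%.1f TFLOPs\n", gtx980 / 1000.0);
        std::printf("GTX 780 Ti: ~%.1f TFLOPs\n", gtx780ti / 1000.0);
        // Despite the 780 Ti's higher paper FLOPs, real-game performance is
        // roughly a wash, which is exactly the point about architectures.
        return 0;
    }
    ```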
     
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Digital Foundry has actually isolated the issue with specific games and settings, as shown in the original post.



    Read the numbers. I'm not even posting framerates; I am posting percentage losses depending on the CPU. The AMD cards suffer such huge percentage losses that gameplay with a much beefier GPU (R9 270X) ends up worse than with an NVIDIA GPU that has half the hardware.
    In their GTA V article they even state that the GTX 750 Ti provides a smoother experience than the R9 280 when paired with a weaker CPU. Hardware-wise, the R9 280 is almost triple the card the 750 Ti is.

    The R9 280 is slower than a GTX 750 Ti when paired with an i3. That's the bottom line. Is that "normal"? "OK"?
    It is nice to see a poster with two posts log in to claim that we provide "Russian-style propaganda" without providing a single shred of evidence himself.
     
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    The R9 270X gets close to double the framerate of the GTX 750 Ti when paired with a stronger CPU. This fits the relative hardware each one carries (double the ROPs, double the memory bus width, more than double the shading units, etc.).
    Also, they are more or less in the same price range, with the R9 270X being the more expensive card.

    The issue is that if you have a lower-end system and you get an AMD card because you believe you at least have the better GPU, the driver is going to drive your performance into the ground, whereas if you had gotten the (much weaker, hardware-wise) NVIDIA card, your performance would be consistent.

    AMD cards lose much more performance when paired with lower-end CPUs than NVIDIA cards do. You get less than what you pay for if you have a weaker CPU.

    As for GPUBoss, it was just the quickest way to put specs side by side. As far as I could see, the numbers are correct.
     
  8. Despoiler

    Despoiler Master Guru

    Messages:
    244
    Likes Received:
    0
    GPU:
    Sapphire TRI-X R9 Fury
    DX12 actually is the magic bullet. Drivers are not supposed to do multithreading on behalf of the game and the API, which is what happens with DX11's optional features. AMD already explored those features, got negative scaling, and that's why they aren't supported. AMD is already far, far ahead of Nvidia in DX12 performance, because DX12 is largely Mantle.

    http://www.anandtech.com/show/9112/exploring-dx12-3dmark-api-overhead-feature-test/3
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    1) Vulkan is largely Mantle, not DX12.

    2) NVIDIA is getting positive scaling, so much so that they have (apparently) adapted their driver scheduler around it, and they get almost double the efficiency, judging by their DX11 draw-call numbers.
    [image]

    DX12 is not the magic bullet. The absolute entirety of the PC gaming catalogue is DX9-DX11, and popular DX9 games are still coming out now. Even Microsoft is launching DX11.3 alongside DX12, because, as they have literally said, DX12 demands a lot from game programmers/companies.
    DX11 is not going anywhere, and it should be properly supported.
     
  10. DiceAir

    DiceAir Maha Guru

    Messages:
    1,369
    Likes Received:
    15
    GPU:
    Galax 980 ti HOF
    I have a feeling that when DX12 comes out, AMD might be on top of the charts.
     

  11. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Where is the documentation for DX11.3? I've heard about it, but not from Microsoft directly (publicly).
    I thought it was meant to bridge the gap for the GPUs that don't support DX12 but can continue being supported through a newer API upgrade, DX11.3.
    Once developers get to play with DX12 and see what it's capable of, I'm sure we'll see a quick migration to it. In fact, it doesn't matter what game developers think; it's what the engine developers decide to do that counts.
    If the game engines are backwards compatible (DX12/11.3/11/10/9) then sure, let them all be supported, but that sounds like a headache to me.
    A lot of the new AAA games are 64-bit AND DX10/11. DX9 is dying off too slowly, in my opinion.
     
  12. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,384
    Likes Received:
    387
    GPU:
    RTX 3090
    Isn't it a fact that, as of now, DX12 is only FULLY supported by the GTX 970 and 980, with the R9 300 series also going to support it?

    I know the R9 200 series and the HD 7000 series will be compatible with most DX12 features, but if a game is exclusively DX12, will it only run on a full-DX12 card?

    Thus DirectX 11 will be used even more once DirectX 12 is released: for at least the next 2 to 3 years, every DX12 game must be backward compatible with DX11 as well, because while everyone will have DX12 with Windows 10, not everyone will have a fully DX12-ready card. Isn't that correct? Aren't the ONLY two GPUs, sorry, three, I forgot the GTX 960 (plus the 970 & 980), the only cards FULLY compatible with DX12 and ALL its features?

    Yes, 90% of gamers or more will probably install Windows 10 and have DX12. But 60%+ of gamers won't be upgrading their current hardware, and 50% probably won't upgrade for another 2 years. Thus every DX12 game must be backward compatible with DX11 cards anyway.

    I agree on one point, though: DX9 did take its sweet a** time to die out (it finally is dying out).
     
  13. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    I don't think anyone really knows right now what difference 11.3 and 12 will make; even the Futuremark developers who created the CPU overhead test don't have a clue about what 12 or 11.3 will actually mean for us.
     
  14. semitope

    semitope Guest

    Messages:
    36
    Likes Received:
    0
    GPU:
    iGPU
    Maxwell doesn't fully support DX12; the R9 300 series will be the first with full support, AFAIK. Still, earlier GPUs will support some DX12 features, probably down to the 7000 series from AMD.
     
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    It doesn't matter. We're not even talking about framerate, but about performance scaling depending on the CPU used.

    I can't seem to find any (and I can't for DX12, either). I don't know how Microsoft handles their SDK. From the announcement, it seems that 11.3 gets some of the extra rendering capabilities of 12, but remains a high-level API designed for people who don't want to spend time on the rendering part of their game.
     

  16. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Last edited: May 4, 2015
  17. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    If you compare the relative loss in performance (not the absolute FPS), the GeForce loses 56.35% of its speed going from DX12 to DX11, while the Radeon loses 77.64% of its speed.
    I've read that article; it's quite eye-opening.
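    For anyone who wants to check the arithmetic, the relative loss is just 1 minus the DX11/DX12 throughput ratio. A quick sketch (the draw-call rates below are made-up placeholders; plug in the article's numbers for a given card):

    ```cpp
    // Relative-loss arithmetic used above: how much draw-call throughput
    // disappears when falling back from DX12 to DX11. The two inputs are
    // hypothetical placeholders, not the article's actual results.
    #include <cstdio>

    int main()
    {
        double dx12CallsPerSec = 12'000'000.0; // placeholder DX12 result
        double dx11CallsPerSec =  2'500'000.0; // placeholder DX11 result

        double lossPercent = (1.0 - dx11CallsPerSec / dx12CallsPerSec) * 100.0;
        std::printf("DX12 -> DX11 loss: %.2f%%\n", lossPercent); // ~79.17%
        return 0;
    }
    ```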

    EDIT: For the people asking whether cards will need DX12-specific hardware: it seems that some features of DX12 require newer hardware than any card out at this moment, but it won't really matter.
    The consoles are GCN, and there is that. Also, the efficiency gains are there for all hardware, and in the surprise of the day, the original GCN hardware might be just fine for some DX12 features (as DX11.2 is already supported).
     
    Last edited: May 4, 2015
  18. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    What do you not see? At a CPU-bound resolution, the equivalent nvidia cards are outperforming the AMD cards by 10-30 frames! If you wish to troll, please do it somewhere else. I don't wish to waste my time arguing with a fanboy. Fanboys tend to be blind, and you don't seem to be any different.

    I updated my post, please do check!
     
    Last edited: May 4, 2015
  19. Krteq

    Krteq Maha Guru

    Messages:
    1,129
    Likes Received:
    764
    GPU:
    MSI RTX 3080 WC
    This should explain a lot. Some of those features are used in DX11.3 (Feature Level 11_3) too.
    [image]

    //edit: This one is about async shader support.
    [image]
     
    Last edited: May 4, 2015
  20. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Please show me benchmarks where the 270X scores double the framerates of the 750Ti.

    As I said, you cannot compare shader units directly. One shader unit might perform 3 operations of type x per clock cycle, while one from another architecture might perform 4 instead.
     
    Last edited: May 4, 2015
