High DX11 CPU overhead, very low performance.

Discussion in 'Videocards - AMD Radeon Drivers Section' started by PrMinisterGR, May 4, 2015.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I didn't notice all that. Just the VRAM usage and the GPU usage in Dirt (or whatever Codemasters game that was). I'm much more sceptical now, although his numbers in the specific videos (temps etc) look ok.
     
  2. oGow89

    oGow89 Guest

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    Well, you can't really bull**** a video that easily unless you are working with the CIA. (Oops, now I am a conspiracy theorist.) :D

    I did get angry because of the R9 390/X videos, but I usually find his benchmarks better than the others' because they show the real-time FPS and you can see how the cards perform in different scenarios, unlike those benchmarks that only show you some FPS graphs.

    Digital Foundry is also good, but the fact that they usually run the reference R9 290/X in their tests is what pisses me off, since they know those cards throttle and perform terribly compared to the non-reference designs, which in turn gives NVIDIA cards an even clearer advantage.
     
  3. SMS PC Lead

    SMS PC Lead Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    Titan X SLI
    You are more than welcome to PM me (user "SMS PC Lead") over at the official forums - http://forum.projectcarsgame.com ...
     
  4. SMS PC Lead

    SMS PC Lead Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    Titan X SLI
    That video demonstrates the bottleneck nicely - the Fury X with the single car (very few API calls) at the beginning of the video is at 99% - then at 3:09 it reaches 91% on a corner with very little in view (few API calls again).

    The game there (like ours) has a certain overhead in that the off-screen things like shadows, the rear-view mirror, envmaps and whatever else they render contribute a constant(ish) amount of API calls - let's say 4000-5000 draw calls. Now if the (single-threaded) CPU usage of the driver thread reaches 100% at 6000 draw calls, then *any* further calls above and beyond this level will cause the GPU utilisation to drop, because the GPU is consuming the calls faster than the CPU can provide them, leading to 'gaps' ("bubbles" of idle time) in the GPU's time-line.

    The solution to this is of course to make the driver thread's processing of API calls faster... (or to make some of the operating system functions which the driver calls faster, by optimisation or by improving the underlying architectural model to reduce overhead, as the increments in WDDM generally do)
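
    A rough sketch of that arithmetic in Python (the throughput figures below are illustrative assumptions, not numbers from any game or driver):

    Code:
    # Toy model: a single-threaded driver submits draw calls at a fixed rate,
    # while the GPU could finish the same frame's work faster. Once the call
    # count pushes the driver past the GPU's frame time, the surplus shows up
    # as GPU idle "bubbles" and the reported GPU usage drops.
    DRIVER_CALLS_PER_SEC = 6000 * 60   # assumed: driver thread saturates at 6000 calls/frame at 60 fps
    GPU_FRAME_TIME = 1 / 90            # assumed: the GPU alone could render this scene at 90 fps

    def gpu_utilisation(draw_calls_per_frame):
        cpu_time = draw_calls_per_frame / DRIVER_CALLS_PER_SEC  # driver-thread submission cost
        frame_time = max(cpu_time, GPU_FRAME_TIME)              # the slower side paces the frame
        return GPU_FRAME_TIME / frame_time                      # the rest of the frame is idle GPU time

    for calls in (4000, 6000, 8000, 10000):
        fps = 1 / max(calls / DRIVER_CALLS_PER_SEC, GPU_FRAME_TIME)
        print(f"{calls:>6} calls/frame -> GPU busy {gpu_utilisation(calls):.0%}, ~{fps:.0f} fps")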
     
    Last edited: Jul 7, 2015

  5. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    We don't know the monitoring polling rate; if they use a high default one, we are unable to see micro-stutter reflected in the OSD numbers.

    E.g. in Afterburner it's very different to set the monitoring polling rate to 500 ms than to use the default 2500 ms. Take a look at the old threads about GTA V and CrossFire: when the polling rate was lowered to 1000 or 500 ms, the posted AB graphs showed crazy variance in GPU usage, frametimes and framerates.

    Very fast and very pronounced swings in GPU usage and, obviously, in the related FPS... micro-stutter.

    A high polling interval didn't show that as clearly, or at all.
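
    A small Python sketch of that masking effect (the frame times are invented, just to show how a coarse polling window averages the hitches away):

    Code:
    # Mostly 16.7 ms frames with occasional 100 ms hitches: obvious micro-stutter per frame.
    import random, statistics

    random.seed(1)
    frames_ms = [100.0 if random.random() < 0.03 else 16.7 for _ in range(5000)]

    def poll(samples_ms, interval_ms):
        """Average the per-frame times over fixed polling windows, as an OSD graph does."""
        points, bucket, elapsed = [], [], 0.0
        for ft in samples_ms:
            bucket.append(ft)
            elapsed += ft
            if elapsed >= interval_ms:
                points.append(statistics.mean(bucket))
                bucket, elapsed = [], 0.0
        return points

    for interval in (100, 500, 2500):
        pts = poll(frames_ms, interval)
        print(f"{interval:>5} ms polling: worst point {max(pts):5.1f} ms, "
              f"spread {statistics.pstdev(pts):4.1f} ms")
    # The spikes stand out at 100/500 ms but almost vanish at the 2500 ms default.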
     
  6. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    So that means the game stutters when the driver thread is maxed out? Is that connected to GPU usage?
     
  7. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    The GPU lowers its usage while waiting for the DX11 bottleneck to be resolved.

    Idle time.
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Actually, you can easily bullsh*t a video. If NVIDIA records at 30 fps but merges the 2~3 adjacent frames rendered during the 33 ms period belonging to each final video frame, you get a motion-blur-like effect and smooth playback.
    But if you compose the video only from the last frame rendered before each 33 ms mark, you will always get stutter.

    I can always see stutter at 60 Hz with any graphics card unless motion blur is present. That is why I have a 120 Hz screen, and on that it was always OK, even with the HD 7970.

    And by the way, both sides in those videos have stutter; at times it is just more noticeable. If the video were captured at 120 fps, you would see much less stutter.

    Then there are mathematical models which let you record a stuttering game and still get a butter-smooth video. They are based on recording the time between each frame rendered by the GPU and then shortening or lengthening the motion vectors in the video according to how early or late each frame was delivered.
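
    A minimal sketch of the two composition methods described above, assuming a list of rendered frames and their completion timestamps (both hypothetical here), and assuming the game renders faster than 30 fps:

    Code:
    import numpy as np

    VIDEO_DT = 1 / 30  # one video frame every ~33 ms

    def compose_video(rendered, timestamps, blend=True):
        """rendered: HxWx3 float arrays; timestamps: seconds at which each frame finished.
        Assumes the game renders faster than 30 fps, so every slot gets at least one frame."""
        video, bucket, slot_end = [], [], VIDEO_DT
        for frame, t in zip(rendered, timestamps):
            if t > slot_end:                                   # the 33 ms slot is over: emit a video frame
                if blend:
                    video.append(np.mean(bucket, axis=0))      # merge every frame in the slot -> motion-blur-like smoothness
                else:
                    video.append(bucket[-1])                   # keep only the last frame -> stutter is preserved
                bucket, slot_end = [], slot_end + VIDEO_DT
            bucket.append(frame)
        return video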
     
  9. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
  10. MatrixNetrunner

    MatrixNetrunner Guest

    Messages:
    125
    Likes Received:
    0
    GPU:
    Powercolor PCS+ R9 270X
    I think that PCPer did a comparison of FRAPS frametimes and FCAT frametimes, and came to the conclusion that there is almost no difference on single cards, but a significant difference with SLI/Crossfire. For our amateurish purposes, FCAT is overkill.

    BTW great link. :thumbup:
    I was looking for something like that to do some more testing.
     

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That fits with the best performance tip I have found myself for your game: lower track detail to medium and turn off the grass. Can you tell us what your internal performance profiler shows the bottleneck to be? We've been after it here for almost 7 months now :p

    Shouldn't that mean that at those moments, the driver thread hits 100% CPU usage?

    What is your opinion of what's happening with the 2nd AMD driver fork? There are marked improvements in Windows 8.1, and they seem to be much less pronounced in Windows 7. Is there any chance for a driver that already supports WDDM 1.3 to start supporting it "better" (and hence the improvement only on Windows 8.1 and up)?

    You don't need any extra app. You just need Excel and really basic math skills. FRAPS gives you everything.
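
    For what it's worth, the same basic math works in a few lines of Python on a FRAPS frametimes log (two columns: frame index and a cumulative timestamp in ms; the file name below is just a placeholder):

    Code:
    import csv, statistics

    def frametime_stats(path):
        with open(path, newline="") as f:
            rows = list(csv.reader(f))[1:]                 # skip the header row
        stamps = [float(r[1]) for r in rows]               # cumulative timestamps in ms
        fts = [b - a for a, b in zip(stamps, stamps[1:])]  # per-frame times in ms
        fts.sort()
        return {
            "avg fps": 1000 * len(fts) / (stamps[-1] - stamps[0]),
            "avg frametime (ms)": statistics.mean(fts),
            "99th percentile (ms)": fts[int(0.99 * (len(fts) - 1))],
            "worst frametime (ms)": fts[-1],
        }

    # print(frametime_stats("frametimes.csv"))   # placeholder path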
     
  12. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    For an Excel/PowerPoint noob, the app posted is a dream FRAPS companion for showing graphs from the log files.

    :)
     
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Sorry if I sounded like a douche; it's just that I prefer to have the numbers on hand in general.

    EDIT: I'm keeping the app's page; it seems to make some things much faster. I was an idiot.
     
  14. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    So much wasted GPU power under the hood.
    It's probably exacerbated by using "only" a 4.4 GHz Sandy Bridge.
     
  15. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    The Fury X is underutilized even with an i7-4790K @ 5.0 GHz :D at 1080p.
    There is just no processor that can feed this card with one thread at lower resolutions.

    It can easily be demonstrated on my R9 290X too :-/

    That's why the Fury X is fully comparable to the 980 Ti at 4K but loses at 1080p.
     

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I guess you are being sarcastic about the performance of that specific CPU. If you are, we are on the same page. I have a 2600K @ 4.5 GHz, and even Skylake makes no sense for me for desktop usage. This one even overclocks better, and the Skylake parts have to run at 4.2 GHz+ to get the same performance.

    Well, NVIDIA is doing it somehow. Look at the utilization of the freakin' Titan X. It's like it's stuck at 97-99%.
     
  17. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    The funny thing is that, because of this, NVIDIA cards perform better on AMD CPUs...
    AMD just needs the DX12 era to come as quickly as possible, as GCN is impotent under DX11.
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    GCN seems to be fine in the consoles, giving really nice performance with laughable CPUs. If anything, it is the DX11 driver that is impotent. And no, DX12 will not negate all the games written in DX11 (and the ones that will continue to be written in it).
     
  19. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    AMD insists on adding more power to the "engine" of its "car" (the GPU), but it keeps the same "monkey" (the driver) behind the "wheel".

    NVIDIA has a lower-performing "engine" in its "car", but its "driver" is an actual driver, not a monkey.
    NVIDIA still wins, even if by a smaller margin.

    The DX11 AMD driver's performance is not magically solved or improved by adding more raw power to the GPU.

    In fact, this lack of DX11 performance is even more evident (more time spent at low GPU usage) at lower resolutions (below 4K), where the GPU core ends up waiting longer precisely because of the increased GPU power and bus speed.
     
  20. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    Yep, the card's computing performance is good, and you can see the 290X keeping up with the 980 in games with Mantle, which is amazing for a 2-year-old card (or in very GPU-intensive games like Crysis 3 or The Witcher 3).

    But for games like Project CARS, or anything from Blizzard, it just fails.
     
