High DX11 CPU overhead, very low performance.

Discussion in 'Videocards - AMD Radeon Drivers Section' started by PrMinisterGR, May 4, 2015.

  1. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
  2. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    I must disagree. I run W10 TP 10076 and the latest driver, 0401.

    Crossfire works fine in this driver, which has NO DX12 support, as DXDIAG reports.

    Look at my post pics:

    http://forums.guru3d.com/showpost.php?p=5063380&postcount=103
     
  3. Valerys

    Valerys Master Guru

    Messages:
    395
    Likes Received:
    18
    GPU:
    Gigabyte RTX2080S
    GTA V was a lot worse for me on Win10 with Win10 drivers, lots of stuttering in the city and eventually crashes.
     
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Ok, for what? Now you're just arguing for the sake of arguing. No one would ever want to build a system with a high-end GPU and a low-end CPU, I'm quite sure about that. Screw driver overhead at this point.

    All those games that run like crap with a simulated dual-core need a proper quad anyway; it's even better with more threads, like you saw yourself.
    BTW, just for info: it's the same on Nvidia. I saw it with a 570GTX on a Q9450 vs a 4770K. Even an older game like COD5 WAW can use all 8 threads no problem (usually 20-40% per thread).



    Anyway, good to see they're starting to improve it further; it still needs some DX11 tweaks. I got ~2.43 million and ~2.32 million in the DX11 API tests.

    Edit: test results
    http://www.3dmark.com/3dm/6888803?
     
    Last edited: May 7, 2015
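    The API Overhead scores quoted above are easier to reason about as a per-frame budget. A quick back-of-the-envelope sketch (Python; the helper function and the 60fps target are my own illustration, not part of the 3DMark test):

    ```python
    # Back-of-the-envelope: turn a 3DMark API Overhead score
    # (draw calls per second) into a per-frame draw-call budget.
    def draw_calls_per_frame(calls_per_second: float, target_fps: float) -> float:
        """How many draw calls the driver can submit per frame at a given fps."""
        return calls_per_second / target_fps

    # The ~2.43M and ~2.32M DX11 scores quoted above, at a 60fps target:
    for score in (2_430_000, 2_320_000):
        print(f"{score:,} calls/s -> "
              f"{draw_calls_per_frame(score, 60):,.0f} calls/frame @ 60fps")
    ```

    So a DX11 score in that range works out to roughly 40k draw calls per frame at 60fps, which is why the DX12/Mantle numbers (an order of magnitude higher) matter so much for draw-heavy scenes.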

  5. Deathchild

    Deathchild Ancient Guru

    Messages:
    3,969
    Likes Received:
    2
    GPU:
    -
    Nice, that's quite a lot tj. o,O :S :D

    Indeed... who would get a high-end GPU with a low-end CPU? That doesn't work.
     
  6. grandmaster

    grandmaster Guest

    Messages:
    7
    Likes Received:
    0
    GPU:
    GTX 980/4GB
    Hi everyone!

    I'm Richard Leadbetter from Digital Foundry. I note that some of my work is being discussed on this thread, and I'd like to thank you all for following up. Lots to digest and think about, and it's heartening that AMD has made some effort to tighten up DX11 performance - which will hopefully roll out into the official 8.1 driver. I reported all my findings to them months ago and they did say that they were looking into it. Meanwhile, the DX12 results show that in the fullness of time this should not be an issue.

    A few comments are worth following up on though. Specifically that it's not an apples to apples comparison because we aren't comparing equivalent AMD and Nvidia GPUs. Well, the whole issue kicked off for us exactly because we tested equivalent GPUs and saw very, very different results on less powerful CPUs. Of course an R9 270X will hit CPU limits faster than a 750 Ti (more on that later), but the first time we saw the issue was when we were comparing R9 280 and GTX 760. Up until the 960 came out and the 760 was EOLed, these were equivalent GPUs.

    We made a video highlighting the issue, showing a dramatic loss of perf on the R9 280 with an i3 compared to an i7, while the GTX 760 pretty much holds steady.

    https://www.youtube.com/watch?v=lQzLU4HWw2U

    And it's not just Call of Duty. Check out this screenshot of performance on The Crew from Ubisoft:

    http://images.eurogamer.net/2013/articles//a/1/7/3/2/6/6/2/2.bmp.jpg

    Equivalent GPUs, same CPU and a loss of performance on AMD where we need an i5 to match performance we get from an Nvidia card on an i3. I should point out that similar to COD, this only happens in draw-intensive areas. When you're out of the city in more open areas, AMD perf is equivalent to Nvidia. But it's clear that there are issues under load.

    Next up, Far Cry 4. GTX 960 vs R9 285. Again, two equivalent GPUs. More equivalent if you like, as both are 2GB cards. Here we see that while the overall perf drop isn't nearly as bad as The Crew or COD, the experience isn't great because of the frame-time spikes. Once again the 960 on the Core i3 remains unaffected at this point in the bench. I should apologise here - the 4970K label should read 4790K (it was a long week!)

    http://images.eurogamer.net/2013/articles//a/1/7/3/2/6/6/2/3.bmp.jpg

    Generally speaking, I'd say an i3 with a GTX 960 is a viable pairing, but as much as I like the card and its amazing perf/price/VRAM advantages, I couldn't recommend pairing an i3 with an R9 280.

    But equally it should be pointed out that there will be scenarios and games where an Nvidia card will be affected too; not every title is as black and white as The Crew and COD. For example, GTA 5 on equivalent cards at max standard settings also showed issues on both Nvidia and AMD:

    http://i.imgur.com/npRJC7S.jpg

    OK, so a lot of people are asking why we are comparing GTX 750 Ti to 270X and 280 etc. Well, the data in this thread is derived from our buyer's guides. In the UK at least, the R9 270X is often on sale for just a few pounds more than a 750 Ti. Meanwhile the R9 280 offers a stratospheric leap in perf over the 750 Ti, and it's often on sale for just £25 more. As I said, the price/perf/VRAM combo offered by the 280 is very, very compelling!

    The point we are trying to make is that while it is really tempting to spend that little extra cash to get the better card (based on benchmarks taken using an i7 usually), if you're building a budget PC with a budget processor like an i3, the chances are that you'll get really inconsistent performance. The same recommendation may hold true were it a more powerful Nvidia card, but the cheapest upgrade you can get over the 750 Ti is £55 more expensive, and still only has 2GB of VRAM, so it's not really a consideration.

    It's a whole different ballgame if you have an i5, where the R9 280 seems to be a no-brainer at £130-£140, and where the driver overhead issue doesn't seem to be a problem in actual gameplay terms.

    But generally, I think the mistake the tech press (myself included) have made in the past is reviewing entry-level and mainstream GPUs using the very best CPUs money can buy, rather than testing using the processors more likely to be paired with them. We sort of assumed that driver performance would be equivalent, but as we've seen, it isn't - and it can impact the gameplay experience.
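    The frame-time point above is worth making concrete: two runs can share the same average fps while one of them stutters badly. A minimal sketch with made-up frame-time data (an illustration of the metric, not Digital Foundry's actual methodology):

    ```python
    # Why frame-time spikes matter more than average fps:
    # two runs with identical averages can feel very different in play.
    def avg_fps(frame_times_ms):
        """Average fps over a run, from per-frame times in milliseconds."""
        return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    def worst_percentile_ms(frame_times_ms, pct=99):
        """The frame time that pct% of frames beat (lower is smoother)."""
        ordered = sorted(frame_times_ms)
        idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
        return ordered[idx]

    steady = [16.7] * 100                # locked ~60fps
    spiky = [12.0] * 90 + [59.0] * 10    # same total time, big spikes

    for name, run in (("steady", steady), ("spiky", spiky)):
        print(name, round(avg_fps(run), 1), "fps avg,",
              worst_percentile_ms(run, 99), "ms 99th percentile")
    ```

    Both runs average ~59.9fps, but the spiky one spends 10% of its frames at 59ms - exactly the kind of hitching an average-fps chart hides.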
     
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Simple logic: if NVIDIA can give 50% more draw calls with half the CPU power (because that's what's happening even with the 1018.1 driver), there is huge room for improvement, especially for lower-end systems.
    See the benchmarks we have posted. Even with very fast CPUs, the driver overhead matters SO MUCH for frametimes.

    Go to the first post. The problem manifests itself on lower-end systems with i3s and Athlons/FX-4xxxs, coupled with lower-end GPUs. I by no means have a top-end GPU at this point; I have a monstrous CPU (compared to the GPU) and the damn thing still matters.
     
  8. Agonist

    Agonist Ancient Guru

    Messages:
    4,284
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
    I can sadly confirm this.
    I tested Watch Dogs with an Athlon II X4 640 @ 3.5GHz using an HD 5850 1GB and a GTX 650 Ti 1GB. Both are very close in performance.

    No shocker here, but CPU usage was lower on the 650 Ti in Watch Dogs. All four cores were around 75-80% on average, whereas CPU usage with the HD 5850 was 95% on all cores.
    The 650 Ti had GPU usage of 98% most of the time, whereas the HD 5850 was dropping down to 75% and was barely ever around 90%.

    The game was much smoother on the 650 Ti than on the HD 5850.
    And the same goes for Dying Light as well.

    Tested both games @ 1680x1050 res.

    I noticed higher CPU usage in BF4 when I was using an HD 6950 2GB a few months ago vs my 650 Ti @ 1080p. It wasn't a lot - nothing that would be noticeable on a 6-core @ 4.5GHz.
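    The usage numbers in posts like this follow a simple informal heuristic: if the GPU sits well below full utilization while the CPU cores are pegged, the bottleneck is the CPU (or driver overhead), not the GPU. A rough sketch, with hypothetical sampled utilization figures echoing the ones above (the thresholds are my own assumption, not a standard):

    ```python
    # Rough bottleneck heuristic: a GPU-bound system keeps the GPU near 100%;
    # a CPU-bound (or driver-overhead-bound) system starves the GPU while
    # the CPU is pegged. Utilization samples here are hypothetical.
    def looks_cpu_bound(gpu_util_samples, cpu_util_samples,
                        gpu_threshold=90.0, cpu_threshold=90.0):
        gpu_avg = sum(gpu_util_samples) / len(gpu_util_samples)
        cpu_avg = sum(cpu_util_samples) / len(cpu_util_samples)
        return gpu_avg < gpu_threshold and cpu_avg >= cpu_threshold

    # Numbers echoing the post: 650 Ti at ~98% GPU / ~78% CPU,
    # HD 5850 at ~83% GPU / ~95% CPU on the same Athlon II X4.
    print(looks_cpu_bound([98, 97, 98], [75, 80, 78]))  # 650 Ti -> False
    print(looks_cpu_bound([75, 85, 90], [95, 96, 95]))  # HD 5850 -> True
    ```

    It's crude (vsync, power limits, and frame caps all skew utilization), but it captures why the same CPU could feed the 650 Ti and not the HD 5850.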
     
  9. DiceAir

    DiceAir Maha Guru

    Messages:
    1,369
    Likes Received:
    15
    GPU:
    Galax 980 ti HOF
    Funny thing I tested back when Mantle in Battlefield was running fine: I installed one of my R9 280Xs in an old Q6600 with 4GB DDR2 and a normal HDD, at 1080p 60Hz. On DX11 I got 40fps, and on Mantle about 80fps, maybe even more, so a huge jump in fps. It shows that even an old CPU like that can be enough for high-end games.
     
  10. Yecnot

    Yecnot Guest

    Messages:
    857
    Likes Received:
    0
    GPU:
    RTX 3080Ti
    Chill out bruh
     

  11. Deathchild

    Deathchild Ancient Guru

    Messages:
    3,969
    Likes Received:
    2
    GPU:
    -
    He's chill bro.
     
  12. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Just saying, since he's now making a big deal about lower-end CPUs.
    All fine and dandy, but there is a limit even there.

    And I get his point; I saw for myself that an AMD GPU needs a stronger CPU. I posted Watch Dogs driver overhead numbers, and even a 4770K @ 4.6GHz lags behind by 10-15fps vs the Nvidia counterpart (290X vs 780Ti). But then again, it's an Nvidia title, and those usually do worse on AMD. The new DiRT Rally is highly multi-threaded (an AMD title) and runs fine; a 290X can beat an NV 780Ti on the same CPU hardware.



    Anyway, nice to see you've improved your rig so much :thumbup: I still remember when you had a really low-budget system :)
     
  13. ObscureangelPT

    ObscureangelPT Guest

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
    I benchmarked the VSB drivers on my system.
    The results in the 3DMark overhead test:

    Omega:
    [IMG]

    15.4B:
    [IMG]

    VSB drivers:
    [IMG]



    Project Cars:
    [IMG]

    -.-!
     
  14. macmac9

    macmac9 Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    Asus 280X
    First, I'd like to thank all of you working at DF for bringing us the quality content you do!
    Now that's done, I'd like to ask: if/when you take a look at Project CARS's PC performance, could you consider comparing W8 and W10+15.2 drivers? Just to show people that there *is a better place, and it's W10, lol.
    I'll just leave my humble videos here:
    https://www.youtube.com/watch?v=XzFe5OOHZko
    https://www.youtube.com/watch?v=4U3h3QfsRho
    Yeah, they're crap, but they got the job done.

    *when they get it stable enough for production use.
     
  15. ObscureangelPT

    ObscureangelPT Guest

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe

  16. macmac9

    macmac9 Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    Asus 280X
    That's my video, and it is most certainly possible :)

    But you're getting zero boost - maybe that Phenom just buckles under all the AI you had on track?
     
    Last edited: May 8, 2015
  17. ObscureangelPT

    ObscureangelPT Guest

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
    The internet is getting small XD
    That could happen, but even with 19 AI or fewer at Spa, the framerate is below 30fps and GPU usage is at 50%.

    I've seen games with much more draw distance, object density and detail running so much better than this.

    The Phenom II X4 is obviously holding my system back, but this thing is something more.
    I remember benchmarking the early access with a GTX 770 lent by a friend, and I got +20FPS over what I got with the HD 7850.
    And my GPU usage was lower than 90% on the HD 7850.

    If the Phenom II X4 was holding me back, it would hold the GTX 770 back too.
    Either it's the typical AMD overhead problem, or it's something more from the SMS devs.
    I bet on both, but I especially blame SMS for this; their other games were exactly the same story for AMD users.

    Well, time to wait for some changes.
    Curious to see who will fix it, AMD or the devs; let's find out who is to blame.

    This kind of reminds me of the poor, buggy performance of Metro Last Light at launch. It was awful on AMD, barely ran above 20FPS XD. Everyone was putting the blame on AMD's drivers, then after 3 days the devs released a patch and everything was working fine at 60FPS on the same settings XD

    Same goes for Dying Light: we had been complaining about its poor performance and buggy CF support, and suddenly Techland released a patch fixing CF, so maybe it was Techland's fault all along XD
     
  18. Yecnot

    Yecnot Guest

    Messages:
    857
    Likes Received:
    0
    GPU:
    RTX 3080Ti
    It was low budget when it came out half a decade ago :p

    Btw, how's Watch Dogs doing lately? All the official benchmarks (TechSpot, G3D) date from release, and those make it seem not worth trying.
     
  19. macmac9

    macmac9 Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    Asus 280X
    It really is a small world if you're interested in amd driver overhead and project cars xD

    Indeed, if it ran faster with Nvidia, then it's the drivers. Windows 10 to the rescue! (Hope it doesn't crash on you and bury you under it, lol.)

    It has been proven it ain't Project CARS's fault that it runs up to 50% worse in W8 than in 10 :) Now whose fault is it? AMD's? I would bet on that horse. And it's foolish to point fingers anyway (yeah, I see what I just did), but the problem has been found and the only way to go is to solve it together. (awwww, how sweet)
     
  20. ObscureangelPT

    ObscureangelPT Guest

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
    Well, I just hope it gets fixed. I don't have a problem running at medium settings, but that framerate in some sort of simulator is far from ideal.

    About the fault being AMD's, not sure how we can conclude that yet :)
    Let's wait a little bit longer; I hope it gets fixed fast.
     
