High DX11 CPU overhead, very low performance.

Discussion in 'Videocards - AMD Radeon Drivers Section' started by PrMinisterGR, May 4, 2015.

  1. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Look what happened to John of RadeonPro. Raptr employed him and that's the end of that. Don't give them ideas, don't let them take asder00 and vbs away from us too :D
     
  2. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    You are wrong, Raptr is not AMD, it's a "close partner".

    So close a partner that it gets to bundle its crapware with the AMD drivers!

    :D
     
  3. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    yep total crapware :D
     
  4. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Stop giving Raptr ideas!
     

  5. The Mac

    The Mac Guest

    Messages:
    4,404
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    On a more positive note, if they got some decent people other than John working for them, maybe Raptr wouldn't be such a cluttered mess.

    I run it, but only for the free junk; I don't use it for anything other than having it sit in my tray.
     
  6. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    That depends on whether it's skill level that's holding back innovation, or funding, goals, etc. Same with the driver team. We can only speculate that they aren't very good or are lazy, but how much time do they actually spend on the drivers, how many people work on them, and is optimization a priority, or is it just hardcoding min/max limits onto new features?
     
  7. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    I think getting their driver to perform like NVIDIA's is maybe 500 man-hours of work. You want to tell me that a company that size can't hire 10 decent programmers to do such a job?
     
    Last edited: Jul 8, 2015
  8. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    They're losing money big time. Their market share is nosediving. Instead of focusing on innovation and improving, I really don't know what they're doing.
    The Fury X, for example: a good card, but priced too high! :D
     
  9. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    We can't deny that there has been a big driver improvement for DX11 over the last months in the W10 drivers (and the 300 series).

    That's true, but it's not enough, even combined with more powerful hardware like the Fury X, to beat NVIDIA's 980 Ti.

    The 3DMark DX11 API overhead test results for AMD and NVIDIA show us who wins the DX11 driver performance battle.

    The posted comparison video shows that the AMD GPU's power is underused because of a bottleneck, and that bottleneck is DX11: at some moments GPU usage drops as low as 57%, while the NVIDIA GPU stays far higher, almost always over 90% (a back-of-the-envelope model of this is sketched below).

    NVIDIA's advantage on the software front is so HUGE that it can overcome AMD's better hardware and deliver better performance and gameplay.

    Maybe AMD's GCN hardware is not the right platform for DX11, but for DX12.
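
    To make the "GPU waiting on one CPU submission thread" idea concrete, here is a minimal back-of-the-envelope model in Python. The draw-call count and the per-call driver cost are invented, not taken from the video or from 3DMark, and were chosen only so the output lands near the 57% figure quoted above:

        # Toy model of a draw-call-bound DX11 frame: one CPU thread submits
        # every draw call, while the GPU could finish the frame faster.
        # All numbers are illustrative placeholders.
        DRAW_CALLS_PER_FRAME = 3_500
        CPU_COST_PER_CALL_US = 5.0     # driver + runtime cost per call on one thread
        GPU_RENDER_TIME_MS = 10.0      # time the GPU actually needs per frame

        cpu_submit_ms = DRAW_CALLS_PER_FRAME * CPU_COST_PER_CALL_US / 1000.0
        frame_ms = max(cpu_submit_ms, GPU_RENDER_TIME_MS)   # the slower side sets the pace
        print(f"CPU submission: {cpu_submit_ms:.1f} ms, frame: {frame_ms:.1f} ms "
              f"({1000 / frame_ms:.0f} fps), GPU busy: {GPU_RENDER_TIME_MS / frame_ms:.0%}")

        # A leaner driver (half the per-call cost) removes the CPU wall entirely:
        cpu_submit_ms /= 2
        frame_ms = max(cpu_submit_ms, GPU_RENDER_TIME_MS)
        print(f"Leaner driver:  {cpu_submit_ms:.1f} ms, frame: {frame_ms:.1f} ms "
              f"({1000 / frame_ms:.0f} fps), GPU busy: {GPU_RENDER_TIME_MS / frame_ms:.0%}")

    Mantle and DX12 attack the same wall differently: instead of making one submission thread cheaper, they let several threads build command lists in parallel.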
     
  10. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    It would be really interesting to run some old drivers (like 3 years old) to see how the overhead performance compares with 15.6 now.
     

  11. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    Yep, that would be very interesting :) What was the first driver for the 290X?
     
  12. The Mac

    The Mac Guest

    Messages:
    4,404
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    13.10 or 13.11, most likely.

    They were released in October 2013.

    That was a year and a half ago, not three.

    lol
     
    Last edited: Jul 8, 2015
  13. MatrixNetrunner

    MatrixNetrunner Guest

    Messages:
    125
    Likes Received:
    0
    GPU:
    Powercolor PCS+ R9 270X
    I did a comparison test of different drivers with Tomb Raider (3 presets with 3 runs each), and there was no difference in performance from 13.4 up until 15.6. Only 15.150 and 15.200.1040 show a significant performance uplift (a sketch of how runs like these can be aggregated follows below). It looks like, for quite a while, driver work was focused on adding support for new hardware and on specific fixes for badly optimized games.

    On the other hand, NVIDIA was doing the same thing until AMD presented Mantle. Then, all of a sudden, they found more performance.
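
    For anyone repeating a comparison like this, here is a minimal Python sketch of the bookkeeping: average the three runs per preset, then compare each driver's overall average against a baseline. The driver names match the ones above, but the FPS values are placeholders, not the actual results:

        from statistics import mean

        # driver -> preset -> FPS of each of the 3 runs (placeholder numbers)
        results = {
            "13.4":        {"low": [95, 96, 94],    "high": [61, 60, 62], "ultra": [45, 44, 46]},
            "15.6":        {"low": [96, 95, 95],    "high": [62, 61, 61], "ultra": [45, 45, 46]},
            "15.200.1040": {"low": [108, 107, 109], "high": [70, 69, 68], "ultra": [52, 51, 51]},
        }

        baseline = mean(mean(runs) for runs in results["13.4"].values())
        for driver, presets in results.items():
            avg = mean(mean(runs) for runs in presets.values())
            print(f"{driver:>12}: {avg:5.1f} fps average ({(avg / baseline - 1) * 100:+.1f}% vs 13.4)")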
     
  14. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That was clear enough from the results with the newer driver fork, as well as with Windows 10. I have a question though, which might sound like criticism, but it isn't. There are games like AC Unity, or Assetto Corsa (similar to yours), that don't seem to have the same bottleneck. Is that because of the design of your engine? Are they doing something "less"? Also, since you referred to single-thread performance, how was NVIDIA's performance before the "DX11" driver (the one that went around when Mantle appeared)? Was it subpar like AMD's is now?

    Probably the last one. I have no evidence at all, but I still believe that the Fury X has not been properly "fed" (:D) yet. Not even in 4K.

    That's very nice to hear, thank you! If it is a 15% increase in minimum framerates, then it's huge. It's also funny how the improvement rate is almost half, exactly like the gap that the drivers from each company have in the overhead test in 3DMark.

    If you have the spare time, could you please do that? I plan to run a big comparison between 1040 and 15.6, and that tool would be heaven sent.

    Is it a really monumental task? How is it compared to DX9 to DX11? Does the code you already have for the consoles help at all? Is the end-code efficiency close to console-levels?

    Ok, here's my main question to you. We all know that the XBox One runs Windows and it has a DX11 implementation. That implementation has a driver/translation layer, because you obviously write DX11 code for GCN, and it works as it should. That driver is ridiculously efficient too (otherwise nothing would run properly on the XBone). Who wrote that driver, and if it is not AMD, how can AMD borrow/beg/steal it?

    Funny thing is that I had made the exact same observation. Now that AMD's CPU architecture will be more useful than ever, powerful single cores with HT make a comeback (with Zen). Also on the geeky side: Which one do you bet will be faster at multithreaded apps? An old AMD FX with 8 threads, or a comparable Intel with 4+4HT?

    So nothing was really being done for a year and a half, until this January, I guess. Or they already knew, and January was simply the first leak. Something else: yes, the 290X was introduced a bit more than a year and a half ago, and the 7970 (which is "legacy" for new features) was the top AMD card just 19 months ago. I mention that because I see people speaking about these cards as if they were 5 years old :p
     

  16. Spartan

    Spartan Guest

    Messages:
    676
    Likes Received:
    2
    GPU:
    R9 290 PCS+
    Well, I really hope Win10 + the Win10 AMD drivers will do something for my CPU, because I'm not even able to play 2015 titles properly with it. Properly means 1080p at 50-60 fps for me. In its current state, I expect to drop below 10 fps half of the time at medium settings in 2016 titles :S. ~50% total CPU usage is a joke in my opinion in games with this CPU (even when the game supports up to 8 cores...). Yes, you can say the FX-8350 is an old CPU, but as far as I know, old i5 CPUs can run 2015 titles without any problem. I should really have bought an i5 with a compatible mobo instead of my CPU + mobo + H80i (which is loud as hell) combo, but now I'm f***ed by AMD so hard that the only solution will be to drop my CPU (along with my mobo...) into the trash can in 2016. And when I get there, I'll probably throw out my H80i as well. "Advanced 3D gaming" my a**. They can be glad for that 25% CPU market share.
     
  17. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    The problem is a single-thread bottleneck... if you check Process Hacker in 90% of games you will see one thread sitting at 12.5% load, which is exactly 100% load on one core of an eight-thread CPU :D (a rough way to reproduce that check is sketched below).

    The funny thing is that an AMD CPU + AMD GPU performs about 30% worse than an AMD CPU + NVIDIA GPU.

    Once DX12 games arrive, your CPU will be comparable with an i7.
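
    For anyone without Process Hacker, here is a minimal Python sketch of that check using psutil (the PID and the sampling interval are placeholders; the point is just the 100%-of-one-core versus 12.5%-of-the-whole-machine arithmetic):

        import time
        import psutil  # pip install psutil; may need admin rights for another process

        PID = 1234            # replace with the game's process ID
        INTERVAL = 2.0        # seconds to sample over

        proc = psutil.Process(PID)
        cores = psutil.cpu_count(logical=True)

        # Cumulative CPU time per thread, sampled twice
        before = {t.id: t.user_time + t.system_time for t in proc.threads()}
        time.sleep(INTERVAL)
        after = {t.id: t.user_time + t.system_time for t in proc.threads()}

        busiest = max(after[tid] - before.get(tid, 0.0) for tid in after) / INTERVAL
        total = sum(after[tid] - before.get(tid, 0.0) for tid in after) / INTERVAL

        # On an 8-thread CPU, one fully pegged thread reads as 100/8 = 12.5% overall
        print(f"Busiest thread: {busiest:.0%} of one core")
        print(f"Whole process:  {total / cores:.0%} of the machine ({cores} logical cores)")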
     
  18. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    This is so incredible that even the bravest AMD PR person wouldn't dare to say it!

    :D
     
  19. Spartan

    Spartan Guest

    Messages:
    676
    Likes Received:
    2
    GPU:
    R9 290 PCS+
    Yes, I know that, except for this i7 thing, which is a "little bit" erh... never mind!
    Yesterday I tried Dragon Age: Inquisition (which supports Mantle) above the ultra preset, to figure out how much this low-level API thing would boost my performance or eliminate bottlenecks. Unfortunately I couldn't benchmark it properly, because my OSD (via RTSS) didn't work with Mantle (only with DX11), and neither did Fraps... So I had to use the in-game FPS counter. I actually did get some FPS boost, maybe 5-10 fps, but it's really hard to tell exactly without a proper Fraps benchmark. So I'm still flying blind with it. Also, I don't play BF4.
     
  20. OneB1t

    OneB1t Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    R9 290X@R9 390X 1050/1325
    Dragon Age has an integrated benchmark :)
     
