High DX11 CPU overhead, very low performance.

Discussion in 'Videocards - AMD Radeon Drivers Section' started by PrMinisterGR, May 4, 2015.

  1. Doom112

    Doom112 Guest

    Messages:
    204
    Likes Received:
    1
    GPU:
    MSI GTX 980 TF V
    This issue was reported back in April 2014, but no improvement has happened. AMD's biggest problem is ignorance, and the Red team has an even bigger attitude of ignoring things, which will lead nowhere.

    An example is AMD Roy, who believes there is nothing to worry about.
     
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Do you have any links for threads/tweets about it?
     
  3. Doom112

    Doom112 Guest

    Messages:
    204
    Likes Received:
    1
    GPU:
    MSI GTX 980 TF V
    Yes, there was, but I cannot find it now.

    Someone posted the Digital Foundry video of GTA V tested on an R9 280 and a GTX 750 Ti, and there was clearly a bottleneck, but AMD Roy said he couldn't see any problem there, and that was it.


    AMD Roy's tweets are mostly trolling and nothing interesting, and I wonder why AMD promotes him.
     
  4. xacid0

    xacid0 Guest

    Messages:
    443
    Likes Received:
    3
    GPU:
    Zotac GTX980Ti AMP! Omega
    Can you post this in OCN/LTT forum and tag Thracks? :p
     

  5. semitope

    semitope Guest

    Messages:
    36
    Likes Received:
    0
    GPU:
    iGPU
    Seems overblown. Though I feel for the people who buy a dual core and bottleneck their AMD cards, they still have a solid card. A faster CPU won't get you better performance from a 750 Ti, while the AMD card lets you keep it for more years, and CPU upgrades help its performance.

    The comparisons made are not adequate. There need to be more NVIDIA and AMD cards. A 750 Ti is not a better option if it's slower with better hardware, all things being equal. The real things to consider are cost and power consumption weighed against performance.
     
  6. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    Edited
     
    Last edited: May 12, 2015
  7. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,384
    Likes Received:
    387
    GPU:
    RTX 3090
    PrMinisterGR

    Great write-up. I read it all. Hopefully this thread sticks, and I hope some tech sites pick up the story, run it, and create awareness.

    You know what, I'm going to take a wild guess here and say that when the 380X and 390X are released, they're going to ship exclusive drivers that enable "Command Lists" only for the new hardware and claim past hardware cannot do it. If something like that happens, it will be the last nail in the coffin for me. After over a decade of continuously buying AMD GPUs and recommending them, I might make the switch. Again, thanks for writing this up.

    EDIT:

    To all programmers out there: there's a feature in RadeonPro called "Spoof Video Adapter" which makes games think you have a certain video card, rather than the one you're actually using.

    Is it possible to create a program that makes the AMD drivers think that the GAME I am running is in fact ANOTHER game? For example, I'd run GTA 5, but make the AMD drivers think I am running Civilization 5 (the only game we know of that received special treatment from AMD), then benchmark the game at lower resolutions, higher resolutions and with different CPUs, and see if anything changes.
     
    Last edited: May 4, 2015
  8. thatguy91

    thatguy91 Guest

    As I said in other threads, it's likely that full development of the drivers will be focused on Windows 10. This makes sense, seeing as Windows 10 is being actively pushed to existing Windows 7 and 8.1 users. NVIDIA and AMD are unlikely to be willing to support people who are stubborn about or reluctant to change to Windows 10. This isn't to say that Windows 7 and 8.1 drivers won't still be released, just that they'll be more workstation-oriented drivers.

    The thing with providing features for older cards, even when a GPU is rebranded, is that it is at the very least a new stepping of the GPU on which the new feature is supported.

    Unfortunately, both AMD and NVIDIA have recently been let down by the process tech, which has affected their releases and release cycles. It makes you wonder how many GPU designs they've scrapped that were meant to be on the 20 nm process (which we would already have had by now). Sure, some of these may have been a tick/tock (a similar GPU on a smaller process), but that's rather irrelevant now. Let's just hope the 14 nm process doesn't face the same issues!
     
  9. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    That Spoof Video Adapter does literally nothing; games can detect your video card properly anyway. That feature only ever had a use in one game, Spider-Man 2. You could try renaming GTA V's executable to Civilization 5's executable, but it's likely that it won't yield any better results.
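
    If anyone does want to try that rename test anyway, here is a rough sketch of the idea in code. The executable names and the assumption that the driver picks its profile purely from the process name are assumptions on my part, not confirmed:

    // Rough test sketch only: copy the game's executable under the name of
    // Civilization V's executable, so that any per-game driver detection that
    // keys off the process name would pick up Civ5's profile instead.
    // Both file names below are assumptions; adjust them to your own install.
    #include <filesystem>
    #include <iostream>

    int main()
    {
        namespace fs = std::filesystem;
        const fs::path original = "GTA5.exe";                // game under test (assumed name)
        const fs::path spoofed  = "CivilizationV_DX11.exe";  // Civ5 executable name (assumed)

        std::error_code ec;
        fs::copy_file(original, spoofed, fs::copy_options::overwrite_existing, ec);
        if (ec)
        {
            std::cerr << "Copy failed: " << ec.message() << "\n";
            return 1;
        }

        std::cout << "Launch " << spoofed << " and re-run the benchmark.\n";
        return 0;
    }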
     
  10. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,384
    Likes Received:
    387
    GPU:
    RTX 3090
    That's why I'm asking if we can have a proper workaround so that we make the DRIVER believe we are actually running Civ5 when we are running other games.

    EDIT: I'm aware this would probably require edited drivers too, ones that ONLY have the Civ5 profile and try to apply it to everything. I'm not sure if we're even allowed to do this here; that's why I asked if it's possible, for testing purposes only.
     
    Last edited: May 4, 2015

  11. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    Profiles and optimizations made for games are different things. For example, if I rename ACU.exe to BF4.exe, I get weird artifacts in the game. What you are suggesting isn't going to yield anything conclusive.
     
  12. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,384
    Likes Received:
    387
    GPU:
    RTX 3090
    Thanks for the reply. Yep, like you said, even if it works, running a game with another game's profile will ultimately change how it looks and performs, so the results won't be comparable or conclusive.
     
  13. Yecnot

    Yecnot Guest

    Messages:
    857
    Likes Received:
    0
    GPU:
    RTX 3080Ti
    Troll? :infinity:

    This isn't about support for older cards, it's about support for older APIs. If it's better for them to attempt that on Win10, that's fine. NVIDIA did it on Win7 and 8.1 though, with that hyped 337.50 :eyebrows:
     
    Last edited: May 4, 2015
  14. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    The only games confirmed to support DX11 multithreaded rendering, or more specifically "deferred contexts", are Civilization V and its expansions. Far Cry 3 supposedly supports it too, but it is disabled by default, requiring you to manually change it in the config file.

    The Battlefield games don't support deferred contexts; the technical director has even gone so far as to say that deferred contexts in DX11 are useless.
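
    For anyone wondering what "deferred contexts" actually are at the API level, here is a minimal sketch of the general pattern (not taken from any particular game, and assuming a device and immediate context already exist):

    // Minimal sketch of D3D11 "deferred contexts": record work on a deferred
    // context (typically from a worker thread), close it into a command list,
    // then replay it on the immediate context. Assumes 'device' and
    // 'immediateContext' were created elsewhere; error handling omitted.
    #include <d3d11.h>

    void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediateContext)
    {
        ID3D11DeviceContext* deferredContext = nullptr;
        ID3D11CommandList*   commandList     = nullptr;

        // A deferred context queues commands instead of executing them.
        device->CreateDeferredContext(0, &deferredContext);

        // ... set pipeline state and issue draw calls on deferredContext here ...

        // Close the recording into a reusable command list.
        deferredContext->FinishCommandList(FALSE, &commandList);

        // Back on the rendering thread: replay everything at once.
        immediateContext->ExecuteCommandList(commandList, TRUE);

        commandList->Release();
        deferredContext->Release();
    }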


    Regarding DX11.3 and DX12, DX11.x will continue to be supported for quite a while. Even though Win10 is initially a free upgrade, it'll still take many months until enough people have upgraded. DX11 will also still be used for relatively undemanding games, so they'll work on DX10 hardware as well as on the DX11 hardware that won't support DX12.
     
  15. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    The 270X is nowhere near double the 750 Ti. Not even close to double. It's somewhat faster, but that's all. The 260X is slower than the 750 Ti.
     

  16. macmac9

    macmac9 Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    Asus 280X
    Speaking of Windows 10, the WDDM 2.0 drivers currently provided with it seem to reduce draw call cost (in DX11) quite a bit, helping a lot in driver-bound areas. Below is a test I made with Project CARS a month ago.
    As a zero-poster I can't embed an image, but in short: the average fps increased by 26%, the maximum increase was close to 50%, and it was a mixed situation, partly CPU-limited and partly GPU-limited.
    Image at: http://i.imgur.com/d7dfO8t.png
    The only problem is that the drivers don't seem to be production quality just yet (expected).
     
  17. Vbs

    Vbs Guest

    Messages:
    291
    Likes Received:
    0
    GPU:
    Asus Strix 970, 1506/7806
    PrMinisterGR, thanks for the very good summary on the state of affairs in the AMD camp. :)

    I agree with your view, and I personally think Mantle was the biggest poker move AMD has made in years.

    What do you do when you're falling behind in the drivers race? You change the game. If you can't win the race, you stop racing. And that's exactly what they did.

    Mantle started a cascade of events. First, it killed two birds with one stone, leveling the playing field: enormous gains in performance, while leaving most driver-level responsibility in the hands of game engine programmers. "Yes, you have to program the hardware at a low level now, but look at that performance!" (AMD has quite a history of leaving to third parties what NVIDIA implements in-house. Sad. :( )

    Of course, the stars were aligned with the launch of Windows 10. AMD knew MS would want to call and raise with DX12, to capitalize on all of Mantle's benefits and stop it from gaining traction. MS also wants most game engines to adopt DX12 as fast as possible, to push Windows 10 adoption and squeeze any leftover performance out of an underpowered Xbox One.

    AMD threw the bait and MS ate it. Well done! :3eyes:
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    The problem is that if you don't have a top-end CPU, you won't get what you paid for with AMD, while NVIDIA gives you almost the full performance of their hardware with an i3. The additional problem is that you end up with more expensive cards performing worse than cheaper cards, all things being equal.

    The images don't seem to be loading due to a certificate error here. Any chance you might link to the original article?

    The whole VSR thing made me believe that that's exactly what they're going to do. Old-schoolers will remember what happened with SSAA in the days of yore, when they basically said "because we don't have enough financial incentive", as if their clients' goodwill is not incentive enough. Someone in there is being stupid, and we have to find out who.
    As for the game, I'll try to see what DX Caps Viewer reports when I change its .exe name to CivV. My guess is that nothing will change, but we'll see.

    I don't mind the development being focused on Windows 10 at all. I also use a rolling-release Linux distribution, and it feels to me that if you use a piece of software, running the latest version is the logical thing to do. The issue is that the bottleneck in the 3DMark test is still there in Windows 10. About the other issue you mention, the best thing to ever happen to the 7970 was the R9 280X :D
    They now have to support the people who buy them today.

    It was considered useless because half the GPU market (back then) didn't support Command Lists. The problem with these features is that they are part of the optional subset of the DX11 spec. Meanwhile, surprise surprise, the company that put in the resources to support the optional multithreading features that games don't use gets 30-100% better driver efficiency with multiple cores under said API, no matter the game.
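
    For reference, whether the driver really supports command lists is an optional cap you can query from the D3D11 runtime itself; this is essentially what DX Caps Viewer reports. A minimal sketch, assuming an ID3D11Device has already been created:

    // Minimal sketch: query the optional DX11 threading caps from the driver
    // (the same bits DX Caps Viewer shows). When DriverCommandLists is FALSE,
    // the runtime emulates command lists in software on the immediate context.
    #include <d3d11.h>
    #include <cstdio>

    void PrintThreadingCaps(ID3D11Device* device)
    {
        D3D11_FEATURE_DATA_THREADING caps = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                                  &caps, sizeof(caps))))
        {
            std::printf("DriverConcurrentCreates: %d\n", caps.DriverConcurrentCreates);
            std::printf("DriverCommandLists:      %d\n", caps.DriverCommandLists);
        }
    }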

    Many people believe that DX12 is some kind of magic pill, or that because it is supported by all the main engines it will somehow magically solve performance problems. The truth is that DX12 (like all low-level APIs) moves the work from the driver developers to the game developers. That worries me more for the future, believe it or not. It is also the reason Microsoft is introducing DX11.3 alongside DX12.

    That might be just a profile for the game in the drivers. Could you rename the .exe of the game and run the tests again please? :)

    The 260X is the same as or faster than the 750 Ti in all the games when both are paired with a fast CPU, as you can see in the screenshots at the beginning. The R9 270X is LITERALLY double the card the GTX 750 Ti is:
    [IMG]
    Source: GPUBoss, Anandtech.

    It was exactly like that, actually. What I'm afraid of is that they'll leave their DX11 drivers to rot :/
     
  19. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,384
    Likes Received:
    387
    GPU:
    RTX 3090
    It would be really interesting to see whether renaming a game's executable to CivV improves or decreases its performance, though (using weaker CPUs).
     
  20. semitope

    semitope Guest

    Messages:
    36
    Likes Received:
    0
    GPU:
    iGPU
    Not seeing the issue. Compared to the stock 960, the results are reasonable. There is also the consideration of which games favor which brand, e.g. Watch Dogs doing better on NVIDIA would be expected.

    You're really just throwing around benchmarks without proper consideration of potentially relevant factors. Unless this issue is actually singled out for testing, all the posts are irresponsible at best and propaganda at worst.

    That is very much not true. Considering your own posts, both brands' higher-end cards suffer immensely with the i3. Only the bottom-of-the-barrel 750 Ti is not bottlenecked (expected), and buying that over a stronger card is silly unless you intend no upgrades in the future (or plan to sell). The main issue I saw was minimum fps, since the averages ended up mostly similar. One would simply assume the i3 can only get that average fps out of any GPU.

    This interpretation of the results is not logical. The 750 Ti is not bottlenecked by the CPU, so it does not lose much from its already lower performance numbers the way a bottlenecked GPU does. Saying it loses little is therefore simply not relevant. The same drops in performance are seen with NVIDIA's own higher-end GPUs when they are bottlenecked by the i3. At best you can say that AMD's cards have worse minimums when bottlenecked.

    This thread should be tossed, as it will mislead users with faulty logic in favor of NVIDIA. It reads way too much like Russian-style propaganda.
     
    Last edited: May 4, 2015

Share This Page