
Rumor: AMD cancels the RX Vega Primitive Shaders driver path

Discussion in 'Frontpage news' started by NvidiaFreak650, Jan 23, 2018.

  1. NvidiaFreak650

    NvidiaFreak650 Master Guru

    Messages:
    259
    Likes Received:
    93
    GPU:
    RX Vega 64 8GB HBM2
    After CES 2018, AMD told people in a breakout session that they have cancelled the implicit driver path for Primitive Shaders; they are now offering only explicit control to developers who wish to implement it. So Primitive Shaders will only ever be used if a developer properly codes for them. The driver portion has been cancelled.

    This comes directly from Marc Sauter (y33H@), editor at the German IT site Golem.de.

    Source: https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11610696#post11610696

    Source 2: https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11611522#post11611522

    Source came from: https://www.reddit.com/r/Amd/comments/7sedpq/amd_cancels_the_driver_path_for_primitive_shaders/

    If this is true, a lawsuit could be coming from those who own an RX Vega GPU.
     
    warlord likes this.
  2. -Tj-

    -Tj- Ancient Guru

    Messages:
    15,168
    Likes Received:
    648
    GPU:
    Zotac GTX980Ti OC
    Idk, it is shady, no denying that..

    But then again, Nvidia promoted async compute with Maxwell and you saw how that turned out..
     
  3. warlord

    warlord Maha Guru

    Messages:
    1,295
    Likes Received:
    241
    GPU:
    R9 390X MSI(Gaming)
    Well, no performance loss, but there will be no performance gain either. Vega owners are ****** just like Fury owners were after years. A wannabe enthusiast GPU, "FINEWINE" edition. Many promises and useless features all over the place. AMD has officially won the hype race. Neither Intel nor Nvidia ever lied so much before. Look at the Ryzen and Vega slides, then look at the real performance numbers and satisfaction levels. AMD chose the wrong way. They are gonna lose all their RX and Ryzen users faster than it took to acquire them. RIP.
     
  4. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    11,856
    Likes Received:
    129
    GPU:
    Sapp. RX Vega 64 LC
    Well, the devs won't implement them, as AMD hardly has any market share left at all now.
     

  5. Denial

    Denial Ancient Guru

    Messages:
    11,430
    Likes Received:
    415
    GPU:
    EVGA 1080Ti
    Maxwell supports async compute; its implementation just wasn't as robust as AMD's specific definition, so I'm not sure how that relates.
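
    To be concrete about what "async compute" even means at the API level: in D3D12 it's simply work submitted on a separate compute queue, and whether the queues actually overlap in execution is up to the hardware/driver - which is exactly where Maxwell and GCN differed. A minimal sketch of creating the two queues (the function name is mine, error handling omitted):

    ```cpp
    #include <windows.h>
    #include <d3d12.h>

    // "Async compute" at the D3D12 level: a second queue of type COMPUTE.
    // Whether its work runs concurrently with the DIRECT (graphics) queue
    // is a hardware/driver decision. `device` is assumed to be a valid
    // ID3D12Device*; all error checking is omitted for brevity.
    void CreateQueues(ID3D12Device* device,
                      ID3D12CommandQueue** graphicsQueue,
                      ID3D12CommandQueue** computeQueue)
    {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(graphicsQueue));

        desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(computeQueue));
    }
    ```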
     
    Last edited: Jan 23, 2018
  6. NvidiaFreak650

    NvidiaFreak650 Master Guru

    Messages:
    259
    Likes Received:
    93
    GPU:
    RX Vega 64 8GB HBM2
    People have the right to get mad when companies do things like this behind customers' backs, like what Apple did by slowing down older phones without letting people know.

    AMD had 6 months to let everyone know but failed to do so.
     
  7. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    8,674
    Likes Received:
    906
    GPU:
    1080Ti @h2o
    Well, here I was, already thinking "another fanboy bashing for nothing" (sry @NvidiaFreak650 ).
    Then I read up on what primitive shaders even are, and what they are supposed to do.

    Now I don't understand why they are cancelling it. Didn't it work before? In theory that's something really useful...
     
  8. Denial

    Denial Ancient Guru

    Messages:
    11,430
    Likes Received:
    415
    GPU:
    EVGA 1080Ti
    They aren't canceling it - the developer can still manually support it, but initially AMD's driver was supposed to automagically replace the geometry shaders whenever it felt it could provide a benefit to geometry performance. It probably led to flickering/image artifact issues that they couldn't resolve in the automated process. So now the developer has to decide when to use it - which becomes a question of support: if the PS4 Pro and Xbox One X support primitive shaders, then it's possible we will see some games with it. If it's just desktop Vega, I doubt we'll see much of anything. That being said, is geometry performance really that much of a bottleneck with Vega? I genuinely don't know.
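
    To make the implicit-vs-explicit distinction concrete: roughly, the cancelled path had the driver deciding per pipeline, while the surviving path leaves that decision to the developer. A purely hypothetical sketch - none of these types or functions are real AMD interfaces, since the actual driver internals were never made public:

    ```cpp
    #include <memory>

    // Invented stand-ins for the driver's pipeline compiler; nothing here
    // is AMD's real code or API.
    struct PipelineDesc { bool usePrimitiveShader = false; /* shaders, state... */ };
    struct Pipeline {};

    std::unique_ptr<Pipeline> CompileClassic(const PipelineDesc&)         { return std::make_unique<Pipeline>(); }
    std::unique_ptr<Pipeline> CompilePrimitiveShader(const PipelineDesc&) { return std::make_unique<Pipeline>(); }
    bool DriverPredictsGeometryWin(const PipelineDesc&)                   { return false; }  // heuristic

    // Implicit path (reportedly cancelled): the driver silently swaps in a
    // primitive-shader variant whenever its heuristic predicts a win. The
    // swap has to be bit-exact with the classic geometry pipeline, which is
    // where hard-to-fix artifacts could creep in.
    std::unique_ptr<Pipeline> CreateImplicit(const PipelineDesc& desc) {
        return DriverPredictsGeometryWin(desc) ? CompilePrimitiveShader(desc)
                                               : CompileClassic(desc);
    }

    // Explicit path (what remains): nothing changes unless the developer
    // opts in per pipeline, so correctness becomes the developer's problem.
    std::unique_ptr<Pipeline> CreateExplicit(const PipelineDesc& desc) {
        return desc.usePrimitiveShader ? CompilePrimitiveShader(desc)
                                       : CompileClassic(desc);
    }
    ```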
     
    Evildead666, Silva and fantaskarsef like this.
  9. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    8,674
    Likes Received:
    906
    GPU:
    1080Ti @h2o
    Again, thanks for explaining.
     
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    15,168
    Likes Received:
    648
    GPU:
    Zotac GTX980Ti OC
    Disabling it specifically for Maxwell, or having it cost performance when enabled, sounds about right.

    My point was that there's nothing new about promoting certain features and then ditching them for inefficiency or compatibility reasons..
     

  11. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    13,789
    Likes Received:
    356
    GPU:
    Nvidia Geforce GTX 960M
    This is a little different... What Apple did wasn't right of course, but this was something AMD promised. That was a feature people invested in Vega for, and now they aren't going to get it. AMD made their money with mining, and you see how well Vega does at mining, so of course they aren't going to go through with this anymore. Basically people bought a product that wasn't finished - beta, early access, whatever you want to call it. Instead of releasing the product fully complete like they should have done, they release a half-assed final product, and for some reason market trends seem to agree with this model.

    So it would be totally different if Vega competed without these shaders, but it hardly does... That on top of the outrageous price due to mining, it really is clear to see where their priorities are. They are a business, though, so I'm not surprised that they want to do anything to turn a profit.
     
  12. Alessio1989

    Alessio1989 Master Guru

    Messages:
    938
    Likes Received:
    60
    GPU:
    .
    No API support, no party. Hopefully in DirectX 13 (pardon, DirectX 14) and Vulkan 2.0 there will be a unification of the geometry pipeline....
     
  13. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,962
    Likes Received:
    619
    GPU:
    -NDA +AW@240Hz
    There are quite a few wrappers/injectors for DX games. I guess those people could set this flag and override the developer's choice.
    Or even enable it for older games.
     
  14. RealNC

    RealNC Ancient Guru

    Messages:
    1,990
    Likes Received:
    412
    GPU:
    EVGA GTX 980 Ti FTW
  15. Denial

    Denial Ancient Guru

    Messages:
    11,430
    Likes Received:
    415
    GPU:
    EVGA 1080Ti
    DSBR is active on Vega FE but only utilized in some pro applications - I haven't really heard about its status on RX Vega cards.

    DSBR isn't really the catch-all performance increase and power decrease that lots of people believe it to be. Nvidia's Tom Petersen was interviewed on PC Perspective about it and basically said they enable it on a profile basis - it can actually decrease shader performance, so it's only useful when memory bandwidth is the bottleneck and not shader performance. It's also known to cause graphical issues in some games/applications.
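
    For anyone wondering what the "binning" part actually is: the rasterizer sorts triangles into screen tiles first, then rasterizes/shades tile by tile so each tile's framebuffer traffic stays in on-chip cache instead of round-tripping to HBM2. A toy CPU-side sketch of just the binning step (illustrative names and tile size; real hardware bins fixed-size batches of the draw stream and this is nothing like AMD's actual implementation):

    ```cpp
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    constexpr int kTileSize = 32;  // illustrative tile dimension in pixels

    struct Triangle { float minX, minY, maxX, maxY; };  // screen-space bounds

    // Sort triangle indices into per-tile bins; each bin would then be
    // rasterized and shaded in one locality-friendly pass.
    std::vector<std::vector<uint32_t>> BinTriangles(
        const std::vector<Triangle>& tris, int width, int height)
    {
        const int tilesX = (width  + kTileSize - 1) / kTileSize;
        const int tilesY = (height + kTileSize - 1) / kTileSize;
        std::vector<std::vector<uint32_t>> bins(tilesX * tilesY);

        for (uint32_t i = 0; i < tris.size(); ++i) {
            // Conservatively bin the triangle into every tile its bounding
            // box touches, clamped to the screen.
            const int x0 = std::max(0, static_cast<int>(tris[i].minX) / kTileSize);
            const int y0 = std::max(0, static_cast<int>(tris[i].minY) / kTileSize);
            const int x1 = std::min(tilesX - 1, static_cast<int>(tris[i].maxX) / kTileSize);
            const int y1 = std::min(tilesY - 1, static_cast<int>(tris[i].maxY) / kTileSize);
            for (int ty = y0; ty <= y1; ++ty)
                for (int tx = x0; tx <= x1; ++tx)
                    bins[ty * tilesX + tx].push_back(i);
        }
        return bins;
    }
    ```

    That also shows why it isn't a free win: the binning pass is extra work up front, which is pure overhead whenever shading rather than bandwidth is the limit.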
     

  16. Alessio1989

    Alessio1989 Master Guru

    Messages:
    938
    Likes Received:
    60
    GPU:
    .
    Why should they? Developers will never care about such a huge difference in the shader pipeline just for one architecture (Vega) on one platform (PC).
    Developers care more about features that are available on consoles and that could or will be available across many architectures on PC, like the features coming with Shader Model 6.1 and 6.2.
     
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,962
    Likes Received:
    619
    GPU:
    -NDA +AW@240Hz
    I am not writing about game developers, but about the people behind software like ReShade or other wrappers/injectors, which can override anything the developer of a game implements or doesn't.
    I have seen wrappers adding shader code to old DX7 games, making them effectively DX8.1.
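
    For anyone curious how those wrappers hook in: the classic trick is a proxy DLL with the same name as the runtime, dropped next to the game's exe. A minimal sketch for d3d9.dll (illustrative only; a real wrapper forwards every export and returns wrapping objects around IDirect3D9/IDirect3DDevice9 so it can see and override every draw call and shader the game issues):

    ```cpp
    #include <windows.h>
    #include <d3d9.h>

    typedef IDirect3D9* (WINAPI* PFN_Direct3DCreate9)(UINT);

    // The game loads this fake d3d9.dll from its own folder; we forward
    // Direct3DCreate9 to the genuine system DLL. Error handling is minimal.
    extern "C" __declspec(dllexport) IDirect3D9* WINAPI Direct3DCreate9(UINT sdkVersion)
    {
        char path[MAX_PATH];
        GetSystemDirectoryA(path, MAX_PATH);
        lstrcatA(path, "\\d3d9.dll");            // the real Direct3D 9 runtime
        HMODULE real = LoadLibraryA(path);
        if (!real) return nullptr;

        auto create = reinterpret_cast<PFN_Direct3DCreate9>(
            GetProcAddress(real, "Direct3DCreate9"));

        // A real wrapper would wrap the returned interface before handing
        // it back - that wrapper is the hook point for overriding whatever
        // the game's developer did (or didn't) implement.
        return create ? create(sdkVersion) : nullptr;
    }
    ```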
     
  18. Kaarme

    Kaarme Maha Guru

    Messages:
    1,163
    Likes Received:
    173
    GPU:
    Sapphire 390
    Intel had better keep Koduri on a tight leash so that future Intel iGPUs aren't burdened with features that won't work and need to be quietly forgotten behind the scenes.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    11,430
    Likes Received:
    415
    GPU:
    EVGA 1080Ti
    To be fair, these features, both DSBR and Primitive Shaders, require a lot of software development work, testing, etc. We have no idea what went on behind the scenes as far as RTG's budget and what got allocated towards the Zen project in the last few years. Typically it's about 2-3 years from the initial design of a GPU architecture to the final shipping product. So Raja most likely planned these features with the expectation that he would have a software team capable of delivering them, and then who knows what happened.

    I'm not even sure what RTG's way forward is at this point - I think AMD needs to be firing on all cylinders in its CPU division to keep Zen competitive, especially now that Intel is tripping up. On the GPU side, I don't think they have a chance at beating Nvidia. I think their Polaris strategy of just targeting the masses with mid/low-tier cards at extremely competitive price points is really the only way forward, because it's extremely safe. I know people are hoping for a crazy MCM setup on the GPU side, but I think the engineering effort, both software and hardware, is not something AMD can afford right now; if anything goes wrong and the product is a failure, it's game over.

    It's going to be interesting to see how it all plays out.
     
    fantaskarsef and yasamoka like this.
  20. user1

    user1 Master Guru

    Messages:
    890
    Likes Received:
    127
    GPU:
    hd 6870
    It ain't over till the fat lady sings.

    That being said, this is not entirely surprising;
    implementing them implicitly would be an insane amount of work, and many things can go wrong.
     
