Nvidia pressured Oxide dev to disable certain settings in DX12 benchmark

Discussion in 'Frontpage news' started by AvengerUK, Aug 31, 2015.

  1. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
It's Nvidia who asked for this feature to be disabled, not Oxide who did it on their own..

    http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/1210#post_24357053

I don't know if this is just a question of Nvidia needing a bit more time to fully support async compute on Maxwell 2.0... Unlike AMD, this is pretty new in their architecture, and anyway, no game on PC has used it so far.
     
    Last edited: Aug 31, 2015
  2. HaarKaaN

    HaarKaaN Guest

    Messages:
    10
    Likes Received:
    0
    GPU:
    GTX 1070 FE 2202/9600
What is Async Compute? Sorry, but I don't know that much about GPUs :)
     
  3. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
  4. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,408
    Likes Received:
    423
    GPU:
    Sapphire Fury
    And this is why I went Fury.
     

  5. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,516
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
It's a blessing the consoles got AMD hardware. Otherwise AMD, with its shrinking market share, might not have had the muscle to push this innovation. Now, thanks to the consoles alone, game studios can't totally ignore it despite the PC market leader Nvidia's lack of support for the time being. It would seem reasonable to assume Nvidia will cut the dividends a bit to implement this broadly in their next generation. Who knows if they would have otherwise, or how late they would have been if they had.
     
  6. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
  7. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
This hardware feature is exposed in DX12, Vulkan (and Mantle, of course), and the console APIs..

Well, GCN has supported it since the 2011 GPUs (7000 series).. Normally Maxwell (2.0) should be the only Nvidia GPUs to support it, but who knows exactly how they have implemented it, whether it works as intended, or whether it's just a question of work on their side.

The real problem is more that DX11 did not support it, and so it has not been used by PC developers. If it had been exposed earlier in DirectX, I'm pretty sure devs would have used it as soon as they could.

The funny thing is that GCN effectively implemented it a long time ago; the architecture was designed with it in mind from the start (the ACEs (Asynchronous Compute Engines) and the DMA copy engines have been there since day one).. Looks like AMD anticipated a bit what would come next, or thought it would be used way earlier, in DX11.
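For anyone wondering what async compute actually buys you, here's a tiny Python sketch of the idea (purely illustrative: the pass names and timings are invented, and this is not real GPU code). With a single queue, graphics passes and compute dispatches run back-to-back; with a separate compute queue, independent compute work overlaps the graphics work, so the frame finishes when the longer queue drains.

```python
# Toy model of async compute (hypothetical timings, in milliseconds).
graphics_work = [("g-buffer", 4.0), ("shadows", 3.0), ("lighting", 5.0)]
compute_work  = [("particles", 2.0), ("light culling", 3.0)]

def serial_frame_time(gfx, comp):
    """One queue: every pass and dispatch runs back-to-back."""
    return sum(t for _, t in gfx) + sum(t for _, t in comp)

def async_frame_time(gfx, comp):
    """Two queues: compute overlaps graphics; the frame ends when
    the longer of the two queues has drained (ideal overlap, no
    contention for shared GPU resources)."""
    return max(sum(t for _, t in gfx), sum(t for _, t in comp))

print(serial_frame_time(graphics_work, compute_work))  # 17.0
print(async_frame_time(graphics_work, compute_work))   # 12.0
```

In the real APIs this shows up as separate queues: D3D12 lets you create a compute-type command queue alongside the direct (graphics) queue, and Vulkan exposes compute-capable queue families; on GCN it's the ACEs that feed those compute queues in hardware. The toy model above obviously ignores contention for shared units, which is part of why real-world gains vary.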
     
    Last edited: Aug 31, 2015
  8. ddelamare

    ddelamare Guest

    Messages:
    224
    Likes Received:
    5
    GPU:
    Inno3D GTX 1070 X4
Can anyone tell me what would have happened if Oxide had decided to go along with Nvidia and disabled the async feature in DX12 for this specific title? How would that have translated for AMD and AMD customers?

Can you also tell me why they would prevent a game from using a DX12 feature, which is exactly the type of feature that every gamer in the world is looking forward to?

    It makes you think about other 'agreements' with other game studios...
     
  9. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
Another thread? That's the fifth or sixth thread regarding the AoS benchmark, and the third regarding asynchronous shaders and Nvidia's GPUs not supporting them natively.
And what does it lead to? Spamming, fanboy flaming, no real gain to anybody. Well done keeping that discussion going on and on and on and on...


A single DX12 benchmark isn't enough for a valid statement about the DX12 environment. If Nvidia gets owned in each and every game, then it's AMD's turn for a change. But that's far from certain. It's a single dev's pre-beta game, unfinished drivers, probably an API that nobody really knows how to handle yet, and lots of uncertainty all around. Move on people, and don't forget your tin foil hats. We will meet again once DX12 is available to consumers and the first native DX12 games hit the shelves (not early access games or those with an added patch).
     
  10. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    QFT.

    I don't see why this one pre-alpha bench by a dev with a shady history of helping push AMD's narrative is getting so much play.
     

  11. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
But it is a good start, you know, to get the gaming industry moving from obsolete and hidden things to something more open to consumers and the tech press.
Who can get things moving? The corporations who hide things and make us pay double or triple for products that are obsolete after a single year???

If we don't discuss, share, and put the truth in their faces, then what happens when things get worse than they are today?

OK, it's an alpha game, but what's behind the GCN and Maxwell architectures cannot stay hidden.
What if Maxwell or GCN 1.2/1.3 cannot support the entire DX12 feature set, and you buy a card after being lied to by corporate PR, and in the end you can't play at the full speed they claim in the marketing brochures or the web press?
Will you then sell that card and buy, once again, a card from that same corporation? Nice one.
     
    Last edited: Aug 31, 2015
  12. rl66

    rl66 Ancient Guru

    Messages:
    3,931
    Likes Received:
    840
    GPU:
    Sapphire RX 6700 XT
Personally, I don't care, as neither brand has really optimised for DX12 right now...

Both lie and try to hide their weaknesses on this or that point... both do, and so what? It's like that with each new DX version...
     
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
Theoretically you are right. But practically, what we talk about here has no influence on what devs, GPU manufacturers, and M$ come up with. Or why do you think it took them years to work on HBM, and years to come up with Mantle and/or DX12? And the list goes on with techs that could have been available and used, but never saw broad usage.
     
  14. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
Well, all this new tech comes to market slowly because the industry is headed by grumpy old men, you know?
In every tech corporation from Apple to M$$$ you see only old men, who teach us or give us tech that becomes obsolete because it launched too late. And on and on...
In the same way, standardisation isn't flexible and doesn't arrive in time, because too many old men are in charge.

Things need to move fast, because the world is not going to "stay a while and listen".
     
    Last edited: Aug 31, 2015
  15. poornaprakash

    poornaprakash Active Member

    Messages:
    94
    Likes Received:
    18
    GPU:
    AMD/Nvidia
I was shocked by such rude, unethical business practices from Nvidia. It may be their "successful" business model of eliminating competition in an illegal way, but for consumers, having to deal with a monopoly is a terrible thing. Why do such practices get praised by Nvidia fanboys?? If there is no competition, Nvidia fanboys themselves will have to pay more and more for their favorite Nvidia hardware, with fewer innovations happening. It's just plain stupid to support business practices that affect all consumers, not just AMD users.
     
    Last edited: Aug 31, 2015

  16. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
Yep, Nvidia would be run just like AMD. Hype campaigns with false claims. Lose $140M in one year, destroying their investors' portfolios, all while giving their president and CEO huge bonuses for a job well done.

    Just showing the other side of the argument.
     
  17. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    How many times do I have to say this? Intel and Nvidia need AMD. Stop spreading 'everyone wants to kill AMD' rumors because they're not true.

    And regarding the topic, people like you are trying to hype up stuff like this time and time again, like with the 970 which turned into a huge fiasco even though the cards were performing properly. It took a damn VRAM benchmark to figure out that the RAM was acting weird, go figure. And then came the cowboys running SoM at 4k with ridiculous settings, passing 4GB of VRAM usage and saying "OMG THE CARD STUTTERS". Taaatataaa thank you captain obvious, we all know what happens to cards when they run out of VRAM.

To refocus: you do not know why Nvidia asked for async to be turned off. Maybe it's indeed because Maxwell does not natively support it. Or maybe it's because their drivers are not ready for it, and the fact that the driver IS AWARE of it is a further indication that the implementation might be a work in progress. Or maybe Oxide's implementation sucks. You don't know. Only their tech specialists know.

    I have no idea why this is getting so much attention, especially considering Oxide is known to be shady. Not to mention the fact that it's just one title. Many of you are quick to judge performance by one single title. By that logic, why don't we use Bioshock Infinite to definitively measure performance? Because it's an Nvidia-favoring title, that's why. Just as Ashes might be an AMD-favoring title. You can't know.
     
  18. INSTG8R

    INSTG8R Guest

    Messages:
    1,659
    Likes Received:
    95
    GPU:
    Nitro+ 5700XT
Yeah, but this isn't even about GameWorks or Gaming Evolved. This is about hardware-level features that both sides "supposedly" could do. Oxide quickly figured out that Nvidia couldn't and disabled it, and of course Nvidia in turn asked them to disable it.

    Pretty clean cut if you ask me.
     
    Last edited: Aug 31, 2015
  19. kinggavin

    kinggavin Guest

    Messages:
    297
    Likes Received:
    0
    GPU:
    gtx 980TI super jetstream
Personally, I would like AMD, Nvidia, and the console makers Sony and Microsoft to stop and just make sure games run well at launch: better optimized, smaller game sizes, fewer patches. That is more important than the market competition. I would prefer the focus to be on power usage, heat, and the quality of the product.
     
  20. AMD made a new API from scratch and didn't pocket any money from customers for it. Then they literally gave it away so we could have DX12 and Vulkan. They made FreeSync, for which they didn't pocket consumers' money either, and set the ball rolling for Adaptive-Sync. Now we get async compute, which had been sitting on our cards but never really used.

    Whereas Nvidia has been a massive douche, rigging 700 series performance and mobile GPU overclocking. Ruining games with Gamewrecks. Pissing off Linus Torvalds, pissing off Oxide, pissing off their most valuable Titan customers with x80 Ti cards.

    Would you fault me for not wanting to buy Nvidia products?
     
