Discussion in 'Frontpage news' started by (.)(.), Sep 1, 2015.
AMD: “There’s no such thing as ‘full support’ for DX12 today”, Lists Missing DX12 Features For Fury X
*Grabs a coffee*
OT, what are these GPU vendors up to :3eyes:
Marketing and hyperbole from both sides.
In fairness to AMD, they showed the Fury as having tier 2 (feature level 12_0) DX12 support, so clearly it's not 100% DX12 compliant, since it would need to be tier 3 (12_1) to be 100% compliant. I was kind of surprised they didn't list it as tier 3 (12_1), but I guess they didn't want it to backfire if something doesn't work or perform like it should.
Which you can see here. So they have not advertised it as 100% compliant, unlike their competitor.
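The tier/feature-level argument above boils down to a simple ordering: a GPU reports a maximum feature level, and "full DX12" in marketing usually means reaching 12_1. A rough sketch of that check (the numeric values mirror Direct3D's `D3D_FEATURE_LEVEL` enum constants; `meets_level` is my own illustrative helper, not a real D3D12 API call):

```python
# Numeric values mirror the D3D_FEATURE_LEVEL enum in d3dcommon.h.
FEATURE_LEVELS = {
    "11_0": 0xB000,
    "11_1": 0xB100,
    "12_0": 0xC000,  # what AMD advertises for Fury X
    "12_1": 0xC100,  # what Nvidia advertises for Maxwell 2
}

def meets_level(reported: str, required: str) -> bool:
    """Feature levels are strictly ordered supersets, so a plain
    numeric comparison answers 'does this GPU reach level X?'."""
    return FEATURE_LEVELS[reported] >= FEATURE_LEVELS[required]

print(meets_level("12_0", "12_1"))  # False: a 12_0 card is not "full" 12_1
print(meets_level("12_1", "12_0"))  # True: 12_1 includes everything in 12_0
```

Note that feature levels are separate from the optional capability tiers (resource binding, tiled resources, etc.) argued about later in the thread, which is part of why "full DX12 support" is such a slippery claim.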
Why is this news? We've known it for a very long time. Any kind of 'news' like this will not remove nVidia's problems.
And that comes down to two simple things regarding supported features:
- does the feature improve visuals?
- does it improve performance?
-> and as a result: is the visual change worth the performance impact? Or does it improve both visuals and performance?
So you can take all the 'new' nVidia features which AMD did not implement (yet?) and see if you want those in games. Then you can do the same for the AMD features nVidia is missing.
But since DX12 is mainly about removing different kinds of bottlenecks (including scheduling?), the features which matter are ...
Yes, it seems that on Nvidia's side they did what was required to scrape in at 12_1, just for marketability's sake. Performance-wise it means nothing since you don't have asynchronous shaders, especially if the following is the norm:
That was my reply in response to someone showing the performance scaling of DirectX 12 on various CPUs. Regardless of anything else, DirectX 12 should be faster than DirectX 11, right? Not significantly slower at 1080p High quality? People buy one brand of GPU over another over a difference between 50.1 fps (DirectX 12) and 58.4 fps (DirectX 11). Sadly, those numbers are not the wrong way around!
I am not surprised by this. DX12 is new; they had to have the main GPU providers on board to get widespread support for the API.
Actual feature support will be decided when we have more games, not just one AMD-biased benchmark. Nvidia have it in writing on their website that Maxwell 2 has async compute; it may just work differently and need some sort of driver work to get it going. Only time will tell.
This will be the same as DX11: some games will run better on a particular brand, depending on how the game was made.
Classic stuff. Marketing hyperbole from both sides. It's really too early for DX12. It doesn't matter much, as we will have to wait at least a year to see some DX12 games out and new graphics cards from both companies. Then we can talk and compare, but right now it's really early.
When I bought my 980 a few months ago, the retailer told me my card had full DirectX 12 support. Surprise!
We've known that for a long time; it's even written in the Fury's specs... better late than never, thanks AMD lol
It has real DX12 support, but not full (you know, the new .0 .1 .2 crap introduced by M$, lol).
At least AMD is honest... they don't cheat their customers.
Thanks for sharing this.
From Hilbert's article:
"It's all marketing mud-fighting and attacking each other these days in-between Nvidia and AMD. Fun fact: on the AMD GPU Tech day for Fury, I myself literally confronted and asked about the DX12 supported feature levels to Hallock, and in this case Hallock himself absolutely refused to give a valid answer at the time as he very well knew that AMD would not fully support DX12 either."
Ofc it's a marketing thing/war.
From what I've read over the years, with nVidia it's all lies & lies all over. Now AMD is telling the truth about Mantle development and about DX12 support in their cards.
DX12 is like a baby, you know, still learning how to walk and how to work (at this point).
So how well does the Fury overclock, "overclocker's dream"?
I waited to buy a Fury X or a 980 Ti; I would say AMD were dishonest about its overclocking ability.
after the "Honest" word you forget "this time", i remember the argument that make me buy a HD7850... the worse bought i have done (btw still have it as no one want it... only 3 week of use).
you are naughty
I thought every GCN card has Tier 3 resource binding, and thus they are fully Tier 3 cards?
^ That table was discredited. It originally came from Wikipedia, but some idiot kept editing it to claim Pascal has no async shaders. It was eventually deleted from the wiki.
It's the same with every new version of DirectX...
Neither the current AMD nor Nvidia cards have full support for the new DirectX :/
We must wait for the next generation to get full support.