AMD released list of compatible DirectX 12 cards

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 14, 2015.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,319
    Likes Received:
    8,854
    GPU:
    AMD | NVIDIA
  2. IceVip

    IceVip Master Guru

    Messages:
    751
    Likes Received:
    104
    GPU:
    RTX 3080
    Wait so all of them are "fully" compatible?
     
  3. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,098
    Likes Received:
    89
    GPU:
    RTX 2070 Super
Didn't we already know this? Unless they mean they're 'FULLY' compatible, as stated above, this isn't new information, is it? Thanks for the update anyway, Hilbert.
     
  4. spp85

    spp85 Member

    Messages:
    12
    Likes Received:
    0
    GPU:
    Radeon HD 7950 Vapor-X OC
    Yes finally.....

Finally, all Radeon HD 7000 series and above GPUs are DX12 confirmed :banana:. A free upgrade for all 3-year-old AMD GPUs
     

  5. Undying

    Undying Ancient Guru

    Messages:
    14,890
    Likes Received:
    4,002
    GPU:
    Aorus RX580 XTR 8GB
I just hope Tahiti owners don't end up like they did with VSR support (never got it).
     
  6. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,339
    Likes Received:
    2,684
    GPU:
    MSI 6800 "Vanilla"
I'd assume this is only "basic" compatibility, and that the earlier posted charts detailing the actual DirectX 12 feature levels are fairly accurate: the older Radeon GPUs support the standard DirectX 12 features, but only the R9 290/290X (and 285?) support the second D3D12 tier.
(And currently only the newest Nvidia GPUs support the third and final tier of features, although I don't remember if there was anything special about these upper tiers besides additional tiled-resources functionality.)
     
    Last edited: May 14, 2015
  7. geogan

    geogan Master Guru

    Messages:
    798
    Likes Received:
    148
    GPU:
    3070 AORUS Master
Will any future PC games actually be able to fully take advantage of it, though, considering they also have to release the same games on console? Is the console stuck on DX11, or are the consoles going to get an upgrade to DX12 too?
     
  8. MerolaC

    MerolaC Ancient Guru

    Messages:
    3,438
    Likes Received:
    300
    GPU:
    MSI 5600XT G. MX
    Joke's on you, because, we do.
     
  9. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,622
    Likes Received:
    620
    GPU:
    MSI 2070S X-Trio
    It doesn't say they are 'FULLY' compatible, it just says they are compatible.
     
    Last edited: May 14, 2015
  10. WareTernal

    WareTernal Master Guru

    Messages:
    248
    Likes Received:
    39
    GPU:
    PNY RTX 3070
I think this guy summed it up best ATM:
"currently no card has full DX12 support, and that DX12 support doesn't just go linearly "higher tier is better", but rather the different features have separate tier systems. That is, currently top Maxwell is tier 3 in "tiled resources" and tier 2 in "resource binding", while newest GCN cards are tier 3 in "resource binding" and tier 2 in "tiled resources"." - TunaFish2
Not here to plug another website, but beyond3d.com has some more (fairly technical) info about the DX12 tiers.
     

  11. waltc3

    waltc3 Maha Guru

    Messages:
    1,210
    Likes Received:
    388
    GPU:
    AMD 50th Ann 5700XT
    Most of the outstanding features of DX12 would seem to have to do with using current CPU & GPU hardware to a far greater degree than it has been used in the past--using multicore GPUs & CPUs much more efficiently than was possible through DX11/OpenGL 4.x. As such, having to have new hardware for DX12 is not going to be as critical for support as it was in the past, I'm guessing...however, I'm peripherally concerned with what happens to GPU/CPU clocks & temps when suddenly a program or game is flexing more circuitry per clock than has ever been true in the past. Should be interesting to see how much of this shakes out & how much is just PR...

I note that for the first time the HD 5000 series has fallen off the list and will be led out to legacy-driver pasture...This certainly makes sense as the thermal boundaries in the 5000 series were stressed to start with--I put my old 1GB HD 5770 in a "new" desktop build for the wife recently after her abominable laptop gave her one too many problems (she's privileged to get my hand-me-downs...;)) and I was surprised to see how much hotter it runs than my 2GB HD 7850...it *idles* at 75C and runs all day flat out @ 100C!...;) The little fan on the 5770 makes a noise like a midget buzz-saw when it hits full power...:D I had forgotten all of that (my HD 7850 clocked to 1GHz idles @ 28C and rarely runs hotter than 70C under full load, and most of the time I can't hear it)...! Both GPUs run with stock cooling, and ATi says that 100C under load is A-OK for that little 5770 GPU--can do that 24/7, according to ATi. Must be true, as I used the card flat out for a couple of years, IIRC, with nary a hiccup--and it is still going strong. If DX12 tried to pull even more out of that little GPU, I don't think it would last long...

    Edit: Looking at the list a little better I see that the HD 6000's didn't actually make the cut, either...Hmmmm....
     
    Last edited: May 14, 2015
  12. Cor9012

    Cor9012 Member

    Messages:
    15
    Likes Received:
    0
    GPU:
    AMD 7970 GHZ Edition
Are you sure? I have the 7970 and there's no such option at all on the 15.4 beta drivers.
     
  13. Bansaku

    Bansaku Member Guru

    Messages:
    159
    Likes Received:
    7
    GPU:
    Gigabyte RX Vega 64
    Yes! Can't wait to see how much more life my HD7950s have left in them.

    :banana:
     
  14. Aura89

    Aura89 Ancient Guru

    Messages:
    8,141
    Likes Received:
    1,266
    GPU:
    -
The Xbox One console has technically always been DX12. I believe they are getting a "DX12" update, but that's not really gonna do much for them, as most everything that's nice about DX12 is already included in the console.

Either way, I don't fully understand your question. The Xbox 360 was DX9 (though customized compared to our DX9, with some DX10 features as far as I know), and yet we still got DX10/11 games, so...?
     
  15. DmitryKo

    DmitryKo Master Guru

    Messages:
    370
    Likes Received:
    118
    GPU:
    ASUS RX 5700 XT TUF
    Doh. What do you mean "fully compatible" - does it mean every optional feature with all the maximum limits and tiers? Nothing is fully compatible then, and nothing ever was.

    There is no "second D3D12 tier" or any "D3D12 tier". There are separate resource binding tiers, resource heap tiers and tiled resource tiers, as well as other optional features like Rasterizer ordered views and Conservative rasterization. Some of these are required on particular feature levels 12_0 and 12_1.

Yes. Xbox One is getting Direct3D 12, and both GCN 1.1/1.2 cards and the Xbox One are feature level 12_0, as they support Tiled Resources Tier 2 and Resource Binding Tier 3, which is a superset of Binding Tier 2.
     

  16. DmitryKo

    DmitryKo Master Guru

    Messages:
    370
    Likes Received:
    118
    GPU:
    ASUS RX 5700 XT TUF
    No, it wasn't.

It's been Direct3D 11 since the start, with some elements like bundles and resource fences added later in a "monolithic Direct3D 11 runtime", which combines the AMD WDDM driver with the Direct3D 11 libraries into one module, eliminating unnecessary generic code paths.

Direct3D 12 is a complete overhaul of resource management, so it brings many more changes. For compatibility there will be an "11on12" layer which will emulate Direct3D 11 behaviors and interfaces but run on top of Direct3D 12.
     
  17. Aura89

    Aura89 Ancient Guru

    Messages:
    8,141
    Likes Received:
    1,266
    GPU:
    -
I misspoke by saying it's technically been DX12, but the latter half of my post is true: it's not going to do much for the Xbox One, as it has already had a low-level API, which is one of the biggest things DX12 brings to PC. Performance-wise it will do pretty much nothing for the Xbox One, and Microsoft has stated that themselves.

Will it add features for developers and make certain things easier for them? Sure, but consumers won't notice much.
     
  18. DmitryKo

    DmitryKo Master Guru

    Messages:
    370
    Likes Received:
    118
    GPU:
    ASUS RX 5700 XT TUF
It's already on Wikipedia with some fairly technical bits of info:
    http://en.wikipedia.org/wiki/Direct3D#Direct3D_12_levels

    Nah.

For AMD cards, there is no practical difference between feature levels 11_1 (GCN 1.0) and 12_0 (GCN 1.1/1.2). GCN 1.0 cards only lack Tier 2 Tiled Resources, but Tier 2 can be approximated on Tier 1 hardware with some additional shader code. Otherwise, the supported feature set is identical, and all GCN chips also support the highest Resource Binding Tier 3 - where there are no practical limits on the size of the most important descriptor tables (i.e. fully bindless resources) - and typed UAV loads for additional texture formats.


As for Nvidia, everything below 2nd-gen Maxwell is level 11_0, but that's not really too far from levels 11_1 and 12_0 either, since Kepler and 1st-gen Maxwell support partially bindless resources with Resource Binding Tier 2, as well as Tiled Resources Tier 1 (which, again, can approximate Tier 2 with additional shader code). They don't support typed UAV loads for additional texture formats, but they do support typed UAVs for the three basic formats defined in Direct3D 11.0.

    Level 12_1 has some interesting additions and in the future every card should support it, but for now it's only Maxwell 2nd gen (GeForce GTX 900 and GTX Titan X).


    Intel Haswell/Broadwell is feature level 11_1 but at the lowest Resource binding Tier 1, and Skylake is level 12_0.


    In other words: any existing feature level 11_0 card is just as good in Direct3D 12 as level 11_1 or 12_0 cards. Supporting any particular feature level does not guarantee any performance benefits.
     
  19. DmitryKo

    DmitryKo Master Guru

    Messages:
    370
    Likes Received:
    118
    GPU:
    ASUS RX 5700 XT TUF
It didn't. Again, it was the "monolithic Direct3D 11 runtime", a rewrite of the generic Direct3D 11 runtime around the AMD graphics driver code, which should be much more "close to metal" than the first-gen generic Xbox runtime built around standard WDDM interfaces and a generic WDDM graphics driver.
     
    Last edited: May 14, 2015
  20. Cyberdyne

    Cyberdyne Ancient Guru

    Messages:
    3,582
    Likes Received:
    298
    GPU:
    2080 Ti FTW3 Ultra
And it still won't have a low-level API; you chewed me out for saying that. It's a lowER-level API, ain't that right? lol
     