AMD confirms GCN, incl. XBO, doesn't support DX12 12_1

Discussion in 'Videocards - AMD Radeon Drivers Section' started by niczerus, Jun 4, 2015.

  1. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,558
    Likes Received:
    222
    GPU:
    RX 580 8GB
    FL 11.0 DX12 will give us the most benefits. The rest is just icing on the cake.
     
  2. WarDocsRevenge

    WarDocsRevenge Master Guru

    Messages:
    294
    Likes Received:
    0
    GPU:
    Fury-x Crossfire
    It seems AMD supports it all, while Nvidia has limited support
     
  3. thatguy91

    thatguy91 Ancient Guru

    Messages:
    6,643
    Likes Received:
    98
    GPU:
    XFX RX 480 RS 4 GB
    I think there is a lot of confusion about these tier levels, feature levels etc. From what it looks like, GCN hardware doesn't necessarily fit neatly into a specific feature level. It's as if GCN 1.0 is halfway between feature level 11_1 and 12_0. So, you say, well then it's not DirectX 12. That is untrue, since DirectX 12 includes feature levels 11_0, 11_1, 12_0, and 12_1.

    So, even if it is feature level 11_1:
    GCN is still DirectX 12!

    From what I understand of it, all DirectX 12 games will have to fully support all feature levels of DirectX 12 below the level the game is written for. So, even if a game is written for feature level 12_1, it has to fully support level 11_0. The only difference is that the extra features won't be made use of.
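    To illustrate that scaling-down idea, here's a rough sketch of plain Direct3D 12 device creation (not from any actual game; the CreateBestDevice name is just made up for the example). An engine asks for the highest feature level first and falls back:

    [code]
    // Hypothetical sketch: probe progressively lower feature levels until device
    // creation succeeds; the renderer then just skips effects above that level.
    #include <windows.h>
    #include <d3d12.h>
    #pragma comment(lib, "d3d12.lib")

    ID3D12Device* CreateBestDevice()
    {
        const D3D_FEATURE_LEVEL levels[] = {
            D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
            D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0
        };
        ID3D12Device* device = nullptr;
        for (D3D_FEATURE_LEVEL level : levels)
        {
            // nullptr = default adapter; since we go highest-first, the first success
            // is the highest minimum feature level the hardware satisfies.
            if (SUCCEEDED(D3D12CreateDevice(nullptr, level, IID_PPV_ARGS(&device))))
                return device;
        }
        return nullptr; // no Direct3D 12 capable adapter at all
    }
    [/code]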

    As long as the card supports at least feature level 11_1, it won't be too disadvantaged. I'm not sure what benefits the extra capabilities of feature levels 12_0 and 12_1 bring; they probably won't make a huge difference, if any, unless you are looking at more specialised applications.

    The only DirectX 12 cards that will probably be measurably disadvantaged are the GeForce 400-700 series. Of course, this is all guesswork until Windows 10 is released, the drivers are fully developed, and proper DirectX 12 games are released. It's probable that, at least for the next couple of years, DirectX 12 games won't fully utilise DirectX 12 features anyway. What this means is that there really will be no difference in capability between DirectX 12 cards of any feature level, at least for the foreseeable future.

    I just hope they aren't extra lazy and still release DirectX 11/11.1 games next year. This is especially true since practically anyone playing games will be running Windows 10, and even if the developers were too lazy to do DirectX 12 they could at least do DirectX 11.3 (the much overshadowed Windows 10 DirectX). Realistically, games next year should require Windows 10; there is really no valid reason for a gamer not to use it. Being stubborn isn't a valid reason! The whole workstation argument is invalid, since workplace computers aren't really meant to be used for playing Battlefield 5 etc.

    I thought I'd better edit this to point out that by Battlefield 5, I am of course referring to the 2016 timeframe, as suggested by the point being made :). So yes, it's not out yet, because it's not 2016!
     
  4. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,558
    Likes Received:
    222
    GPU:
    RX 580 8GB
    It would be much more beneficial to support the entire range of GPUs that are FL 11.0+ than to focus on the comparatively small number with the special features.
     

  5. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,523
    Likes Received:
    522
    GPU:
    Inno3D RTX 3090
    That's getting hanged by an overpressed driver team; the other one is very serious, and they have been adamant from the start that GCN gets "full" support, as they call it.

    It's only logical, they basically designed the API.

    In layman's terms:

    Feature Level: What can I do.

    Resource Binding Tier: How can I do it.

    Maxwell 2.0 is a bit better on the first part, and GCN on the second part. That's it. There will be nothing exclusive to either, since in the end DX12 is one spec. There is no DX12.1, that's just developer lingo.
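    Roughly, in code terms (a sketch of the plain D3D12 API, assuming a device already exists; nothing vendor specific, the function name is just for illustration):

    [code]
    // The two questions are answered by two separate CheckFeatureSupport queries.
    #include <windows.h>
    #include <d3d12.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    void PrintLevelAndBindingTier(ID3D12Device* device)
    {
        // "What can I do" - the highest feature level the adapter supports.
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
            D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
        };
        D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
        levels.NumFeatureLevels        = _countof(requested);
        levels.pFeatureLevelsRequested = requested;
        device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels));

        // "How can I do it" - the resource binding tier is reported separately.
        D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options));

        printf("Max feature level:     0x%x\n", (unsigned)levels.MaxSupportedFeatureLevel);
        printf("Resource binding tier: %d\n", (int)options.ResourceBindingTier);
    }
    [/code]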
     
  6. juzz

    juzz Member

    Messages:
    37
    Likes Received:
    1
    GPU:
    MSI 1080TI Sea Hawk EK X
    So TL;DR:

    12_1 won't matter much until "next-next-gen" consoles, because all the PC sees are console ports :banana:
     
  7. DmitryKo

    DmitryKo Master Guru

    Messages:
    370
    Likes Received:
    118
    GPU:
    ASUS RX 5700 XT TUF
    Again, basic Direct3D 12 features - those relating to resource management and the fully/partially bindless resource model, reduction of CPU driver overhead and ExecuteIndirect, and GPU/CPU synchronization with multiple engines and explicit multi-adapter - will directly apply to any compatible card from AMD, Nvidia and Intel, regardless of supported feature level.
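    For the multi-engine part, this is roughly what it looks like with the plain API (a hedged sketch, not from any particular engine; the function name is invented): a second queue for compute work, and an ID3D12Fence so the graphics queue waits on it on the GPU side.

    [code]
    #include <windows.h>
    #include <d3d12.h>
    #pragma comment(lib, "d3d12.lib")

    void RunAsyncCompute(ID3D12Device* device, ID3D12CommandQueue* graphicsQueue,
                         ID3D12CommandList* const* computeLists, UINT numLists)
    {
        // A dedicated compute queue runs independently of the graphics queue.
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
        ID3D12CommandQueue* computeQueue = nullptr;
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

        ID3D12Fence* fence = nullptr;
        device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

        // Submit the compute work and signal the fence when the GPU finishes it.
        computeQueue->ExecuteCommandLists(numLists, computeLists);
        computeQueue->Signal(fence, 1);

        // The graphics queue waits on the GPU (not the CPU) for the compute results.
        graphicsQueue->Wait(fence, 1);
        // ... graphics command lists that consume those results are submitted here ...
    }
    [/code]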

    All of these cards also basically conform to feature level 11_1 for the most part, though for older Nvidia cards it's technically level 11_0 with a few optional features from 11_1 - but these cards make up ~70% of Direct3D 12 cards.

    So anything related to rendering quality mostly stays the same - but it will be possible to draw things faster and/or use more objects and resources.


    So, to recap. For AMD GCN cards, the difference between levels 12_0 and 11_1 is minimal, since all of them support typed UAV loads and the highest resource binding tier; the only difference is whether they support tiled resources tier 1 or tier 2.

    For Nvidia, the difference between 11_0 and 11_1 is minimal as well, since 11_0 cards support all the important features from 11_1 - except the increased UAV slot count and maybe (???) UAVs at every stage.

    Intel CPU graphics conforms to levels 11_1 and 12_0 (???) - but it additionally supports ROVs, a feature required for level 12_1.


    So again, the difference between 11_0, 11_1 and 12_0 is minimal from a practical point of view, and game developers should be able to easily scale up or down to fit the capabilities of particular hardware.

    Yes, ROVs and conservative rasterization in level 12_1 will be important in the future, but currently they are supported by maybe 5% of Direct3D 12 cards, so they won't make any significant impact until maybe 2-3 years from now.
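    If anyone wants to see where their own card lands, all of these differences are visible as optional caps on the device (a rough sketch using the standard D3D12 options query, nothing more):

    [code]
    #include <windows.h>
    #include <d3d12.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    void PrintLevelSeparatingCaps(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

        // The 12_0 side of the story: typed UAV loads, tiled resources, binding tier.
        printf("Typed UAV loads:          %d\n", (int)opts.TypedUAVLoadAdditionalFormats);
        printf("Tiled resources tier:     %d\n", (int)opts.TiledResourcesTier);
        printf("Resource binding tier:    %d\n", (int)opts.ResourceBindingTier);

        // The 12_1 side: ROVs and conservative rasterization.
        printf("ROVs supported:           %d\n", (int)opts.ROVsSupported);
        printf("Conservative raster tier: %d\n", (int)opts.ConservativeRasterizationTier);
    }
    [/code]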

    I have updated the article above with a feature matrix table - hopefully in due time this should clear any confusion relating to tiers and levels.

    No, it doesn't - because there are also optional features indicated by capability bits (caps), which made a controversial return in Direct3D 11.1 "thanks" to NVidia not bothering to implement full level 11_1 support.

    Until then, in Direct3D 10.1 and 11.0, new hardware features were exposed strictly through feature levels only.

    But since Direct3D 11.1, all new rendering features in Direct3D 11.2 and 11.3 have been exposed as optional, using capability bits that apply on top of the mandatory features defined by levels 11_0 and 11_1 (and some of them are available for levels 10_0 and 10_1).
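    On the Direct3D 11 side the same caps model looks roughly like this (a sketch assuming a device that already exists and a Windows 10 SDK with the 11.3 headers; the function name is mine):

    [code]
    #include <windows.h>
    #include <d3d11_3.h>
    #include <cstdio>
    #pragma comment(lib, "d3d11.lib")

    void PrintD3D11OptionalCaps(ID3D11Device* device)
    {
        // Direct3D 11.2 additions (e.g. tiled resources) are reported in OPTIONS1...
        D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts1 = {};
        device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1, &opts1, sizeof(opts1));
        printf("Tiled resources tier (11.2 cap):     %d\n", (int)opts1.TiledResourcesTier);

        // ...and the 11.3 additions (ROVs, conservative raster, typed UAV loads) in OPTIONS2,
        // regardless of whether the device was created at level 11_0 or 11_1.
        D3D11_FEATURE_DATA_D3D11_OPTIONS2 opts2 = {};
        device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS2, &opts2, sizeof(opts2));
        printf("ROVs supported (11.3 cap):           %d\n", (int)opts2.ROVsSupported);
        printf("Conservative raster tier (11.3 cap): %d\n", (int)opts2.ConservativeRasterizationTier);
        printf("Typed UAV loads (11.3 cap):          %d\n", (int)opts2.TypedUAVLoadAdditionalFormats);
    }
    [/code]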


    Hopefully this should help:
    en.wikipedia.org/wiki/Direct3D#Feature_levels

    It's rather 95% of level 12_0.

    No, applications are not technically required to support lower feature levels - however, any game developer not supporting NVidia level 11_0 video cards could be deemed legally insane, since these cards constitute the vast majority of Direct3D 12 compatible discrete GPUs.

    Fermi/Kepler/Maxwell-1 make up ~70% of all Direct3D 12 compatible graphics cards - so it is not practically possible for them to be at any disadvantage.

    The share of Intel Haswell/Broadwell CPU graphics is 8%, AMD GCN is 17%, and Maxwell-2 is 5%, as of the February 2015 Steam hardware survey.

    Doh. It did work for me on build 10074 with driver 1023, and it doesn't work on build 10130 again :(


    PS. BTW, the Anandtech forum is suffering a fanboy attack on AMD for not being "full DX 12.1"... suddenly this thread looks almost like a technical meeting of software engineers!
     
    Last edited: Jun 18, 2015
  8. AxelL

    AxelL Member

    Messages:
    48
    Likes Received:
    0
    GPU:
    ASUS STRIX GTX1080
    This is just the current number, not counting those who are waiting to buy a DX12 card until it actually has some use. I can see it changing significantly by then. It would still be a great number though.

    On another note, I'm really disappointed in AMD that my card apparently can't be accessed at a low level to get more performance out of it, because these are the cards that need it the most, and that was the whole purpose of the thing... :bang: (consoles with ****ty hardware could outperform much more powerful PC hardware because of the low level access...)
    I already have strange results in Witcher 3, where only 3 settings have any effect on performance (including HairWorks, and only one with more than 10%), and all the others I can set to low or ultra with the same results, which suggests either a badly optimized game or a bad driver.
    I already saw from an expert that DX currently has about 20-30% effectiveness to it, so it is obvious that drivers have a nice margin to serve marketing purposes. I can understand that they have to sell new cards every year, but it should happen because they make better and better products by that margin, not by designing APIs that old cards can't even use. I refuse to believe that my card couldn't benefit from low level access as much as a far less complex Intel integrated graphics core. I would much rather prefer if they would come out and say "we could, but we just won't give a ... damn" :)

    Does anybody remember this low-level programmed demo of the 4800!!! series from 7 years ago? https://www.youtube.com/watch?v=BzquM5Td6bM Just think about it...
     
    Last edited: Jun 6, 2015
  9. Jackalito

    Jackalito Master Guru

    Messages:
    575
    Likes Received:
    96
    GPU:
    Radeon RX 5700 XT
    But it works again with leaked build 10134. See my previous comment in this thread ;)
     
  10. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,402
    Likes Received:
    792
    GPU:
    6800XT Nitro+ SE
    Saying Tiled Resources is nothing to worry about NOT having support for... Are you crazy!?

    Tiled Resources is massive, the ability to store and reuse materials in game WITHOUT it taking up VRAM or having to be rendered again.... That is massive!!

    Not only are you not taking up GPU clock cycles or shader time rendering the same object, you're also saving memory space.

    Imagine Skyrim, Witcher 3 with their vast amount of identical trees or rocks. Or GTA V with the vast amount of people and cars on the road...

    Saying this feature is easy to get around because the GPU doesn't support it is silly. This feature is massive in terms of performance, probably even more so than low level access.
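    For reference, this is roughly what using it looks like with the plain D3D12 API (a hedged sketch, not from any real engine; sizes and formats are arbitrary). A reserved ("tiled") texture only reserves address space, and individual 64 KB tiles get mapped to heap memory on demand, which is where the VRAM saving comes from:

    [code]
    #include <windows.h>
    #include <d3d12.h>
    #pragma comment(lib, "d3d12.lib")

    void MapOneTile(ID3D12Device* device, ID3D12CommandQueue* queue)
    {
        // Describe a large texture, but don't commit any memory for it yet.
        D3D12_RESOURCE_DESC desc = {};
        desc.Dimension        = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
        desc.Width            = 16384;
        desc.Height           = 16384;
        desc.DepthOrArraySize = 1;
        desc.MipLevels        = 1;
        desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count = 1;
        desc.Layout           = D3D12_TEXTURE_LAYOUT_64KB_UNDEFINED_SWIZZLE; // required for reserved resources

        ID3D12Resource* texture = nullptr;
        device->CreateReservedResource(&desc, D3D12_RESOURCE_STATE_COMMON, nullptr,
                                       IID_PPV_ARGS(&texture));

        // A small heap provides the physical backing for the tiles we actually use.
        D3D12_HEAP_DESC heapDesc = {};
        heapDesc.SizeInBytes     = 64 * 1024; // one 64 KB tile
        heapDesc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
        heapDesc.Flags           = D3D12_HEAP_FLAG_DENY_BUFFERS | D3D12_HEAP_FLAG_DENY_RT_DS_TEXTURES;
        ID3D12Heap* heap = nullptr;
        device->CreateHeap(&heapDesc, IID_PPV_ARGS(&heap));

        // Map a single tile of the texture onto that heap; unmapped tiles cost no VRAM.
        D3D12_TILED_RESOURCE_COORDINATE coord = {}; // tile (0,0,0) of subresource 0
        D3D12_TILE_REGION_SIZE region = {};
        region.NumTiles = 1;
        D3D12_TILE_RANGE_FLAGS rangeFlags = D3D12_TILE_RANGE_FLAG_NONE;
        UINT heapOffset = 0, tileCount = 1;
        queue->UpdateTileMappings(texture, 1, &coord, &region,
                                  heap, 1, &rangeFlags, &heapOffset, &tileCount,
                                  D3D12_TILE_MAPPING_FLAG_NONE);
    }
    [/code]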
     

  11. b2rdark

    b2rdark Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    7870 XT
    wut? gcn does support tiled resources...
     
  12. WarDocsRevenge

    WarDocsRevenge Master Guru

    Messages:
    294
    Likes Received:
    0
    GPU:
    Fury-x Crossfire
    DX12 is built on DX11, and GCN supports DX11.2, which does have tiled resources, so yes, it does support it.
     
  13. dezmand07

    dezmand07 Member

    Messages:
    47
    Likes Received:
    1
    GPU:
    Intel IrisProGraphics 580
    So what about Tahiti XT2 (HD 7970) and Tahiti XTL (the true R9 280X)?
    And a question: how do you distinguish them? What software/program can do it (there are no markings on the box)?
     
  14. WarDocsRevenge

    WarDocsRevenge Master Guru

    Messages:
    294
    Likes Received:
    0
    GPU:
    Fury-x Crossfire
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,523
    Likes Received:
    522
    GPU:
    Inno3D RTX 3090
    Most computers that play games have Intel integrated graphics; that's what most people tend to forget. ANY card that officially supports DX12 will be just fine. The PC already has market segmentation due to performance differences; if a developer wished to segment it even more they could do something stupid like that, but on the other hand, if the renderer is DX12, IT WOULD STILL RUN on all DX12 hardware.

    All cards get the same kind of access from all APIs. The NVIDIA cards don't do anything magical; they just have a better CPU scheduler for DX11 at the moment.

    The game is actually very well optimized (apart from the whole Hairworks thing). Most settings have minimal impact since the assets of the game were made for one spec really (the console spec). Even the "minimal" impact is quite large on higher resolutions.

    I don't know where that thought came from, but DX12 is shaping up to be one of the things that is going to be really good for AMD hardware. I wouldn't be so happy with a 500/600 series GeForce, for example.

    GCN has had Tiled Resources Support since the mythical time of the Carmacks. Please stop spreading FUD.

    They are exactly the same, just different binnings of the same chip. There is no difference. If you have a GCN card, you have DX12 and that's that.
     

  16. mtrai

    mtrai Maha Guru

    Messages:
    1,175
    Likes Received:
    369
    GPU:
    PowerColor RD Vega
    Last edited: Jun 6, 2015
  17. dezmand07

    dezmand07 Member

    Messages:
    47
    Likes Received:
    1
    GPU:
    Intel IrisProGraphics 580
    thanks, I hope this is true :)
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,523
    Likes Received:
    522
    GPU:
    Inno3D RTX 3090
    There is no "hope". NVIDIA has had a compute architecture since Fermi, AMD since GCN 1.0, and Intel since Haswell. Now everybody who has one of those architectures is going to be able to use it with the new low-level APIs. That's it. Why is it so hard to understand?
     
  19. dezmand07

    dezmand07 Member

    Messages:
    47
    Likes Received:
    1
    GPU:
    Intel IrisProGraphics 580
    I mean full DirectX 12 support (D3D_FEATURE_LEVEL_12_0 and D3D_FEATURE_LEVEL_12_1), rather than the D3D_FEATURE_LEVEL_11_0 and D3D_FEATURE_LEVEL_11_1 set from DirectX 11 - 11.3.
     
  20. DmitryKo

    DmitryKo Master Guru

    Messages:
    370
    Likes Received:
    118
    GPU:
    ASUS RX 5700 XT TUF
    :bang:

    [post]5090202[/post]

    Unfortunately I can only install official in-place updates, since there is not enough free space on my SSD to run the upgrade from the ISO image...
     
