AMD confirms GCN, incl. XBO, doesn't support DX12 feature level 12_1

Discussion in 'Videocards - AMD Radeon Drivers Section' started by niczerus, Jun 4, 2015.

  1. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Which version of Windows 10?
     
  2. Doom112

    Doom112 Guest

    Messages:
    204
    Likes Received:
    1
    GPU:
    MSI GTX 980 TF V
    Please don't post any BS, because this is fake. Both AMD and MS said this is fake, so please next time post something based on facts, not scams or fanboyism.
     
  3. xacid0

    xacid0 Guest

    Messages:
    443
    Likes Received:
    3
    GPU:
    Zotac GTX980Ti AMP! Omega
    Fake

    [IMG]
     
  4. thatguy91

    thatguy91 Guest

    It's the feature level that's confusing people. People see 11_1 and think 'oh darn, it doesn't support DirectX 12'. Feature level 11_1 is DirectX 12, as is 11_0. Performance for the typical game on a typical card is supposedly much the same between feature levels 12_1 and 11_1, with only 11_0 suffering a little. This isn't to say the higher feature levels aren't better, just that they provide scope for improvement in the future that we're unlikely to see for a while.
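    A minimal sketch of that point, assuming the Windows 10 SDK headers (d3d12.h, dxgi1_4.h) are installed: feature level 11_0 hardware is enough to create a Direct3D 12 device, so every GCN card gets a D3D12 device even without 12_1.

    Code:
    #include <windows.h>
    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));

        ComPtr<IDXGIAdapter1> adapter;
        factory->EnumAdapters1(0, &adapter);   // first (default) adapter

        // Feature level 11_0 is the minimum D3D12 requires; 12_1 is not needed.
        ComPtr<ID3D12Device> device;
        HRESULT hr = D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                       IID_PPV_ARGS(&device));
        std::printf(SUCCEEDED(hr) ? "Direct3D 12 device created at FL 11_0\n"
                                  : "No Direct3D 12 support on this adapter\n");
        return 0;
    }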
     

  5. db87

    db87 Member

    Messages:
    28
    Likes Received:
    3
    GPU:
    Gainward GTX 1070 @ 2025
  6. moab600

    moab600 Ancient Guru

    Messages:
    6,660
    Likes Received:
    557
    GPU:
    PNY 4090 XLR8 24GB
    Another shady Nvidia move? Seems they're toying with Kepler users now, "buying" DX12. Whatever, we'll see the real winner in games, not in rumors or pointless benchmarks.
     
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    It is not fake; this is taken directly from the MS presentation. It shows binding tiers, and it is correct.
    You seem to be confusing binding tiers with feature levels. The information he posted is actually accurate; please at least read the thread before calling people out.
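    For what it's worth, the two are reported through separate caps in the API. A rough sketch, assuming a valid ID3D12Device* named device:

    Code:
    // The feature level and the resource binding tier are independent caps.
    D3D_FEATURE_LEVEL levels[] = { D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
                                   D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
    D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
    fl.NumFeatureLevels        = _countof(levels);
    fl.pFeatureLevelsRequested = levels;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &fl, sizeof(fl));
    // fl.MaxSupportedFeatureLevel is the feature level (e.g. 12_0 on GCN 1.1/1.2).
    // The binding tier lives in a different struct entirely:
    // D3D12_FEATURE_DATA_D3D12_OPTIONS::ResourceBindingTier
    // (see the sketch a few posts further down).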
     
  8. pesok

    pesok Guest

    Messages:
    8
    Likes Received:
    0
    GPU:
    295x2-34" 21:9 3.5k IPS
    Sorry if this was answered already, and I apologize in advance for possibly not using the correct terminology. I am a gamer, and while I am also a software developer (since '99), I never did anything even remotely related to graphics or gaming. One of the most exciting features of DX12 (or D3D12 rather) for me personally is video memory pooling, which will affect all owners of multi-GPU cards (or CrossFire/SLI setups). I currently have an R9 295X2 (i.e. two CrossFired R9 290Xs), each GPU with 4 GB of VRAM. DX11 is only able to utilize a total of 4 GB, since it does alternate frame rendering (each GPU is responsible for drawing one full frame) and therefore has to keep the same exact copy of everything in each 4 GB buffer, effectively making 4 GB + 4 GB = 4 GB in D3D11. From what I understood, D3D12 will be able to use each GPU to draw only half of each frame, and therefore will be able to keep a different set of textures/whatever in each 4 GB, which makes me believe that in D3D12, 4 GB + 4 GB will equal 8 GB. Is that technology included in whatever D3D12 tier my R9 295X2 is compatible with, or (and I hope that's not the case, since I wasn't planning on upgrading my graphics card anytime soon) would I need a card that supports a higher DX12 tier?
     
  9. xacid0

    xacid0 Guest

    Messages:
    443
    Likes Received:
    3
    GPU:
    Zotac GTX980Ti AMP! Omega
    Please educate me. Any link to the presentation/video?

    So the table is all about binding tiers?
    I thought this was the table for binding tiers:
    [IMG]

    I want to point out Conservative Rasterization and Rasterizer Ordered Views (ROV), which show YES in the table, even though none of the GCN 1.1/1.2 cards support feature level 12_1 because they don't support either of those features?

    [IMG]

    And this is from Wiki courtesy of DmitryKo.
    [IMG]
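    If anyone wants to check their own card, those caps can be read directly from the runtime. A minimal sketch, assuming a valid ID3D12Device* named device:

    Code:
    // Conservative rasterization, ROVs and the binding tier are all fields
    // of the same options struct. Sketch only - 'device' is assumed valid.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts))))
    {
        // GCN 1.1/1.2 report NOT_SUPPORTED / FALSE here, which is why they
        // stop at feature level 12_0 despite having binding tier 3.
        D3D12_CONSERVATIVE_RASTERIZATION_TIER cr = opts.ConservativeRasterizationTier;
        BOOL rovs = opts.ROVsSupported;
        D3D12_RESOURCE_BINDING_TIER binding = opts.ResourceBindingTier;
    }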
     
  10. DmitryKo

    DmitryKo Master Guru

    Messages:
    447
    Likes Received:
    159
    GPU:
    ASRock RX 7800 XT
    As I said [post=5079171]in another thread[/post], that picture does not come from Microsoft or any other vendor. It was posted on a Brazilian forum, and its content copies a table posted on a Hungarian forum, which is probably based on this thread on the Anandtech forums. It's not up to date with the most current Windows SDK 10.0.10069, and most of the information provided in this table is not correct.


    On the other hand, everything I posted on Wikipedia is directly based on presentations made by Microsoft and hardware vendors, with all citations referenced, as well as on data collected with my feature reporting tool running on the latest Windows 10 builds and WDDM 2.0 drivers.

    Wikipedia - Direct3D 12 feature levels
     
    Last edited: Jun 7, 2015

  11. DmitryKo

    DmitryKo Master Guru

    Messages:
    447
    Likes Received:
    159
    GPU:
    ASRock RX 7800 XT
    It's a separate baseline capability called explicit multi-adapter, which does not require any particular feature level or tier.

    https://msdn.microsoft.com/en-us/library/windows/desktop/dn933254(v=vs.85).aspx

    If you have two identical GPUs, they will be linked together as if they were one big adapter, but the developer will also have direct access to each adapter's memory and command queues, and will be able to share resources and graphics commands across multiple adapters. The implementation details will be handled transparently by the Direct3D 12 runtime and the video card driver.

    Unfortunately, current WDDM 2.0 drivers don't seem to support this yet; they only support the implicit, vendor-specific model from Direct3D 11, where everything is managed automatically by the driver and the developer has no explicit control.


    Developers will also be able to use multiple adapters from different vendors, but they will have to perform synchronization and load balancing themselves, which is very tedious to implement properly, though the Direct3D 12 API now offers some basic features to alleviate this. Epic Games recently showed an Unreal Engine demo which used Intel graphics to offload some tasks.

    http://blogs.msdn.com/b/directx/arc...rmant-silicon-and-making-it-work-for-you.aspx
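    To make the linked-adapter idea a bit more concrete, here is a rough sketch of how the API exposes it (illustrative only, given the driver situation above): each physical GPU becomes a "node", and node masks let a resource live in just one GPU's memory instead of being mirrored as under D3D11 AFR. It assumes a valid ID3D12Device* named device created on a linked (CrossFire/SLI-style) adapter.

    Code:
    // Each linked GPU is a node; GetNodeCount() would return 2 on a 295X2.
    UINT nodes = device->GetNodeCount();

    // Place a default-heap buffer only in node 1's local memory.
    D3D12_HEAP_PROPERTIES props = {};
    props.Type             = D3D12_HEAP_TYPE_DEFAULT;
    props.CreationNodeMask = 0x2;   // one bit per node: this is node 1
    props.VisibleNodeMask  = 0x2;   // only node 1 needs to see it

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = 64 * 1024;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.Format           = DXGI_FORMAT_UNKNOWN;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    Microsoft::WRL::ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&props, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&buffer));
    // The other node can hold a completely different set of resources,
    // which is how 4 GB + 4 GB can behave more like 8 GB.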



    So "nVidia "is buying" Directx12 features level to put in a bad light AMD GCN "... Oh really - nVidia "bought" a feature level 12_1 that exposes capabilities of their latest hardware? All to put AMD "in a bad light". Well that's shady.

    Wait, AMD and Intel are "buying" feature level 12_0 too! AMD (ATI) also "bought" feature levels 11_1 and 10_1, and so did Intel.
    Nvidia "bought" SM 2.0a and PS 1.0-1.1, ATI "bought" SM 3.0, SM 2.0b and PS 1.2-1.4. And before that, Nvidia also "bought" hardware T&L and one-pass multitexture.
    I wonder how much money Microsoft really made by "selling" all these features to hardware vendors?
     
    Last edited: Jun 11, 2015
  12. thatguy91

    thatguy91 Guest

    The load balancing across different adapters is something I believe AMD are, or at least were, working on. However, their implementation is at the driver level, so it is independent of what game developers do. It also aims to unify APUs with discrete cards.

    If they do manage to accomplish this, it may be a more attractive proposition for gamers in the future to go for an AMD Zen APU plus an AMD discrete card, which is what I believe they are aiming for. Basically, you instantly, effectively, get a graphics upgrade. Of course, the amount of benefit remains to be seen, but it could be fairly significant.
     
  13. Dygaza

    Dygaza Guest

    Messages:
    536
    Likes Received:
    0
    GPU:
    Vega 64 Liquid
    Are these feature level supports final, or can we expect some changes with new driver versions? Not that it really matters, as the most critical thing is to get DX12 rolling altogether.
     
  14. Wagnard

    Wagnard Ancient Guru

    Messages:
    2,746
    Likes Received:
    519
    GPU:
    MSI Geforce GTX 1080
    I'll repeat myself here.
    This is wrong; it was confirmed on the GeForce forums by ManuelG himself.

    Source: https://forums.geforce.com/default/...he-directx-12-features-/post/4497236/#4497236

    Also, here is another interesting article: http://www.bitsandchips.it/52-engli...out-tier-and-feature-levels-of-the-directx-12
     
    Last edited: Jun 8, 2015
  15. WarDocsRevenge

    WarDocsRevenge Guest

    Messages:
    295
    Likes Received:
    0
    GPU:
    Fury-x Crossfire

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    What he says is in essence correct. There will be no difference in any actual title. It is like DX11: you either support it or you don't. Nobody from either camp will be seeing different things than the others.
    For the rest, please just read DmitryKo's posts. Most of the rest posted here (including some of mine) was just bull****.
     
  17. sajibjoarder

    sajibjoarder Master Guru

    Messages:
    607
    Likes Received:
    1
    GPU:
    gtx 1080 FE
    I don't think any AMD GPU will support Conservative Rasterization Tier 1 or Rasterizer Ordered Views, not even the Radeon Fury/Fiji or the 390X or whatever it is. But this reminds me of a story: long ago I had a Radeon X800 GTO 256 MB, which was a PS 2.0 (DX 9.0b) compatible card, and I had to buy a new card when NFS Carbon came out, within 8 months. Which is where I am now: I have to replace my 290X CrossFire system next year (which I would have had to do even if there were no new DX12, but I expected this setup to last longer, because I used GTX 480 SLI for a long time, almost 3 years). Even an HD 5770 served a year.


    I installed Windows 10 twice and couldn't use the DX12 feature test. Some people said it's because CrossFire isn't supported yet; I don't know what's wrong. I'm waiting for the final release of Windows 10 and drivers from AMD that support WDDM 2.0.
     
  18. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    I don't get the reason why so many are bitching.

    When you originally bought the GCN 1.0 lineup, there weren't any signs of DX12 at all. Yet your hardware is able to support it. Maybe not all of its features, but to an extent where you will see the benefits of DirectX 12. In the past we saw AMD support DirectX 10.1 and 11.1 while Nvidia cards supported just DirectX 10 and 11, but that didn't ruin Nvidia's compatibility or performance in the titles that shipped. And I agree with Roy that 12_0 versus 12_1 isn't gonna make a difference.
     
  19. Dygaza

    Dygaza Guest

    Messages:
    536
    Likes Received:
    0
    GPU:
    Vega 64 Liquid
    I'm not worried about the performance of my card (I'm gonna upgrade soon anyway). It's more that I would like to see all architectures as high in the tiering as possible, so devs could freely use as many features as possible. But even with the current feature levels they are looking rather good.
     
  20. Valerys

    Valerys Master Guru

    Messages:
    395
    Likes Received:
    18
    GPU:
    Gigabyte RTX2080S
    I am curious what exactly the stencil reference in the pixel shader does, since only AMD supports it. Is it some kind of optimization technique?
    It is not referenced anywhere in the Direct3D standards. Or is it the same as the "specified stencil reference" cap? In previous tables it was shown that Nvidia supports that too.
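    As far as I can tell, it's the shader-specified stencil reference value: the pixel shader writes SV_StencilRef per pixel, instead of the whole draw using the single value set with OMSetStencilRef(). The cap bit for it sits in the options struct; a sketch, assuming a valid ID3D12Device* named device:

    Code:
    // Shader-specified stencil reference: the pixel shader outputs
    // SV_StencilRef itself rather than using the one OMSetStencilRef() value.
    // Sketch only - assumes 'device' is a valid ID3D12Device*.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
    if (opts.PSSpecifiedStencilRefSupported)
    {
        // HLSL side, for reference:
        //   float4 main(PSInput i, out uint sref : SV_StencilRef) : SV_Target
        // Handy for e.g. writing per-pixel material/object IDs into the
        // stencil buffer in a single pass.
    }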
     
    Last edited: Jun 8, 2015
