DirectX 12

Discussion in 'Videocards - AMD Radeon Drivers Section' started by trocio2, Mar 7, 2014.

  1. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC


    It's not that wrong... Radeon 5xxx and 6xxx will start struggling soon, and it will be much worse for them in a year or two....

    So, it may be a wise decision to concentrate the efforts on newer hardware.

    I was shocked when the Radeon HD4200 from my laptop was moved to "legacy" support, but then I realized I had spent SIX MONTHS without updating its drivers just because I didn't need to.... I missed TWO driver updates in that time and never noticed.

    nVidia are bragging that ALL their DX11 hardware is DX12 capable, but they forget they don't even support DirectX 11.2 yet... Also, a GeForce 460 won't be able to play anything (except at minimum settings) in two years... Why bother with DX12 on old hardware?
     
    Last edited: Mar 22, 2014
  2. elpsychodiablo

    elpsychodiablo Master Guru

    Messages:
    349
    Likes Received:
    0
    GPU:
    Retina Z2 + Vlab Motion
    I'm sure there will be some exclusive DX12 features that only work with proper DX12 cards. For now they hand out candy features for old DX11 cards to keep the community happy, but after a while they'll put their balls out and show some kick-ass features you can only use with Windows 9 and native DX12 cards, and everyone upgrades.

    Don't forget we're talking about Microsoft here.
     
  3. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Props to Nvidia for supporting Fermi onward with DX12.
     
  4. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Without knowing how DX12 will perform, there is no way to know if legacy AMD hardware would be physically usable or not, nor do we know if something like a 460 will enable a playable experience.

    What people are missing is that DX12 pretty much explains why Nvidia does not support DX11.2; if what we are reading is correct, then it's clear 11.2 was never going to be used for PC gaming.
     

  5. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC
    That doesn't mean Fermi will perform acceptably with DX12 games... read my previous post.
     
  6. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    I see nothing factual in your post. Only Nvidia and Microsoft actually know how Fermi will run DX12. You seem to be trying to justify AMD only supporting GCN cards.
     
  7. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC
    It's pretty clear that three-year-old hardware isn't going to be able to run GPU-intensive games with mega-resolution textures. I just moved from a GeForce 560Ti that was already struggling to keep playable framerates at "medium" in Battlefield 4.

    And I'm not trying to advocate in favor of AMD here. But nVidia doesn't support DX11.2 and barely supports DX11.1 on their flagship cards, and that is a FACT (except for the next-gen, not-yet-launched cards and the 750/750Ti, which is entry-level). Whether game developers will even use DX11.2 or DX12 is still a mystery, but my card supports both. There are still several DX9 games being launched and very few that actually require DX11-capable hardware.

    Anything beyond this is fanboyism and I refuse to enter such matters.

    I'm just glad to know my recently bought VGA supports the latest tech. Even if ONE game eventually uses it. :stewpid:
     
  8. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC
    There's nothing factual because DX12 isn't even out yet... And we can't test it, can we?

    And yes, I actually agree with AMD supporting DX12 only on newer cards, and I don't see any problem with that. DX12 games are going to be out by Christmas 2015, and I don't even know if my current card will be able to deliver acceptable performance on high settings by then.


    edit:

    Sorry for the double post, but I had already posted the other one before you guys answered me... LOL
     
  9. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    The issue here is that you're comparing it with old versions of DX when we are nearly 2 years away from it being released.
    While you're claiming a 560Ti is not enough, Intel and Qualcomm are saying this is great news for lower-end hardware.

    You're also completely missing the glaringly obvious issue with DX11.2: there will be no DX11.2, and that is why Nvidia has no support for it.
     
  10. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC
    OK.

    You missed my point. Forget it.
     

  11. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    I did not miss your point; all I can see is that I just did not agree with it.

    I'm not trying to argue, but if you think I'm taking something the wrong way then you should correct me.
     
  12. thatguy91

    thatguy91 Guest

    Nvidia doesn't seem to be interested in any of the .x point releases of DirectX. Think back to DirectX 10.1, something Nvidia did not support for a very long time. DirectX 10 was pretty bad performance-wise; DirectX 10.1 rectified those issues. Developers were probably put off by DirectX 10 and stuck with 9.0.

    History lesson:
    The original Bioshock game had DirectX 10.1 support but was an Nvidia title. It actually performed better on AMD hardware, particularly with antialiasing enabled. DirectX 10.1 support was pulled after the reviews came out showing AMD on top, with the developer citing stability reasons for it being dropped. Of course, any DirectX 10.1 issues only affected AMD... and there were no issues! I can't help but feel that Nvidia not wanting to change their hardware for so long (I believe they even brought out a new GPU line while this was going on) actually disadvantaged PC gaming.
     
  13. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    That isn't a history lesson though, it's pure fiction.
    I have never heard of Bioshock ever having support for DX10.1.
     
    Last edited: Mar 22, 2014
  14. Valagard

    Valagard Guest

    I hope the next DX fails spectacularly and everyone moves on to OpenGL and Mantle.

    Then I won't need to buy a new OS every time MS decides to upgrade DX.
     
  15. Espionage724

    Espionage724 Guest

    I wish developers would just stop limiting their audience and making more work for themselves, and just use OpenGL... Or Mantle, if it actually gains cross-platform support.
     

  16. thatguy91

    thatguy91 Guest

    AMD not supporting non-GCN hardware for DirectX 12 isn't surprising; they have to be practical about it. Remember, DirectX 12 has a release date of December 2015, meaning realistically 2016. They have to look at how many people will still be using non-GCN cards in 2017 (you would want at least two years of support to make it worthwhile), especially considering it is a lot more work to support those cards.
     
  17. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Using Mantle or OpenGL is limiting their audience, and is definitely making more work for themselves.

    DX12 in 2016 would be the largest audience with the least amount of work.
     
  18. thatguy91

    thatguy91 Guest

    That's because I was actually thinking of Assassin's Creed :). Google it.
     
  19. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    I know, everyone knows about that, but you've been saying Bioshock for a while now.

    AC was so full of issues with DX10 that every game after it for several years was DX9 only. Maybe Nvidia should have worked to fix it, but it was causing stability issues so it had to be removed, and Ubisoft are rarely known for doing things the hard way.
     
  20. Espionage724

    Espionage724 Guest

    From my understanding of the matter, GCN is more flexible in terms of what kinds of rendering APIs and features it can support (you just need to write a driver to handle it); that's what I've heard, anyway.

    Non-GCN cards, I assume, lack certain hardware capabilities to do certain things, or it would be too performance-degrading to implement them.

    I'm sure someone would have a better explanation though :p

    How is OpenGL (cross-platform, with OGL 3.0 and 4.0 being pretty widely available on current and future hardware) limiting their audience in comparison to DX12, which needs a newer edition of Windows and a modern GPU (for partial support; a newer GPU for full support)?

    I can understand Mantle being a bit restrictive, but that's assuming neither Intel nor NVIDIA picks it up (which is likely the case). In any case, Mantle should probably just be a companion renderer API to something more popular (like OpenGL :p).

    Plus, the amount of work needed really only changes depending on what you use as the main renderer. If I make a game using DX11 and want to port it over to Linux, I now have to put an OpenGL renderer into my game and make it work well.

    If I make a game for Windows using OpenGL and want to port it to Linux, I have very little work to do (for the renderer API portion anyway; there are still other APIs to worry about).
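    To illustrate what I mean, here's a rough sketch of that kind of renderer abstraction (all type and function names here are hypothetical, not from any real engine): the game talks only to one interface, so a port just means adding or reusing a backend, not rewriting game code.

    ```cpp
    #include <iostream>
    #include <memory>
    #include <string>

    // Hypothetical renderer interface: game code calls only this,
    // never a graphics API directly.
    struct Renderer {
        virtual ~Renderer() = default;
        virtual std::string name() const = 0;
        virtual void drawFrame() = 0;
    };

    // Windows-only backend (Direct3D calls would live here).
    struct D3D11Renderer : Renderer {
        std::string name() const override { return "Direct3D 11"; }
        void drawFrame() override { /* d3d11 draw calls */ }
    };

    // Portable backend (OpenGL calls would live here).
    struct OpenGLRenderer : Renderer {
        std::string name() const override { return "OpenGL"; }
        void drawFrame() override { /* gl draw calls */ }
    };

    // If OpenGL is the main renderer, the Linux build reuses
    // this backend unchanged; only the D3D path is Windows-only.
    std::unique_ptr<Renderer> makeRenderer(bool useD3D) {
        if (useD3D) return std::make_unique<D3D11Renderer>();
        return std::make_unique<OpenGLRenderer>();
    }

    int main() {
        auto r = makeRenderer(false);  // the portable choice
        r->drawFrame();
        std::cout << r->name() << "\n";  // prints "OpenGL"
    }
    ```

    The point is just that the porting cost lives entirely inside the backend you have to add, which is why starting from the portable API is less work.
    
    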
     
    Last edited by a moderator: Mar 22, 2014
