GCN compatibility/support for DirectX, OpenGL etc. I'm a little confused, maybe someone can help me. How do GCN cards (from the 7000 series up to the Rx 300 series) stand with regard to DirectX, OpenGL, OpenCL, Mantle and the Vulkan API? What I know and am almost sure of:
DirectX 12: GCN 1.0 at feature level 11_1, GCN 1.1/1.2 at feature level 12_0
Mantle: all GCN cards support Mantle
OpenGL: all support OpenGL 4.4 (not sure about 4.5)
OpenCL: GCN 1.0 supports OpenCL 1.2 (not sure about 2.0/2.1); GCN 1.1/1.2 support OpenCL 1.2 and 2.0/2.1 (beta drivers)
Vulkan: do all GCN cards support Vulkan??
Probably there are also differences between card generations because of driver support.
Forget about the feature levels of DX12. They are marketing speak. All AMD GCN cards, and all Nvidia cards from Fermi (GTX 4xx) and up, support DX12/Vulkan. That's that.
It depends on which DX12 resource binding tier they support. Even if it's marketing, you will lose features if the card is not Tier 3. Most GPUs support the first tier (TIER1). Maxwell v2 (GM206 and GM204) supports the second tier (TIER2). All GCN-based Radeons support the third tier (TIER3). I expect that all future hardware will support TIER3. Here's some info about this: https://intel.lanyonevents.com/sf14...96E33739827241E4DD51A76/SF14_GVCS005_101f.pdf
Just going to add which products support DX12:
AMD Radeon™ R9 Series graphics
AMD Radeon™ R7 Series graphics
AMD Radeon™ R5 240 graphics
AMD Radeon™ HD 8000 Series graphics for OEM systems (HD 8570 and up)
AMD Radeon™ HD 8000M Series graphics for notebooks
AMD Radeon™ HD 7000 Series graphics (HD 7730 and up)
AMD Radeon™ HD 7000M Series graphics for notebooks (HD 7730M and up)
AMD A4/A6/A8/A10-7000 Series APUs (codenamed “Kaveri”)
AMD A6/A8/A10 PRO-7000 Series APUs (codenamed “Kaveri”)
AMD E1/A4/A10 Micro-6000 Series APUs (codenamed “Mullins”)
AMD E1/E2/A4/A6/A8-6000 Series APUs (codenamed “Beema”)
Here's a nice little chart to clear things up a bit.
That chart is wrong for Kepler and Maxwell 1.0, please stop posting it... Moreover, there is no Tier 2 for typed UAV formats. I also wonder who published that image first :whip: That's related to resource binding, which impacts how applications manage resources, not their rendering capabilities. It is NOT marketing, it's a development thing. Consumers should not care about it. If your GPU supports DX12, it supports DX12, full stop. None of the individual rendering capabilities supported by only 1-3% of GPUs are going to be a requirement in the coming years.
It has been used in official and unofficial marketing for some time, that's what I meant. I completely agree with you: nobody will care, and there will only be outrage over any title that won't run on a DX12 GPU because of a stupid feature level. The only thing I can see is NVIDIA PubeHair trying to take advantage of that 12_1, so that others can't even enable the effects any more. One question: what do resource binding tiers do in practice? Are they transparent? Is a Tier 3 GPU automatically more efficient at some things? What kinds of things? Or do the features need to be specifically coded for?