WDDM 2.2 in Creators update and drivers

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Carfax, Mar 15, 2017.

  1. Carfax

    Carfax Ancient Guru

    Messages:
    3,972
    Likes Received:
    1,462
    GPU:
    Zotac 4090 Extreme
    Nope, everything's running smoothly here. I was on the fast release upgrade as well, but I did a clean install on Thursday since the RTM is already out.

    Try using DDU to do a clean uninstall of the drivers, then reinstall them.
     
  2. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    HDR10 and/or Rec.2020 can be supported on older Windows versions through proprietary driver extensions; Win10 just offers a standardized way of doing it.
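
    For reference, the standardized Win10 path is plain DXGI: pick the ST.2084/Rec.2020 color space on the swap chain and optionally attach HDR10 metadata. A minimal sketch, assuming a flip-model swap chain that exposes IDXGISwapChain4 (the function name and the metadata numbers are just illustrative placeholders, not what any particular game uses; see the DXGI docs for the exact units of each metadata field):

    Code:
    // Sketch: signalling HDR10 the standardized Win10 way via DXGI.
    // Assumes 'swapChain' is an existing flip-model swap chain with an
    // FP16 or R10G10B10A2 back buffer; metadata values are placeholders.
    #include <dxgi1_5.h>
    
    void EnableHdr10(IDXGISwapChain4* swapChain)
    {
        // Rec.2020 primaries + ST.2084 (PQ) transfer function = HDR10 output.
        swapChain->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020);
    
        // Static HDR10 metadata (mastering display and content light levels).
        DXGI_HDR_METADATA_HDR10 hdr10 = {};
        hdr10.MaxMasteringLuminance = 1000;
        hdr10.MinMasteringLuminance = 1;
        hdr10.MaxContentLightLevel = 1000;
        hdr10.MaxFrameAverageLightLevel = 400;
        swapChain->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_HDR10, sizeof(hdr10), &hdr10);
    }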

    What does ME Andromeda use now? (I can't tell; I'm not a "Frostbite insider" and I don't have an HDR10 display, but I know it supports HDR10.)

    ---

    I personally hoped the UHD era with ITU HDR10 and Dolby Vision would finally bring 10-12 bit/color support to old SDR devices (obviously with the output mapped to Rec.709 at ~120 nit but keeping the higher gradation).

    My video cards have been feeding my old TV with 10 (and recently even 12) bit/color through their HDMI output for many years now (DeepColor was a widely adopted standard in the FullHD era, at least among display manufacturers at basic compatibility levels). It's just that no PC software ever bothers (with a few small exceptions like madVR for media players or the Alien: Isolation game) to pass 10+ bit to the display instead of sticking to 8 bit.

    Sadly, it seems UHD HDR10/DV changed nothing in this regard. More and more games will support HDR10 while ignoring 10+ bit/color for SDR. I can't even trick them into outputting HDR10 to an SDR display and letting me process their output with ReShade (to convert from Rec.2020 HDR10 ~1200 nit to Rec.709 ~120 nit while keeping it 10 bit/color; or even use the actual display profile, which can exceed these SDR specs by ~120%).
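
    On the application side, asking for a 10 bit/color SDR back buffer is not much work either. A minimal sketch of the DXGI part (the helper name is illustrative; the swap chain still needs a device/queue and an HWND, and the driver/output must already be set to 10+ bpc for the extra precision to actually reach the display):

    Code:
    // Sketch: a 10 bit/color SDR swap chain description (the DXGI side is the
    // same for D3D11 and D3D12).
    #include <dxgi1_4.h>
    
    DXGI_SWAP_CHAIN_DESC1 MakeTenBitSdrSwapChainDesc(UINT width, UINT height)
    {
        DXGI_SWAP_CHAIN_DESC1 desc = {};
        desc.Width = width;
        desc.Height = height;
        desc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;      // 10 bit/color instead of the usual 8
        desc.SampleDesc.Count = 1;
        desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        desc.BufferCount = 2;
        desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD;  // flip model swap chain
        return desc;
    }
    
    // After creating the swap chain from this description, the output can stay SDR Rec.709:
    // swapChain3->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709);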
     
    Last edited: Apr 9, 2017
  3. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,938
    Likes Received:
    1,045
    GPU:
    RTX 4090
    NVAPI.

    Yeah, that was my hope as well, but alas you're right: the only way to get WCG in 1703 is if you have an HDR10 display. And even then it seems to be broken on the desktop at the moment.
     
  4. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,938
    Likes Received:
    1,045
    GPU:
    RTX 4090
    Btw, here's what the D3D12 Feature Checker shows for a GTX 1080 on 381.65 drivers in 1703:

    Code:
    Direct3D 12 feature checker (March 2017) by DmitryKo
    https://forum.beyond3d.com/posts/1840641/
    
    Windows 10 version 1703 (build 15063)
    
    ADAPTER 0
    "NVIDIA GeForce GTX 1080"
    VEN_10DE, DEV_1B80, SUBSYS_1B8010DE, REV_A1
    Dedicated video memory : 3221225472  bytes
    Total video memory : 4294901760  bytes
    Maximum feature level : D3D_FEATURE_LEVEL_12_1 (0xc100)
    DoublePrecisionFloatShaderOps : 1
    OutputMergerLogicOp : 1
    MinPrecisionSupport : D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE (0)
    TiledResourcesTier : D3D12_TILED_RESOURCES_TIER_3 (3)
    ResourceBindingTier : D3D12_RESOURCE_BINDING_TIER_2 (2)
    PSSpecifiedStencilRefSupported : 0
    TypedUAVLoadAdditionalFormats : 1
    ROVsSupported : 1
    ConservativeRasterizationTier : D3D12_CONSERVATIVE_RASTERIZATION_TIER_2 (2)
    StandardSwizzle64KBSupported : 0
    CrossNodeSharingTier : D3D12_CROSS_NODE_SHARING_TIER_NOT_SUPPORTED (0)
    CrossAdapterRowMajorTextureSupported : 0
    VPAndRTArrayIndexFromAnyShaderFeedingRasterizerSupportedWithoutGSEmulation : 1
    ResourceHeapTier : D3D12_RESOURCE_HEAP_TIER_1 (1)
    MaxGPUVirtualAddressBitsPerResource : 40
    MaxGPUVirtualAddressBitsPerProcess : 40
    Adapter Node 0: 	TileBasedRenderer: 0, UMA: 0, CacheCoherentUMA: 0, IsolatedMMU: 1
    HighestShaderModel : D3D12_SHADER_MODEL_5_1 (0x0051)
    WaveOps : 1
    WaveLaneCountMin : 32
    WaveLaneCountMax : 32
    TotalLaneCount : 40960
    ExpandedComputeResourceStates : 1
    Int64ShaderOps : 1
    RootSignature.HighestVersion : D3D_ROOT_SIGNATURE_VERSION_1_1 (0x0002)
    DepthBoundsTestSupported : 1
    ProgrammableSamplePositionsTier : D3D12_PROGRAMMABLE_SAMPLE_POSITIONS_TIER_2 (2)
    ShaderCache.SupportFlags : D3D12_SHADER_CACHE_SUPPORT_SINGLE_PSO | LIBRARY (3)
    So no SM6 support in the driver yet.
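
    For completeness, the HighestShaderModel line above comes from a feature query that arrived with the Creators Update SDK. A minimal sketch of it, assuming a valid ID3D12Device (the function name is illustrative; the app passes in the highest model it understands and the runtime/driver lowers it to what is actually supported):

    Code:
    // Sketch: querying the highest supported shader model on an existing device.
    #include <d3d12.h>
    
    D3D_SHADER_MODEL QueryHighestShaderModel(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_6_0 };
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_SHADER_MODEL, &sm, sizeof(sm))))
            return D3D_SHADER_MODEL_5_1;  // older runtimes don't know this query at all
        return sm.HighestShaderModel;     // e.g. D3D_SHADER_MODEL_5_1 on 381.65, as above
    }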
     

  5. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,959
    Likes Received:
    1,246
    GPU:
    .
  6. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,938
    Likes Received:
    1,045
    GPU:
    RTX 4090
    Does it work there? I thought the DXIL compiler is still missing.
     
  7. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,959
    Likes Received:
    1,246
    GPU:
    .
  8. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,938
    Likes Received:
    1,045
    GPU:
    RTX 4090
    Hardware GPU support for DXIL is provided by the following vendors:

    NVIDIA r378 drivers (r378.49 and later) provide experimental mode support for DXIL and shader model 6. This is an early beta version to enable developers to try out DXIL and the new shader model 6 features – Wave Math and int64. Only DXIL version 0.7 (beta) is accepted by the r378 driver. Experimental mode support for DXIL v1.0 will be provided in a future driver release.


    +

    The NV r381 driver branch supports DXIL 1.0


    So it should work in experimental mode on NV h/w. I wonder how far we are from production release though.
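
    For anyone who wants to try it: the experimental path has to be opted into before device creation, and it also needs Windows Developer Mode. A minimal sketch, assuming the Creators Update SDK headers (which declare D3D12EnableExperimentalFeatures and the D3D12ExperimentalShaderModels UUID; the function name is illustrative):

    Code:
    // Sketch: opting into experimental shader models (SM6/DXIL) before creating
    // the D3D12 device. Requires Windows Developer Mode; returns false if the
    // OS or driver refuses the experimental feature.
    #include <d3d12.h>
    #include <wrl\client.h>
    
    #pragma comment (lib, "d3d12.lib")
    
    bool CreateDeviceWithExperimentalSm6(Microsoft::WRL::ComPtr<ID3D12Device>& device)
    {
        // Must be called before D3D12CreateDevice.
        if (FAILED(D3D12EnableExperimentalFeatures(1, &D3D12ExperimentalShaderModels,
                                                   nullptr, nullptr)))
            return false;
    
        return SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                           IID_PPV_ARGS(&device)));
    }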
     
  9. dezmand07

    dezmand07 Guest

    Messages:
    47
    Likes Received:
    1
    GPU:
    Intel IrisProGraphics 580
    To check for HLSL Shader Model 6.0 support, you need to read the WaveOps member (BOOL) of the D3D12_FEATURE_DATA_D3D12_OPTIONS1 structure:
    https://msdn.microsoft.com/en-us/library/windows/desktop/mt733232.aspx
    https://msdn.microsoft.com/en-us/library/windows/desktop/mt709115.aspx

    Code:
    #include <iostream>
    #include <wrl\client.h>
    #include <d3d12.h>
    
    #pragma comment (lib, "d3d12.lib")
    
    int main() {
    
        // Create a D3D12 device on the default adapter (11_0 is the D3D12 minimum;
        // SM 6.0 does not require feature level 12_1).
        Microsoft::WRL::ComPtr<ID3D12Device> device = nullptr;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
            std::cout << "Failed to create a D3D12 device" << std::endl;
            std::cin.get();
            return 1;
        }
    
        // WaveOps in OPTIONS1 reports driver support for the SM 6.0 wave intrinsics.
        D3D12_FEATURE_DATA_D3D12_OPTIONS1 data = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS1, &data, sizeof(data));
    
        if (data.WaveOps) std::cout << "HLSL Shader Model 6.0 supported" << std::endl;
        else std::cout << "HLSL Shader Model 6.0 not supported" << std::endl;
    
        std::cin.get();
        return 0;
    }

    By the way, the Intel Iris Pro Graphics 580 doesn't support WDDM 2.2 (driver version 22.20.16.4729).
     
    Last edited: Jul 29, 2017
  10. chinobino

    chinobino Maha Guru

    Messages:
    1,140
    Likes Received:
    75
    GPU:
    MSI 3060Ti Gaming X
    I don't think Intel supports WDDM 2.2 on any Skylake iGPU (yet).
     
