Since when did nVidia allow 10 bit colour on non-pro cards?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Neo Cyrus, Aug 15, 2015.

  1. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
Games typically use 8-bit. So you're seeing posterization.
     
  2. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
The desktop is still 8-bit only, even with Pascal GPUs. 10-bit is only available in Full Screen Exclusive mode.

Which means, by the way, that if you run MPC-HC with madVR you have to use FSE instead of windowed mode to get 10-bit, even if the NVIDIA Control Panel has 10-bit selected.
     
  3. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,931
    Likes Received:
    1,044
    GPU:
    RTX 4090
What does this change, then?

[image]
     
  4. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    He just said it's for full screen exclusive mode.
     

  5. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
Yes, 10-bit is only available with FSE DirectX 11 on GeForce cards. Quadro GPUs allow 10-bit on the desktop and also with OpenGL; GeForce cards are still limited to 8-bit OpenGL even with FSE, as far as I know.
     
    Last edited: Mar 28, 2017
  6. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,931
    Likes Received:
    1,044
    GPU:
    RTX 4090
Fullscreen exclusive overrides whatever mode you set for the desktop, which is why it's "exclusive", so no, it's not. The application sets up FSE by itself.

If the desktop is still 8-bit, then why is this option even there? People here are also reporting issues when selecting 10+ bits at 4K@60+, so it does seem to increase bandwidth usage, which directly suggests that it actually does enable 10/12 bits on the desktop.

The only issue here may be that Windows itself doesn't actually support anything but 8-bit for desktop composition; I think I've read something about this related to the 1703 update.
     
    Last edited: Mar 28, 2017
  7. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
Huh? The bit depth is being set for FSE, not the desktop. That FSE overrides desktop settings is irrelevant, as that option is intended for FSE, not for the desktop.

Because, as they just said, it only affects FSE...

It could easily enable 10-bit signalling but only use the last 8 bits for Windows, since the desktop renders in 8-bit, then switch to the full 10 bits whenever a supporting application runs in FSE. This is, for instance, how you can run a 6-bit color game while still sticking to 8-bit signalling, and it's how future higher bit depth monitors will run older 8-bit content. Keeping the higher bit depth signal enables more accurate color correction, since the palette for mapping 8-bit colors to their accurate values is now 10-bit. AMD already uses 10-bit LUTs and dithers from 8-bit when applying higher bit depth color profiles (e.g. 16-bit). See the sketch at the end of this post.

Same point. The Windows desktop runs in a "compatibility mode" where only the last 8 bits are used.
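Roughly what I mean, as a sketch (the bit replication, LUT and dithering here are made up for illustration; this is not actual driver code):

[code]
// Illustrative only: how 8-bit content can ride inside a wider pipeline, and
// how a hypothetical 10-bit 1D LUT plus dithering applies a color profile to
// an 8-bit source.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <random>

// Expand an 8-bit code to 10 bits by bit replication: 0..255 -> 0..1023.
// The content still only has 256 distinct levels per channel.
std::uint16_t expand8to10(std::uint8_t v) {
    return static_cast<std::uint16_t>((v << 2) | (v >> 6));
}

// Correct an 8-bit value through a 10-bit LUT, then dither back down to
// 8 bits for a link that only carries 8 bits per channel.
std::uint8_t calibrate8(std::uint8_t v, const std::uint16_t lut10[1024],
                        std::mt19937& rng) {
    std::uint16_t corrected = lut10[expand8to10(v)];    // 0..1023 after correction
    double ideal8 = corrected / 4.0;                    // ideal value on an 8-bit scale
    std::uniform_real_distribution<double> noise(0.0, 1.0);
    double dithered = std::floor(ideal8 + noise(rng));  // simple random dither
    return static_cast<std::uint8_t>(dithered > 255.0 ? 255.0 : dithered);
}

int main() {
    // Stand-in for a color profile: near-identity LUT with a mild gamma tweak.
    std::uint16_t lut10[1024];
    for (int i = 0; i < 1024; ++i)
        lut10[i] = static_cast<std::uint16_t>(1023.0 * std::pow(i / 1023.0, 1.05));

    std::mt19937 rng(42);
    const int samples[] = {0, 64, 128, 192, 255};
    for (int s : samples) {
        std::uint8_t v = static_cast<std::uint8_t>(s);
        std::printf("8-bit %3d -> 10-bit %4d -> calibrated + dithered %3d\n",
                    s, expand8to10(v), calibrate8(v, lut10, rng));
    }
}
[/code]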
     
  8. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,931
    Likes Received:
    1,044
    GPU:
    RTX 4090
Again: FSE overrides whatever settings the user makes for the desktop, or anything else. This is why it's FSE, otherwise there would be no reason to even have it. I very much doubt that changing the bit depth in the CPL has any effect on what's used in FSE by the programs which are using it.

It? Use? Windows is using the h/w, not the other way around. If the Windows desktop compositor is using 8-bit only, then there will be no difference no matter which bit depth you select for the desktop.

No idea what "6-bit games" have to do with it.
     
  9. David Lake

    David Lake Master Guru

    Messages:
    765
    Likes Received:
    46
    GPU:
    Titan V watercooled
    The Windows desktop renders at 8 BPC.

If you set your display to 10-bit or more, the signal being sent to it will contain that many bits per colour regardless of the format being sent to the card; the card then scales the bit depth of the software's output to that of the display.

Bit depths of 10 or higher are necessary on bright displays with a high contrast ratio and for HDR video, because the steps between luminance levels become larger.

On professional displays that are likely to be calibrated, the increased bit depth eliminates the gaps between some luminance levels that result from a non-linear luminance curve.
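As a rough worked example, assuming a plain 2.2 gamma and a 1000 cd/m2 peak (real HDR uses the PQ curve, but the trend is the same):

[code]
// Step size between adjacent codes near peak brightness, for an assumed
// 1000 cd/m2 display with a plain 2.2 gamma curve.
#include <cmath>
#include <cstdio>

double luminance(int code, int maxCode, double peakNits) {
    return peakNits * std::pow(static_cast<double>(code) / maxCode, 2.2);
}

int main() {
    const double peak = 1000.0;  // assumed peak luminance in cd/m2
    // Topmost step at 8 bits per channel: code 254 -> 255
    double step8 = luminance(255, 255, peak) - luminance(254, 255, peak);
    // Topmost step at 10 bits per channel: code 1022 -> 1023
    double step10 = luminance(1023, 1023, peak) - luminance(1022, 1023, peak);
    std::printf("top step, 8-bit : %.2f cd/m2\n", step8);   // ~8.6 cd/m2
    std::printf("top step, 10-bit: %.2f cd/m2\n", step10);  // ~2.1 cd/m2
}
[/code]

On a 100 cd/m2 SDR desktop the same steps are ten times smaller in this model, which is why 8-bit is usually enough there.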
     
    Last edited: Mar 28, 2017
  10. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Do you have evidence for your doubt?

I don't understand what you didn't get from what I said. The desktop renders at 8-bit, regardless of the monitor signal. 6-bit games work the same way on 8-bit monitors: you use fewer bits than are available per color channel. That's it.

The driver controls the monitor signal, allowing software that supports 10-bit output to use it. The desktop renders at 8-bit, so it stays 8-bit. That doesn't mean other applications cannot use 10-bit.
     

  11. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
The 8-bit desktop limit is only on GeForce GPUs and has nothing to do with Windows; it's all in the driver. As far as we know there's no way to tell if the driver is sending actual 10-bit rather than just dithering, because the driver doesn't expose this to developers. Even AMD, which has a 10-bit desktop, shows weird behaviour: some drivers allow it while others block it, requiring driver hacks and having to disable Windows composition.


"The only reliable way I know is if your receiver or TV report the incoming bitdepth. If madVR sends 10bit to the GPU driver, the GPU driver could still output it as dithered or undithered 8bit, 10bit, 12bit or 16bit. I've no control over that, I can't even ask the GPU which bitdepth it actually outputs." - quoting madshi, the madVR developer.

    From Nvidia
    http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus
     
    Last edited: Mar 28, 2017
  12. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,931
    Likes Received:
    1,044
    GPU:
    RTX 4090
    https://msdn.microsoft.com/en-us/library/windows/desktop/bb173064(v=vs.85).aspx

FSE is a display mode which the application sets up completely by itself: resolution, refresh rate and color depth, among other things, are specified during the creation of the FSE surface (a rough sketch of such a request is at the end of this post).

A. The desktop remains the same 8-bit desktop because the Windows desktop (as in the DWM.exe application which renders it) doesn't support anything else yet. This is the same across all h/w on the market and should only change with the upcoming Win10 1703 update, which should make the desktop compatible with HDR at least.

B. Currently NV cards support 10-bit output for both FSE and windowed DX applications. The simple fact, however, is that there are zero (?) DX applications able to use a 10-bit color mode while windowed, but this has nothing to do with what the h/w allows, only with what the applications actually ask from it.

C. The 8-bit limitation in place on GeForce cards compared to Quadro cards applies only to OpenGL surfaces (there is no difference between windowed and fullscreen for them), as all professional s/w that uses 10-bit output uses OpenGL for it (OpenGL gained the ability to present deep color surfaces well before Windows did).

You're linking to an article from 2011; it's outdated by now.
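For reference, a minimal sketch of how a D3D11 application would ask for a 10-bit FSE mode (assumes an existing window handle; error handling and cleanup omitted):

[code]
// Sketch only: request a 10-bit-per-channel fullscreen exclusive swap chain.
#include <d3d11.h>
#include <dxgi.h>

bool CreateTenBitFullscreenSwapChain(HWND hwnd, IDXGISwapChain** swapChain,
                                     ID3D11Device** device,
                                     ID3D11DeviceContext** context) {
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width = 3840;
    desc.BufferDesc.Height = 2160;
    desc.BufferDesc.RefreshRate = { 60, 1 };
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per channel
    desc.SampleDesc = { 1, 0 };
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;
    desc.OutputWindow = hwnd;
    desc.Windowed = FALSE;                                   // fullscreen exclusive
    desc.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;
    desc.Flags = DXGI_SWAP_CHAIN_FLAG_ALLOW_MODE_SWITCH;

    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapChain, device, nullptr, context);
    return SUCCEEDED(hr);
}
[/code]

Whether the driver then actually drives the display at 10 bits is exactly the part madshi says he can't query.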
     
  13. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
  14. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,931
    Likes Received:
    1,044
    GPU:
    RTX 4090
  15. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
8-bit vs 10-bit without dithering will be very obvious, which is what the test suggests doing. The OpenGL application won't even launch without a 10-bit capable monitor and GPU; a GeForce Pascal card like the 1080 will probably fail to launch it.

Even on an 8-bit panel, there is a clear difference between 8-bit with dithering and 8-bit without dithering.

People are welcome to post their results.
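To give an idea of why the difference is so obvious, here's a rough sketch (the ramp range and width are arbitrary) that counts how many distinct levels survive in a shallow grayscale ramp at each bit depth; fewer levels means wider, more visible bands:

[code]
// Quantize a shallow grayscale ramp (10% of full range, one sample per
// pixel column) and count the distinct output levels at 8 and 10 bits.
#include <cmath>
#include <cstdio>
#include <set>

int countLevels(int bits, int samples) {
    const int maxCode = (1 << bits) - 1;
    std::set<long> levels;
    for (int i = 0; i < samples; ++i) {
        double v = 0.45 + 0.10 * i / (samples - 1);  // shallow ramp, 0.45..0.55
        levels.insert(std::lround(v * maxCode));     // quantize without dithering
    }
    return static_cast<int>(levels.size());
}

int main() {
    const int width = 1920;  // one sample per pixel column
    std::printf(" 8-bit ramp: %d distinct levels\n", countLevels(8, width));   // ~26
    std::printf("10-bit ramp: %d distinct levels\n", countLevels(10, width));  // ~104
}
[/code]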
     
    Last edited: Mar 30, 2017

  16. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Indeed. My TV reports 12-bit input (confirming my NVCP settings) but this software quits.

Real life is never black and white. I have no idea about the actual "native" gradation of my display, but all PDPs use tons of dithering (even this old Kuro, which does a pretty neat job compared to others). However, I can clearly tell the difference (8 vs 10 bit) using a synthetic test pattern (and madVR). I think it's equally beneficial to pass 10+ bit to displays like this (which always use heavy dithering), because it lowers the chance of different kinds of dithering algorithms/patterns producing "funny" artifacts when layered on top of each other (and in this case the display probably knows far better how to dither properly, since it's specifically optimized for the nature of PDP technology).
     
    Last edited: Apr 9, 2017
  17. chinobino

    chinobino Maha Guru

    Messages:
    1,140
    Likes Received:
    75
    GPU:
    MSI 3060Ti Gaming X
