Confused about dithering

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Terepin, Feb 2, 2023.

  1. Terepin

    Terepin Master Guru

    Messages:
    873
    Likes Received:
    128
    GPU:
    ASUS RTX 4070 Ti
    If my monitor is set to 8 bpc, which dithering should I use? Dithering in the same bit depth (8-bit), or dithering into the higher bit depth (10-bit)?
     
  2. janos666

    janos666 Maha Guru

    Messages:
    1,224
    Likes Received:
    226
    GPU:
    MSI RTX3080 10Gb
    Dithering from 8-bit to >=8-bit only adds noise to the image (static spatial and/or dynamic temporal noise). Dithering from >8-bit to <=8-bit is usually preferable (the noise is much less visible than banding, if any appears at all). Ideally you would dither from the native bit depth of the source (usually 8-bit for SDR and 10-bit for HDR, though there are some exceptions) to the native bit depth of your display panel (which could very well be anything between 6 and 10). But there is usually no point if your display's processor can accept high-bit-depth input and do some dithering of its own (ideally optimized for the display panel's characteristics). Meaning you can just set the highest available bit depth in NVCP and forget about manually messing with the driver's dithering options.
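
    To illustrate why dithering down to a lower bit depth beats plain truncation: a minimal Python/NumPy sketch (my own toy example, not what the driver actually does) that quantizes a smooth 10-bit ramp to 8 bits both ways. Truncation collapses four 10-bit codes into one 8-bit code and produces banding with a systematic bias; adding ±0.5 LSB of noise before rounding trades that for fine noise whose local average tracks the original signal.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A smooth 10-bit ramp (codes 0..1023), standing in for a gradient.
    ramp10 = np.linspace(0, 1023, 4096)

    # Naive truncation to 8 bits: groups of four 10-bit codes collapse
    # into one 8-bit code -> visible "banding" steps plus a downward bias.
    truncated = np.floor(ramp10 / 4).astype(np.uint8)

    # Dithered quantization: add +-0.5 LSB of noise before rounding, so
    # in-between levels become a mix of the two neighboring 8-bit codes.
    noise = rng.uniform(-0.5, 0.5, ramp10.shape)
    dithered = np.clip(np.round(ramp10 / 4 + noise), 0, 255).astype(np.uint8)

    # Compare how well local (64-sample) averages track the original ramp,
    # back in 10-bit units. Dithering should win by a wide margin.
    win = 64
    r_avg = ramp10.reshape(-1, win).mean(axis=1)
    t_err = np.abs(truncated.reshape(-1, win).mean(axis=1) * 4 - r_avg).mean()
    d_err = np.abs(dithered.reshape(-1, win).mean(axis=1) * 4 - r_avg).mean()
    ```

    Using fresh noise every frame gives the dynamic temporal variant; reusing a fixed pattern gives the static spatial one.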

    By the way, the driver seems to have dithering profiles hard-coded for certain display devices. For example, NVCP is set to 12-bit, but the driver automatically applies 6-bit dithering for my LG C9 (which has poor gradation precision in PC mode). Then again, this display is G-Sync Compatible (so tested and certified by nVidia).
     
    JAMVA likes this.
  3. Terepin

    Terepin Master Guru

    Messages:
    873
    Likes Received:
    128
    GPU:
    ASUS RTX 4070 Ti
    My monitor can accept a 12-bit signal, but I cannot use it due to a driver bug. Anything above 8-bit enables DSC, which causes the monitor to go black for two seconds every time I alt-tab out of and into games running in fullscreen. It's very annoying, and nVidia doesn't appear to be willing to fix it. So my only option is to use dithering on the GPU. In this regard, I can see an improvement in the Lagom test even when I set dithering to 8-bit while the display is also set to 8-bit.
     