Nvidia and dithering controls..how to enable?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by jarablue, Feb 6, 2021.

  1. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,483
    Likes Received:
    1,869
    GPU:
    7800 XT Hellhound
It works; Windows just randomly breaks gamma ramp precision. Try changing the resolution once via Nvidia Control Panel after changing the registry keys, then restart.
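(For illustration, a minimal Python sketch of what such a registry edit looks like. The key path is the standard display-adapter class key, but the value names and numbers below are placeholders, not confirmed Nvidia names -- take the real ones from the earlier posts in this thread.)

Code:
# Minimal sketch, placeholders only: the value names and numbers below are NOT
# confirmed Nvidia registry names -- substitute the ones from the earlier posts
# in this thread. Must be run from an elevated (administrator) prompt.
import winreg

# Standard per-adapter key of the display class; "0000" is the first adapter.
GPU_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000"

PLACEHOLDER_VALUES = {
    "DitherState": 1,  # placeholder name: force dithering on
    "DitherBits": 8,   # placeholder name: output bit depth
    "DitherMode": 4,   # placeholder name: e.g. temporal
}

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, GPU_KEY, 0, winreg.KEY_SET_VALUE) as key:
    for name, value in PLACEHOLDER_VALUES.items():
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

print("Done. Now change the resolution once in Nvidia Control Panel and restart.")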
     
  2. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
Tried that and still nothing. I can still see color banding here and here.
     
  3. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,483
    Likes Received:
    1,869
    GPU:
    7800 XT Hellhound
Because these are 8 bit pictures, the banding is already in the sources. Dithering can't fix that. In desktop usage, dithering only helps to prevent the additional banding that comes from applying a GPU gamma ramp adjustment (driver color options, or a 1D LUT loaded via the DisplayCAL loader or Calibration Tools), by keeping enough precision.
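(A quick numpy sketch of that point -- a toy model, not the driver's actual pipeline: the 8-bit source keeps its own banding either way, but requantising a gamma-adjusted gradient to 8 bit without dithering adds new banding, while sub-LSB noise before rounding keeps the intended average levels.)

Code:
# Toy model, not the driver's actual pipeline.
import numpy as np

rng = np.random.default_rng(0)

# 8-bit grayscale ramp: 256 columns, 1000 rows per column.
ramp = np.tile(np.arange(256, dtype=np.float64), (1000, 1))

# A GPU-style gamma ramp adjustment (gamma 1.2 purely as an example).
adjusted = 255.0 * (ramp / 255.0) ** (1.0 / 1.2)

# a) Requantise to 8 bit without dithering: neighbouring input codes collapse
#    onto the same output code (or leave gaps), i.e. extra banding.
no_dither = np.round(adjusted)

# b) Requantise with dithering: spread < 1 LSB of noise before rounding, so the
#    average of each column still matches the intended sub-LSB value.
dithered = np.round(adjusted + rng.uniform(-0.5, 0.5, adjusted.shape))

ideal = adjusted[0]  # intended value per column
print("max column error without dithering:", np.abs(no_dither.mean(axis=0) - ideal).max())
print("max column error with dithering:   ", np.abs(dithered.mean(axis=0) - ideal).max())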
     
  4. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
Whatever the case, I saw no difference anywhere: not in any of the test patterns, not in games, not in movies, not anywhere on the desktop, nor in apps. And let me tell you, I have banding EVERYWHERE, ironically even in NVCPL itself.
     

  5. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,483
    Likes Received:
    1,869
    GPU:
    7800 XT Hellhound
Banding in the source again (everyone has it, e.g. on the PhysX page), and that's probably the case for your other examples too (yes, it's ~everywhere).
Some displays also make it a lot worse.
     
    Terepin likes this.
  6. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
Yeah, I figured as much. My theory is that it was always there, but once I started to notice it, I can't unsee it.
     
  7. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
Say, what is the probability of the GPU causing the banding?
     
  8. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
I think it's your display. Not all monitors (and especially HDTVs or monitor-TV combo devices) are capable of presenting smooth gradients (gray or any other color gradient, for that matter), and it often depends on settings.
Most displays show decent gradients with the correct settings, which are usually the so-called "neutral" or "no-op" settings where their processors do minimal work. But as soon as you touch anything except the backlight intensity (or on rare occasions even that is enough), everything is thrown off quickly and heavily: banding and color tinting appear, often from even the slightest adjustment of, say, the Contrast control, but also from the R, G, B Gain controls (which are really per-channel contrast controls), etc.
So my advice is: reset the display and only adjust the backlight intensity. Then try different presets. Standard may or may not work best, but avoid things like Photo or Game mode (yes, Game mode... it's often less accurate than most, and it isn't always needed for lower latency either, since it's often just yet another color preset rather than an actual game mode with a special low-latency pipeline).
     
  9. Dagda

    Dagda Master Guru

    Messages:
    324
    Likes Received:
    85
    GPU:
    RTX 2080 super
Do the regedits still work?
     
  10. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,483
    Likes Received:
    1,869
    GPU:
    7800 XT Hellhound
    1st post of this page (yes).
     

  11. CalinTM

    CalinTM Ancient Guru

    Messages:
    1,689
    Likes Received:
    18
    GPU:
    MSi GTX980 GAMING 1531mhz
I have an Aorus FI25F display. What panel do I have, 8 or 10 bit? Because I can switch between 8 and 10 bit in NVCP.
     
  12. SaiBork

    SaiBork Master Guru

    Messages:
    290
    Likes Received:
    45
    GPU:
    3080 Ti / 4090
  13. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,483
    Likes Received:
    1,869
    GPU:
    7800 XT Hellhound
When the display offers 10 bit, it most likely won't simply clip to 8 bit. The funny thing is that Nvidia enables dithering for 10 bit by default. Though most 10 bit displays are actually 8 bit + FRC, so you get dithering twice (so much for the people who claim they can see dither patterns; this should basically dither their eyes out).
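(Tiny sketch of what "8 bit + FRC" means -- a toy model, not any specific panel's FRC algorithm: the panel fakes a 10-bit level by alternating between the two nearest 8-bit codes over successive frames, which is itself a form of temporal dithering, hence "dithering twice".)

Code:
# Toy model of FRC, not any specific panel's algorithm.
import numpy as np

def frc_frames(level_10bit: int, n_frames: int = 4) -> np.ndarray:
    """Return n_frames of 8-bit codes whose average approximates level_10bit / 4."""
    target = level_10bit / 4.0            # ideal 8-bit value, may be fractional
    low = int(np.floor(target))
    n_high = int(round((target - low) * n_frames))  # frames showing the higher code
    return np.array([low + 1] * n_high + [low] * (n_frames - n_high))

frames = frc_frames(513)                  # 10-bit level 513 -> 128.25 in 8 bit
print(frames, "average:", frames.mean())  # [129 128 128 128] average: 128.25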
     
    dr_rus and Cave Waverider like this.
  14. ChaosDMNS

    ChaosDMNS New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    GTX 1070
    https://github.com/ledoge/novideo_srgb

This works; the newest version added a dithering option. You just need a calibrated ICC profile to use with the program. Make sure the ICC profile is not in use by Windows / installed in Windows. Copy the ICC file to the novideo_srgb folder to be sure.
     
  15. dani_tx

    dani_tx Member

    Messages:
    23
    Likes Received:
    6
    GPU:
    EVGA GTX 1080 FTW
@ChaosDMNS Will that program work on a laptop, or is it made for standalone displays only?
     

  16. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,483
    Likes Received:
    1,869
    GPU:
    7800 XT Hellhound
It only works when the NV GPU renders the desktop, and I assume the display also needs to be connected to it rather than to the iGPU.
     
  17. EnthusiastX

    EnthusiastX Active Member

    Messages:
    53
    Likes Received:
    3
    GPU:
    EVGA RTX 3090
    Wow - thanks! That tool is great. I can see obvious dithering when I select 6bit. Dithering is there with 8bit and 10bit too.

There is a well-known rendering tool, madVR, that has incredible dithering. It can perform 16bit dithering and then convert it back to whichever depth your panel has. Does Nvidia dithering work the same way? Should I select 8bit dithering for an 8bit (no FRC) VA panel, or just the highest dithering level the novideo_srgb tool allows (10bit)?
     
  18. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,483
    Likes Received:
    1,869
    GPU:
    7800 XT Hellhound
    I think you mean it does processing in 16bit (probably floating point format) and then converts to target bit depth with dithering.

When you don't use dithering, you can't keep the precision of the higher-bit-depth color processing when converting down to the lower bit depth, and thus get banding. Generally, dither to the same bit depth as your display output. Temporal dithering is more effective and should make dither patterns less noticeable (at least those are my findings at low dither/output depths), so try that first. madVR does temporal dithering by default too ("change dither patterns every frame").

Edit: I think precision is fine with novideo_srgb (or the driver API that it uses); I don't notice any additional banding introduced in a test gradient with dithering. I also think Nvidia driver dithering is fine; I don't notice any dither grain either with temporal dithering at 8 or 10 bit (1440p 27" 144Hz).

This, btw, proves that dithering in the Nvidia Windows driver is fine, as no banding is introduced after suspend etc. with the novideo_srgb clamp + gamma correction active. It's just stupid Windows that ruins the precision of GPU gamma ramps when using Calibration Tools or the DisplayCAL VCGT loader (it's always bad when using the official Windows feature anyway).
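(To illustrate the "process in float, then dither down to the output depth, changing the pattern every frame" idea described in this post, here's a minimal numpy sketch -- a toy model, not madVR's or Nvidia's actual implementation.)

Code:
# Toy model, not madVR's or Nvidia's actual implementation: process in floating
# point, then dither down to the display's bit depth with a pattern that changes
# every frame, so the noise averages out over time and sub-LSB values survive.
import numpy as np

def quantize_with_temporal_dither(frame_float, out_bits=8, frame_index=0):
    """frame_float holds values in 0..1; returns integer codes at out_bits depth."""
    max_code = (1 << out_bits) - 1
    rng = np.random.default_rng(frame_index)        # new dither pattern per frame
    noise = rng.uniform(-0.5, 0.5, frame_float.shape)
    return np.clip(np.round(frame_float * max_code + noise), 0, max_code).astype(np.uint16)

# A flat patch whose processed value falls between two 8-bit codes:
processed = np.full((64, 64), 100.7 / 255.0)

# Averaging the dithered output over some frames converges on ~100.7.
stack = np.stack([quantize_with_temporal_dither(processed, 8, i) for i in range(60)])
print("temporal average of one pixel:", stack[:, 0, 0].mean())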
     
    Last edited: Jan 8, 2022
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
Nvidia internally handles it at 12 or 13 bits, IIRC.
     
  20. JaylumX

    JaylumX Master Guru

    Messages:
    614
    Likes Received:
    41
    GPU:
    MSI 3080 TI 12G
    fluidz likes this.
