Dithering option?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Whiplashwang, Mar 3, 2018.

  1. Whiplashwang

    Whiplashwang Ancient Guru

    Messages:
    2,453
    Likes Received:
    387
    GPU:
    RTX 4090 PNY
    I'm sure this has been brought up before, maybe to the point of exhaustion, but why doesn't Nvidia add a dithering option to the Nvidia Control Panel to help hide color banding? They seem to have the option on Linux and on Quadro GPUs, but not for Windows and non-professional GPUs. Also, I've found a ton of threads online, from Reddit to the official GeForce forums, requesting such a feature, and Nvidia just continues to ignore them. Some requests were made more than 3 years ago.

    So, again, why won't they add this feature? o_O
     
    BlindBison, Smough, joe187 and 3 others like this.
  2. Whiplashwang

    Whiplashwang Ancient Guru

    Messages:
    2,453
    Likes Received:
    387
    GPU:
    RTX 4090 PNY
    BlindBison likes this.
  3. Xul Aethyr

    Xul Aethyr Active Member

    Messages:
    58
    Likes Received:
    7
    GPU:
    1080TI MSI Gaming X 2ghz
    They could add deband to Freestyle (it's in ReShade), but it would be nice to have a dither option in the control panel so it works with every game and with the desktop itself.
     
    BlindBison likes this.
  4. Xtreme512

    Xtreme512 Master Guru

    Messages:
    793
    Likes Received:
    43
    GPU:
    RTX 4080 Super
    Games have their own dithering to clear up banding; there's no need for a driver setting for that. Correct me if I'm wrong.
     

  5. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
    Try running this little 10-bit-depth test; if it's working correctly, the Nvidia driver will automatically dither the image.
    You HAVE to do this on an 8-bit panel, otherwise you just get real 10-bit output.
    https://forum.doom9.org/showthread.php?t=172128
    Why doesn't Nvidia want to enable this for GeForce GPUs? Who knows; the functionality is there, it just needs to be exposed to users.
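    If the linked file ever goes missing, a similar ramp is easy to generate yourself. A rough sketch (using numpy and Pillow, with arbitrary dimensions), not the doom9 pattern itself:

    ```python
    # Build a 1024-step grayscale ramp and store it as a 16-bit PNG, a common
    # container for 10-bit test patterns (the 10-bit values sit in the upper bits).
    import numpy as np
    from PIL import Image

    WIDTH, HEIGHT = 1024, 256
    ramp_10bit = np.linspace(0, 1023, WIDTH, dtype=np.uint16)   # one 10-bit level per column
    img = np.tile(ramp_10bit << 6, (HEIGHT, 1))                 # scale into the 16-bit range

    Image.fromarray(img).save("ramp_10bit.png")                 # Pillow writes uint16 as 16-bit PNG
    # On an 8-bit panel: visible steps = the chain is truncating; a smooth ramp = something dithers.
    ```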
     
  6. Xul Aethyr

    Xul Aethyr Active Member

    Messages:
    58
    Likes Received:
    7
    GPU:
    1080TI MSI Gaming X 2ghz
    The Nvidia driver does dither 10-bit sources, but only in DX11 fullscreen. If they had temporal dithering in the Windows driver like they do on Linux, it would work with every source (AMD also has it, AFAIK).
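    To illustrate what "temporal dithering" means here, a toy sketch (arbitrary numbers, not how the driver actually implements it): the rounding offset rotates every frame, so the 8-bit output flickers between neighbouring codes and its time average recovers the 10-bit value.

    ```python
    # Toy temporal dithering: cycle the rounding offset per frame so the
    # time-averaged 8-bit output approximates the original 10-bit value.
    import math

    def temporal_dither_10_to_8(value_10bit: int, frame: int) -> int:
        offset = (frame % 4) / 4.0                       # 0, .25, .5, .75, repeat
        return min(255, math.floor(value_10bit / 4.0 + offset))

    frames = [temporal_dither_10_to_8(513, f) for f in range(8)]
    print(frames)                            # mostly 128, with a 129 every 4th frame
    print(sum(frames) / len(frames) * 4)     # time average ~= 513, the original value
    ```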

     
  7. insp1re2600

    insp1re2600 Ancient Guru

    Messages:
    2,308
    Likes Received:
    1,086
    GPU:
    4080 FE
    Looks like Michael J. Fox filmed that video.
     
  8. Sajittarius

    Sajittarius Master Guru

    Messages:
    490
    Likes Received:
    76
    GPU:
    Gigabyte RTX 4090
    That is cool! I've been using MPC-HC for years and never realized you could open a PNG directly in it, lol
     
  9. tayyar

    tayyar Member

    Messages:
    39
    Likes Received:
    17
    GPU:
    2
    I would love this.

    I'm gaming (and watching movies) on a TV at night, and dark scenes kill me.

    ReShade's dithering partially alleviates it, but it introduces fuzziness.
     
  10. RealNC

    RealNC Ancient Guru

    Messages:
    4,894
    Likes Received:
    3,168
    GPU:
    RTX 4070 Ti Super
    Some games do dither. Most don't. It appears that the vast majority of game devs today are actually completely unaware that they need to do this.
     
    BlindBison, Smough and joe187 like this.

  11. Xtreme512

    Xtreme512 Master Guru

    Messages:
    793
    Likes Received:
    43
    GPU:
    RTX 4080 Super
    Probably. The thing is, this is very clear when a game supports HDR: you keep playing on a normal monitor with HDR turned off, but there is dramatic banding (RE:7, for example).
     
  12. janos666

    janos666 Ancient Guru

    Messages:
    1,645
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    Dithering basically emulates higher precision through noise, so it only makes sense when your input has higher bit depth than your output. But 99.9% of games (let's exclude those few HDR10/DV titles for now) pass 8-bit, so there is nothing to dither (it makes no sense to dither from 8-bit to 8-bit) and you can often output 10 or even 12-bit to the display anyway (which should have decent dithering to native bitdepth which might be 10+ bit as well).

    It sounds more like you want some "debanding" post-processing and then dithering (which is IMO a very bad idea in general for games).

    Back when they started developing engines for DX11, I heard rumors that presenters could just pass the high-bit buffers to the display engine and have them dithered by the hardware to the precision of the link (8/10/12-bit depending on display capabilities and output settings), thus eliminating the need for truncating/rounding/dithering in software. That sounded pretty cool, but I don't think anybody does it these days (not even HDR10/DV games).

    Now with HDR10 and Dolby Vision (though I don't fully understand how DV [with all those wonky tricks like layering and "tunneling" plus all the redundant-looking proprietary mumbo-jumbo to obfuscate it] works in practice on PC, especially with games, but whatever...) they obviously have to pass 10+ bit, but I think they still limit HDR10 output to 10-bit (and truncate/round/dither in software), taking the HDR10 moniker literally (HDR10 = 10-bit and that is that, which is a very boxed-in, minimalistic way of looking at it, because even FHD SDR could have used and would have benefited from 10+ bit a decade ago with DeepColor displays, and this closed-mindedness keeps going on into the UHD HDR era...). I hoped these HDR titles would finally "open the tap": pass 10+ bit (probably 16-bit in practice) to the hardware and let it dither (rather than truncate/round) according to the output settings, thus making 12-bit output possible even for SDR, let alone HDR. But it seemingly took a rather different direction (and it's as if they deliberately wanted to cut off the possibility of 10+ bit SDR output, as a way of forcing you to buy new displays if you want that).
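    For anyone who wants to see the "higher precision through noise" point in numbers, here is a purely illustrative numpy sketch (made-up sizes, nothing any driver actually runs): a smooth 10-bit ramp truncated to 8 bits keeps a visible staircase, while adding noise before quantization lets small local averages reconstruct the ramp.

    ```python
    # Illustrative only: truncation vs. random dither when reducing 10-bit to 8-bit.
    import numpy as np

    rng = np.random.default_rng(0)
    src_10bit = np.linspace(0.0, 1023.0, 4096)        # a smooth 10-bit gradient

    truncated = np.floor(src_10bit / 4.0)                                      # hard 4-code steps
    dithered = np.clip(np.floor(src_10bit / 4.0 + rng.random(4096)), 0, 255)   # randomized rounding

    def worst_block_error(quantized_8bit):
        """Worst mean error over 64-sample blocks, in 10-bit codes (roughly what the eye averages)."""
        diff = quantized_8bit * 4.0 - src_10bit
        return np.abs(diff.reshape(-1, 64).mean(axis=1)).max()

    print(worst_block_error(truncated))   # about 2 codes: the staircase never averages out
    print(worst_block_error(dithered))    # well under 1 code: the noise averages back to the ramp
    ```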
     
    Last edited: Apr 5, 2018
    Noisiv likes this.
  13. ManuelG

    ManuelG NVIDIA Rep

    Messages:
    1,120
    Likes Received:
    617
    GPU:
    Geforce RTX 2080 FE
    Thank you for your feedback. Monitors from the past 5-7 years will all accept 8bpc input, even TN panels. Adding a temporal dithering toggle for Windows would not help any current monitors. At this time we do not have plans to add this feature. I am sorry for the inconvenience this may cause.
     
  14. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,095
    Likes Received:
    2,601
    GPU:
    3080TI iChill Black
    Call of Duty: Black Ops 2
    Steep
    DiRT Rally
    Rift
    The Elder Scrolls V: Skyrim

    Just a quick example. Why wouldn't it help, @ManuelG?

    I tried SweetFX dither and it mitigated it a bit, still not ideal though.

    If the game uses shitty dithering, I'm sure driver-level dithering would improve it further. So why did you, or why do you, have this option on Linux and on Quadro GPUs?

    You are wrong :p

    That's all that needs to be corrected. I know at least 10 (newer) games that are plagued by this, some more, some less... but still, I agree with the OP: we need this feature!
     
    Last edited: Apr 21, 2018
    Vidik likes this.
  15. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,384
    Likes Received:
    387
    GPU:
    RTX 3090
    We can only request and hope that it is considered in the future and you guys change your minds. I also agree with the OP & others: adding dithering to the nVidia Control Panel would be a plus.

    It should not be enabled by default for any game, but we should have the option to enable it should we wish to do so.
     

  16. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
    Well, until they add it (if they ever do), you're going to keep seeing complaints from users who switch from AMD to Nvidia and run into lower image quality with their brand-new Nvidia GPU (if you don't know already, AMD does dither).
    Btw, the current driver and Nvidia's HDR passthrough show banding in 10-bit fullscreen windowed mode, while 10-bit FSE works fine with no banding; this is a regression.
    Not a biggie, since 8-bit with dither actually looks very good even for HDR.
    Flashbacks to the time when 6-bit with dither often looked better than native 8-bit. And of course all those old DirectX 9 and earlier games aren't just going to disappear.
     
    Last edited: Apr 22, 2018
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    16,998
    Likes Received:
    7,340
    GPU:
    GTX 1080ti
    Nobody is talking about the panel's capability; they are talking about the fact that the LUT conversion introduces banding because the Windows driver clamps to 8 bits, whereas in the hardware itself (and on Linux) 11-bit is available.
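    A quick way to see why the LUT precision matters, with a made-up calibration curve (the real curve the driver applies will differ):

    ```python
    # At 8-bit LUT precision a calibration curve merges neighbouring output codes
    # (banding); at 11-bit every input code stays distinct and can be dithered down.
    import numpy as np

    codes_in = np.arange(256)
    curve = (codes_in / 255.0) ** (2.2 / 2.4)    # made-up gamma-trim curve, for illustration only

    lut_8bit = np.round(curve * 255).astype(np.uint8)
    lut_11bit = np.round(curve * 2047).astype(np.uint16)

    print(len(np.unique(lut_8bit)))     # < 256: neighbouring inputs collapse onto the same output
    print(len(np.unique(lut_11bit)))    # 256: all inputs stay distinct
    ```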
     
    BlindBison likes this.
  18. Enterprise24

    Enterprise24 Active Member

    Messages:
    54
    Likes Received:
    13
    GPU:
    1080Ti FTW3 S2716DG
    Then please tell me why Nvidia X Server on Linux has an option to turn dithering on/off. Also, why does AMD support it by default if it doesn't help any monitors?
    I can use ReShade Deband for games and PotPlayer + madVR for video playback to "fix" color banding, but NOTHING will fix it for general desktop usage, like viewing a simple picture or browsing the web.

    Also, those solutions are not perfect. ReShade Deband costs FPS and must be added to each game individually (imagine reinstalling Windows with a big game library), and some games like Aven Colony can't use it at all. PotPlayer + madVR don't work with some sites, and I should be able to play video directly on YouTube like normal people rather than opening it in third-party programs.

    Is adding dithering to the Windows driver really so hard when the code is already in the Linux drivers? You could leave it disabled by default and give people the option to enable it when they need it. You would make a lot of people happy that way.
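    For reference, the Linux-side switch I'm talking about is even scriptable through nvidia-settings. A rough sketch; the display name is only an example and the attribute values (1 = enabled, 3 = temporal) should be double-checked against your own driver's nvidia-settings documentation:

    ```python
    # Toggle the Linux driver's dithering from a script (verify attribute values locally).
    import subprocess

    DPY = "[DPY:DP-0]"   # list your outputs with: nvidia-settings -q dpys

    def enable_temporal_dithering() -> None:
        subprocess.run(["nvidia-settings", "--assign", f"{DPY}/Dithering=1"], check=True)
        subprocess.run(["nvidia-settings", "--assign", f"{DPY}/DitheringMode=3"], check=True)

    enable_temporal_dithering()
    ```

    Nothing equivalent is exposed on Windows, which is the whole point of this request.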
     
    Last edited: Aug 2, 2018
  19. Smartcom5

    Smartcom5 Guest

    Messages:
    3
    Likes Received:
    0
    GPU:
    Sapphire HD 3850 512 MB
    To be fair, that really would be nothing out of the ordinary, as nVidia has long been known¹ for inferior overall image quality compared to ATi/AMD cards.
    Inferior IQ has practically been a tradition on nVidia GPUs for decades, so I'm actually a bit lost as to what would even change for the worse here in the first place.

    There are actually people who treat it as a game to guess the card's vendor from random screenshots just by looking at them. And you know what?
    They're mostly right, since it's rather eye-catching and the differences are pretty obvious if you compare pictures of the same game on an AMD card versus an nVidia card side by side.

    ¹ Mostly due to their texture compression techniques, which may have side effects every so often …
     
    Last edited: Aug 5, 2018
  20. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti

    Funny, you start out with 'to be fair', which is then followed by complete AMD fanboy nonsense. Go on, cite links to your one-on-one comparisons.
     
    Smough, toyo and Noisiv like this.
