Dithering option?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Whiplashwang, Mar 3, 2018.

  1. Whiplashwang

    Whiplashwang Ancient Guru

    Messages:
    2,195
    Likes Received:
    43
    GPU:
    GTX 1080 G1
    I'm sure this has been brought up before, maybe to the point of abuse, but why doesn't Nvidia add a dithering option to the Nvidia Control Panel to help hide color banding? They seem to offer it on Linux and on Quadro GPUs, but not on Windows for non-professional GPUs. I've also found a ton of threads online, from Reddit to the official GeForce forums, requesting such a feature, and Nvidia just continues to ignore them. Some requests were made more than three years ago.

    So, again, why won't they add this feature? o_O
     
    Blackfyre and -Tj- like this.
  3. Xul Aethyr

    Xul Aethyr Member

    Messages:
    42
    Likes Received:
    3
    GPU:
    1080TI MSI Gaming X 2ghz
    They could add a deband filter to Freestyle (it's in ReShade), but it would be nice to have a dither option in the control panel so it works with every game and with the desktop itself.
     
  4. Xtreme512

    Xtreme512 Master Guru

    Messages:
    638
    Likes Received:
    4
    GPU:
    GTX1060 6GB
    Games have their own dithering to clear banding. There's no need for a driver setting for that. Correct me if I'm wrong.
     

  5. VAlbomb

    VAlbomb Member Guru

    Messages:
    144
    Likes Received:
    4
    GPU:
    Nvidia G1 Gaming GTX 970
    Try running this little 10-bit-depth test; if it's working correctly, the Nvidia driver will automatically dither the image.
    You HAVE to do this on an 8-bit panel, otherwise you get real 10-bit output.
    https://forum.doom9.org/showthread.php?t=172128
    Why doesn't Nvidia want to enable this for GeForce GPUs? Who knows; the functionality is there, it just needs to be exposed to users.
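    For anyone who doesn't want to grab the doom9 file, a similar smooth-gradient test image is easy to generate yourself (a rough sketch; the filename and dimensions here are arbitrary, this is not the actual doom9 test):

```python
# Generate a smooth 16-bit grayscale ramp as a binary PGM file.
# Displayed through an 8-bit pipeline without dithering, the ramp
# shows visible bands; with dithering it should look smooth.
WIDTH, HEIGHT, MAXVAL = 1024, 256, 65535

with open("gradient-test.pgm", "wb") as f:
    f.write(f"P5 {WIDTH} {HEIGHT} {MAXVAL}\n".encode("ascii"))
    # Each pixel is a big-endian 16-bit sample, as the PGM spec requires.
    row = b"".join(
        (x * MAXVAL // (WIDTH - 1)).to_bytes(2, "big") for x in range(WIDTH)
    )
    f.write(row * HEIGHT)
```

    Most players that handle 16-bit PGM (e.g. via FFmpeg) can display it the same way the doom9 test is used.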
     
  6. Xul Aethyr

    Xul Aethyr Member

    Messages:
    42
    Likes Received:
    3
    GPU:
    1080TI MSI Gaming X 2ghz
    The Nvidia driver does dither 10-bit sources, but only in DX11 fullscreen. If they had temporal dithering in the Windows driver like they do on Linux, it would work with every source (AMD also has it, AFAIK).

     
  7. insp1re2600

    insp1re2600 Master Guru

    Messages:
    235
    Likes Received:
    44
    GPU:
    EVGA FTW3 1080Ti
    Looks like Michael J. Fox filmed that video.
     
  8. Sajittarius

    Sajittarius Master Guru

    Messages:
    303
    Likes Received:
    5
    GPU:
    ROG Strix 1080ti
    That is cool! I've been using MPC-HC for years and never realized you could open a PNG directly in it, lol.
     
  9. tayyar

    tayyar New Member

    Messages:
    7
    Likes Received:
    10
    GPU:
    2
    I would love this.

    I'm gaming (and watching movies) on a TV at night, and dark scenes kill me.

    ReShade's dithering partially alleviates it, but it introduces fuzziness.
     
  10. RealNC

    RealNC Ancient Guru

    Messages:
    1,731
    Likes Received:
    257
    GPU:
    EVGA GTX 980 Ti FTW
    Some games do dither. Most don't. It appears that the vast majority of game devs today are completely unaware that they need to do this.
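    For reference, the fix on the game side is tiny: typically an ordered (Bayer) dither applied just before the shader output is quantized to 8-bit. A minimal CPU-side sketch of the idea (illustrative only, not any particular engine's code; in a shader it's a one-liner per pixel):

```python
# Classic 4x4 Bayer threshold matrix: adds a per-pixel offset before
# quantization, so flat gradients break up into a fine pattern
# instead of visible bands.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def quantize_dithered(value, x, y, levels=256):
    """Quantize value in [0, 1] to `levels` steps with ordered dither."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0  # in (0, 1)
    q = int(value * (levels - 1) + threshold)
    return min(q, levels - 1)
```

    For a mid-gray of 0.5 (127.5 in 8-bit terms), half the pixels in each 4x4 tile land on 127 and half on 128, which is exactly the in-between level the eye averages out.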
     

  11. Xtreme512

    Xtreme512 Master Guru

    Messages:
    638
    Likes Received:
    4
    GPU:
    GTX1060 6GB
    Probably. The thing is, it becomes very clear when a game supports HDR: you keep playing on a normal monitor with HDR turned off, but there is dramatic banding (for example, RE7).
     
  12. janos666

    janos666 Master Guru

    Messages:
    527
    Likes Received:
    7
    GPU:
    MSI GTX1070 SH EK X 8Gb
    Dithering basically emulates higher precision through noise, so it only makes sense when your input has a higher bit depth than your output. But 99.9% of games (let's exclude those few HDR10/DV titles for now) pass 8-bit, so there is nothing to dither (it makes no sense to dither from 8-bit to 8-bit), and you can often output 10- or even 12-bit to the display anyway (which should have decent dithering down to its native bit depth, which might be 10+ bit as well).
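    That trade of precision for noise can be shown with a toy example (hypothetical names; random noise here stands in for the driver's temporal dither): a 10-bit level that falls between two 8-bit codes is lost by truncation, but survives as an average under dithering.

```python
import random

def truncate_10_to_8(v10):
    return v10 >> 2  # drop the two low bits: the classic banding source

def dither_10_to_8(v10, rng=random):
    # Add noise spanning one 8-bit LSB (four 10-bit steps) before
    # rounding down, so the average output preserves the lost fraction.
    return min(255, int((v10 + rng.uniform(0, 4)) / 4))

random.seed(0)
v = 513  # a 10-bit level sitting at 128.25 in 8-bit terms
avg = sum(dither_10_to_8(v) for _ in range(100_000)) / 100_000
# truncation always gives 128; the dithered average sits near 128.25
```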

    It sounds more like you want some "debanding" post-processing and then dithering (which is, IMO, a very bad idea for games in general).

    Back when they started developing engines for DX11, I heard rumors that presenters could just pass the high-bit buffers to the display engine and have them dithered by the hardware to the precision of the link (8/10/12-bit, depending on display capabilities and output settings), eliminating the need for truncating/rounding/dithering in software. That sounded pretty cool, but I don't think anybody does it these days (not even HDR10/DV games).

    Now with HDR10 and Dolby Vision (though I don't fully understand how DV works in practice on PC [with all those wonky tricks like layering and "tunneling", plus all the redundant-looking proprietary mumbo-jumbo to obfuscate it], especially with games, but whatever...), they obviously have to pass 10+ bit. But I think they still limit HDR10 output to 10-bit (and truncate/round/dither in software), taking the HDR10 moniker literally (HDR10 = 10-bit and that is that). That's a very boxed-in, minimalistic way of looking at it, because even FHD SDR could use, and would have benefited from, 10+ bit a decade ago with DeepColor displays, and this closed-mindedness carries on into the UHD HDR era. I hoped these HDR titles would finally "open the tap": pass 10+ bit (probably 16-bit in practice) to the hardware and let it dither (rather than truncate/round) according to the output settings, thus making 12-bit output possible even for SDR, let alone HDR. But it seemingly took a rather different direction (it's as if they deliberately cut off the possibility of 10+ bit SDR output, as a way of forcing you to buy new displays if you want it).
     
    Last edited: Apr 5, 2018
    Noisiv likes this.
  13. ManuelG

    ManuelG NVIDIA Rep

    Messages:
    575
    Likes Received:
    26
    GPU:
    Geforce GTX TitanX Pascal
    Thank you for your feedback. Monitors from the past 5-7 years will all accept 8bpc input, even TN panels. Adding a temporal dithering toggle on Windows would not help any current monitors. At this time we do not have plans to add this feature. I am sorry for any inconvenience this may cause.
     
  14. -Tj-

    -Tj- Ancient Guru

    Messages:
    14,784
    Likes Received:
    401
    GPU:
    Zotac GTX980Ti OC
    Call of duty blackops2
    Steep
    Dirt rally
    Rift
    Elder Scrolls 5

    Just a quick example. Why wouldn't it help, @ManuelG?

    I tried SweetFX dither and it helped a bit, but it's still not ideal.

    If a game uses a poor dither, I'm sure driver-level dithering would improve it further. And then why did you, or do you, have this option on Linux and on Quadro GPUs?

    You are wrong :p

    That's all that needs to be corrected. I know at least 10 (newer) games that are plagued by this, some more, some less... but still, I agree with the OP: we need this feature!
     
    Last edited: Apr 21, 2018
  15. Blackfyre

    Blackfyre Master Guru

    Messages:
    967
    Likes Received:
    18
    GPU:
    MSI GTX 1070 Gaming X
    We can only request and hope that it's considered in the future and you guys change your minds. I also agree with the OP and others: adding dithering to the Nvidia Control Panel would be a +

    It should not be enabled by default for any game, but we should have the option to enable it should we wish to do so.
     

  16. VAlbomb

    VAlbomb Member Guru

    Messages:
    144
    Likes Received:
    4
    GPU:
    Nvidia G1 Gaming GTX 970
    Well, until they add it (if they ever do), you're going to keep seeing complaints from users switching from AMD to Nvidia and encountering lower image quality with their brand-new Nvidia GPU (if you don't know already, AMD does dither).
    Btw, the current driver and Nvidia's HDR passthrough have banding in fullscreen-windowed 10-bit mode, while FSE 10-bit works fine with no banding; this is a regression.
    Not a biggie, since 8-bit with dither actually looks very good even for HDR.
    Flashback to the time when 6-bit with dither often looked better than native 8-bit; and of course all those old DirectX 9 and earlier games aren't just going to disappear.
     
    Last edited: Apr 22, 2018
  17. Astyanax

    Astyanax Member Guru

    Messages:
    142
    Likes Received:
    26
    GPU:
    GTX 1080ti
    Nobody is talking about the panel's capability. They are talking about the fact that the LUT conversion introduces banding because the Windows driver clamps it to 8 bits, whereas in the hardware itself (and on Linux) 11-bit is available.
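    The mechanism is easy to reproduce in a few lines (an illustrative sketch, not the actual driver LUT): push an 8-bit ramp through a mild calibration curve and store the result back at 8 bits, and distinct input codes start collapsing onto the same output code. Those collapses are the bands; with wider (or dithered) LUT output, every input level stays distinguishable.

```python
def calibration_lut_8bit(gamma=2.2 / 2.4):
    # A mild gamma-correction curve quantized to 8-bit output, as when
    # the driver clamps its LUT to 8 bits instead of using the wider
    # precision the hardware supports.
    return [round(255 * (i / 255) ** gamma) for i in range(256)]

lut = calibration_lut_8bit()
distinct = len(set(lut))  # fewer than 256: some codes merged -> banding
```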
     
