Dithering option?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Whiplashwang, Mar 3, 2018.

  1. kosta20071

    kosta20071 Member Guru

    Messages:
    126
    Likes Received:
    108
    GPU:
    Gigabyte RTX 3080
    I was on the modified Quadro driver. I used:

    ditherState - Enabled; ditherBits - 10 bit; ditherMode - SpatialDynamic2x2
    "DitherRegistryKey"=hex:db,01,00,00,10,00,00,00,01,01,02,02,f2,00,00,00

    I take back what I wrote earlier: when I did a reset with CRU, Nvidia switched from YCbCr 4:2:2 10-bit to YCbCr 4:2:0 8-bit (I didn't notice I was on 4:2:0), and it did look great, totally smooth. But when I switched back to the other modes, the pattern looked the same as without the registry tweak; somehow 4:2:0 8-bit smooths out the banding...
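    The SpatialDynamic2x2 mode named above suggests a 2x2 spatial dithering pattern. As a rough illustration only (a toy sketch of classic 2x2 ordered dithering, not Nvidia's actual algorithm or the meaning of the registry bytes), here is how such a pattern smooths a 10-bit-to-8-bit truncation:

```python
# Sketch: 2x2 ordered (Bayer) dithering when truncating 10-bit values to 8-bit.
# Illustrative only -- not the driver's implementation.

BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 Bayer matrix, thresholds 0..3

def dither_10_to_8(value10, x, y):
    """Reduce a 10-bit value (0..1023) to 8 bits using a 2x2 spatial pattern."""
    threshold = BAYER_2X2[y % 2][x % 2]  # 0..3, tiled over the screen
    # The two discarded low bits span 0..3; adding a position-dependent
    # threshold before the shift makes the 2x2 block average track the input.
    return min((value10 + threshold) >> 2, 255)

# A 10-bit level that plain truncation would flatten to a single 8-bit code:
row = [dither_10_to_8(514, x, 0) for x in range(4)]  # -> [128, 129, 128, 129]
```

Plain truncation would output 128 everywhere; the dithered 2x2 block averages 128.5, which is exactly 514/4, so the eye perceives an intermediate shade instead of a banded step.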

    I will remove my post
     
  2. Sajittarius

    Sajittarius Master Guru

    Messages:
    490
    Likes Received:
    76
    GPU:
    Gigabyte RTX 4090
    Well, I did notice in that GeForce forum post:

    'with "Full" dynamic range dithering is disabled, but with "Limited" range dithering is enabled with "Temporal" mode.'

    That would explain why 4:2:0 smooths banding, since YCbCr is always limited range (so it would use dithering, according to that post). I might still try this with full-range RGB myself.
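    A rough illustration of why limited-range output invites dithering (a toy sketch, not the driver's actual conversion, and it ignores chroma entirely): full-range 8-bit has 256 codes, limited ("video") range only 220, so a full-range ramp cannot map one-to-one and some adjacent codes collapse, which is exactly the quantization error dithering is meant to hide:

```python
# Sketch: full-range 8-bit (0..255) vs limited/video range (16..235).
# Illustrative only -- real drivers do this conversion with more care.

def full_to_limited(code):
    """Map a full-range 8-bit code (0..255) to limited range (16..235)."""
    return 16 + round(code * 219 / 255)

# 256 distinct inputs land on only 220 distinct outputs, so 36 pairs of
# neighboring full-range codes merge into the same limited-range code.
levels = {full_to_limited(c) for c in range(256)}
```

With only 220 steps between black and white, each step is slightly coarser than in full range, and a smooth gradient shows visible bands unless the output is dithered.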
     
  3. SpookySkeleton

    SpookySkeleton Member Guru

    Messages:
    153
    Likes Received:
    23
    GPU:
    RTX 3090
    It worked for me once, but after a restart it went back to default and I wasn't able to get it to work again.
     
  4. Xul Aethyr

    Xul Aethyr Active Member

    Messages:
    58
    Likes Received:
    7
    GPU:
    1080TI MSI Gaming X 2ghz
    No difference for me; tested with the Lagom gradient.
     

  5. Enterprise24

    Enterprise24 Active Member

    Messages:
    54
    Likes Received:
    13
    GPU:
    1080Ti FTW3 S2716DG
    Does it affect performance? Absolutely not.

    Test system.
    8 bit temporal dithering was used in this test.
    i7-8700K @ 5Ghz core & 4.8Ghz uncore
    ASRock Z370 Taichi P4.00
    2x8GB DDR4-3500 16-18-18-36-2T
    EVGA GTX 1080 Ti @ 2126 core / 12474 mem
    Corsair HX 750W
    NZXT H440 White
    Custom Water Cooling
    Windows 10 64 bit 1607
    Nvidia 430.64


     
  6. Enterprise24

    Enterprise24 Active Member

    Messages:
    54
    Likes Received:
    13
    GPU:
    1080Ti FTW3 S2716DG
    Full guide.

     
    signex likes this.
  7. Enterprise24

    Enterprise24 Active Member

    Messages:
    54
    Likes Received:
    13
    GPU:
    1080Ti FTW3 S2716DG
    Many people ask me about Windows 10 1903 on Reddit / YouTube / GeForce Forums.
    I will repeat it again: please avoid dithering if you use Windows 10 1809-1903. It does more harm than good.
    1703-1803 is OK if you are willing to deal with some issues.
    For a flawless experience, Windows 7 / 8 / 8.1 / 10 up to 1607 is best. But this is NOT possible for RTX.
    I feel sorry for Turing users. If you need dithering desperately, then maybe consider sacrificing DX12 + RTX by installing Windows 7 instead.
     
  8. Memorian

    Memorian Ancient Guru

    Messages:
    4,017
    Likes Received:
    883
    GPU:
    RTX 4090
    Or buy a native 10-bit panel...
     
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,351
    GPU:
    GTX 1080ti
    It will still show banding on content that is less than 10-bit.
     
  10. Memorian

    Memorian Ancient Guru

    Messages:
    4,017
    Likes Received:
    883
    GPU:
    RTX 4090
    And compressed YouTube videos will show banding no matter what. What's your point? Users want a dithering option because they use 8-bit TN panels.
     

  11. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    A lot of people use 8-bit TN panels. Don't be a snob.
    IPS would also benefit. Besides, if the argument is that dithering only helps cheap displays, why is this option being locked out on GPUs where cheap monitors are more likely to be used?
     
    Last edited: Sep 13, 2019
    yasamoka, Enterprise24 and CrazyGenio like this.
  12. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    Dithering can be distracting, or in some cases cause loss of low-frequency detail (even BDs and UHD BDs can still have it, ugh), but yeah, the option should be there.
    My main monitor has some kind of temporal dithering built in, and it can look weird on shades of blue and orange.

    I'd still rather have it and the option.
     
  13. Enterprise24

    Enterprise24 Active Member

    Messages:
    54
    Likes Received:
    13
    GPU:
    1080Ti FTW3 S2716DG
    And it will still suffer from banding AFTER calibration.

    Just bought an RX 580 today. It doesn't need fancy registry hacks. Dithering comes automatically AFTER applying an ICC profile.
     
  14. Enterprise24

    Enterprise24 Active Member

    Messages:
    54
    Likes Received:
    13
    GPU:
    1080Ti FTW3 S2716DG
    I know some people suffer migraines from temporal dithering. All we want is the option. Disabled by default is extremely welcome; there's no need to force it on like AMD does.
     
  15. Enterprise24

    Enterprise24 Active Member

    Messages:
    54
    Likes Received:
    13
    GPU:
    1080Ti FTW3 S2716DG
    Is it possible for @Hilbert Hagedoorn to do a gradient test on AMD vs Nvidia after calibration? I am a no-name YouTuber and probably no one cares. Nvidia may listen to you, however.
     

  16. Memorian

    Memorian Ancient Guru

    Messages:
    4,017
    Likes Received:
    883
    GPU:
    RTX 4090
    Captured from an AMD Vega 56. If the texture is crap, nothing can save you from color banding. It looks exactly the same on my U32H850 (native 10-bit panel).

     
  17. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
  18. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
    Most people sadly have 6-bit trash or 6-bit+2FRC monitors to get simulated 8-bit. Those are also mostly TN. Cheap monitors like those deserve to have banding. At least there are true 8-bit IPS/VA panels that are really cheap.

    Today, in 2018+, everyone should try to have at least a pure 8-bit panel, preferably 8-bit+2FRC to simulate 10-bit.

    But only with true 10-bit/12-bit do you get visual paradise. Only with those can you get Rec. 2020 playback and support. There is no place for color banding and inaccuracies.
     
  19. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    Another snob.
     
  20. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    The vast majority of displays accept an 8-bit input. The vast majority of 10-bit displays use 8-bit + FRC panels. People aren't going to buy professional 10-bit native-panel displays at obscene prices to play games or to do work that isn't color-critical and doesn't require a color-aware workflow. Almost none of these monitors operate at higher than 60Hz, and their response times are usually inadequate for motion-based tasks.

    Temporal dithering helps remove the banding seen when a color profile is applied and a 1D LUT is loaded (grayscale calibration). Even on an 8-bit display with a color profile, you can see AMD's dithering eliminate banding on the gradient at lagom.nl/lcd-test/gradient.php, while Nvidia GeForce graphics cards on Windows do not dither.
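    A toy sketch of that idea (not AMD's or Nvidia's actual implementation): after a 1D LUT adjustment, grayscale calibration may call for a non-integer 8-bit level, and temporal dithering alternates the two nearest codes across frames so their average matches the target:

```python
# Sketch: temporal dithering of a non-integer target level across frames.
# Illustrative only -- real implementations use noise or error diffusion,
# not a fixed threshold table.

def temporal_dither(target, frame):
    """Emit one of the two 8-bit codes nearest to target, per frame."""
    base = int(target)
    frac = target - base
    # 4-frame threshold pattern: with frac = 0.25, exactly one frame in
    # four bumps the code up, so the temporal mean equals the target.
    thresholds = (0.125, 0.625, 0.375, 0.875)
    return base + (1 if frac > thresholds[frame % 4] else 0)

# A 1D LUT asking for level 100.25 on an 8-bit link:
frames = [temporal_dither(100.25, f) for f in range(4)]  # -> [101, 100, 100, 100]
```

Without dithering the driver just rounds 100.25 to 100 every frame, so the correction is lost and neighboring gradient steps collapse into a visible band; with it, the eye averages the flicker into the intended intermediate level.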

    As someone else said here, don't be a snob. Don't make impossible demands that cannot, and should not, be met by most people. I'd rather save bandwidth and get a higher refresh rate at 8-bit than drive a 10-bit display showing 8-bit content just to reduce banding.
     
