
Dithering option?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Whiplashwang, Mar 3, 2018.

  1. Duamutef_MC

    Duamutef_MC Member

    Messages:
    13
    Likes Received:
    7
    GPU:
    GeForce GTX 1080
    Lies.

    Try this, so you can see for yourself.

    Go to lagom.nl and open the page showing the black levels.

    Now try altering the gamma and contrast/brightness from the Nvidia Control Panel.

    Why do that? Users owning the S2716DG or the Omen 27 (both using the same AUO panel) can achieve almost 1:1 fidelity and 2.2 gamma with these settings:

    - Brightness 40-45%
    - Contrast 55-60%
    - Gamma +0.80 to +0.85

    Once you've done that, take a look at the black levels. Most of them will be wiped out by black crush. Look at the darkest one you can still see: it will be your card's RGB(1,1,1) instead of a dithered (0.5, 0.5, 0.5).

    Moreover, the color sequence visible on the same webpage goes from 1,2,3,4,... to something like 0,0,0,1,1,2,2,2,3,3,4,4,4,... - clearly rounded to integers and not dithered. This is my issue as of now. I am aware Nvidia dithers fullscreen DX11 on the GTX 1080, but dithering in the main Windows environment is available only on Quadro cards, I imagine.
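
    For instance, a quick sketch of that integer rounding (an illustrative gamma value, not NVCP's actual slider math) reproduces the duplicated codes:

    Code:
    # How an integer 8-bit gamma LUT collapses adjacent dark codes.
    # gamma = 0.85 is illustrative, not NVCP's actual slider mapping.
    gamma = 0.85
    lut = [round(255 * (i / 255) ** (1 / gamma)) for i in range(256)]
    print(lut[:13])  # [0, 0, 1, 1, 2, 2, 3, ...] - repeated integers, no dithering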

    Well - try this and see what I mean. It would be funny if, after all this bashing, we found out we had agreed on the same thing from the very beginning. :)
     
  2. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
    FUT honey. I started to crudely profile those sliders for you (using a display with a true 10 bit panel and >10 bit internal processing):

    1: they definitely alter the calibration LUT (clearing the LUT or loading a custom LUT with an independent LUT utility eradicates the slider's effects until touched again)
    2/a: low Brightness or high Contrast or low Gamma values cause black crush (dark grays turn into black)
    2/b: high Brightness values cause white crush (bright grays turn into white)
    2/c: low Contrast or high Gamma values result in elevated black levels (black and dark grays turn into bright grays)
    3: the ill effects accumulate when using more than one slider (thus relatively minor adjustments on all of them are similar to a relatively heavy adjustment on one)
    4: a similar amount of banding can be observed with both 8 bit and 12 bit HDMI output formats (Full RGB)

    Now, about clipping/crush: if the LUT clips whites or crushes blacks internally, no dithering or extra precision will help with that. If the outputs are like 0,1,2,3,4,5 -> 0,0,0,0,0,1 then you have to see black crush. There is nothing to dither between 0 and 0; it's 0. And it looks like the sliders are either designed to do just that (deliberately cause clipping/crush) or the NVCP calculations result in low-precision LUT values (in which case the output precision or dithering doesn't really matter). If the LUT has banding then you will see banding on a theoretical native 16-bit display as well.

    And about mid-scale banding: bumping up the precision of the output format or applying dither won't increase the original internal precision of the LUT. And similarly, temporal dithering doesn't yield virtually infinite effective precision. If you start from 8 bits and cut half the values (clipping on both ends + values left out of the table on the output side), you won't have effective 16-bit precision after temporal dithering (since the response is non-linear and the frequency is low, you can't use a wide range of values).
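
    To picture both points (made-up numbers, not the actual NVCP table): dithering can recover fractional LUT precision, but it can do nothing for codes the LUT has already crushed to the same integer:

    Code:
    # Dithering preserves fractional LUT values, but not already-crushed ones.
    # Both tables below are made up for illustration.
    import random

    fractional_lut = [0.0, 0.4, 0.9, 1.3, 1.8, 2.2]  # high-precision LUT output
    crushed_lut    = [0,   0,   0,   0,   0,   1  ]  # low-precision / crushed LUT

    def temporal_dither(x, frames=10000):
        # round x up or down at random so the long-run average equals x
        lo = int(x)
        return sum(lo + (random.random() < x - lo) for _ in range(frames)) / frames

    for i in range(6):
        print(i,
              round(temporal_dither(fractional_lut[i]), 2),  # distinct steps survive
              temporal_dither(crushed_lut[i]))               # stays 0, 0, 0, 0, 0, 1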

    But yet again, remember that we don't know the exact original intention of this black box. We can't look at the source code and there is no detailed documentation explaining all the details (what the exact target is and how it's achieved).

    And remember: a real calibration LUT has the benefit of iterative construction: the software can try various device values and rule out the useless ones by measurement. Slider adjustments can't tell if some values are useless (e.g. if your display has some clipping/crushing/banding of its own even before you touch the sliders or start the measurements). That's why a good calibration LUT will produce fewer ill side effects than these kinds of slider adjustments. A real calibration can try to fix some preexisting errors (to some degree), but these sliders can only add more.
     
    Last edited: Aug 6, 2018
  3. Enterprise24

    Enterprise24 Member

    Messages:
    18
    Likes Received:
    2
    GPU:
    1080Ti FTW3 S2716DG
    25 years ago, the alien nGreedia invaded our world. AMD stopped them.
    I stopped them.

    But the nGreedia corporation seized the opportunity, stepping into the power vacuum, quickly growing in size and influence to the brink of world domination.

    nGreedia took us all by surprise…even me…

    nGreedia showed me a glimpse of the future...

    nGreedia knows what drives me.

    What I believe.

    That having a high-end gaming PC comes down to one thing…
    ...to one single question: What are you prepared to sacrifice?

    When they came to me with the GTX 980 Ti and G-Sync monitor, I sacrificed Enterprise24, the AMD fanboy I was, to become Prophet.

    Victory costs.

    Every time, you pay a little more.

    I saw a glimpse of what’s coming…
    …and there was nothing left of me to stop it.

    When the so-called greatest GPU company can't fix color banding on their own GPU...

    what do we do then? What do I do?!
     
  4. RealNC

    RealNC Ancient Guru

    Messages:
    2,599
    Likes Received:
    849
    GPU:
    EVGA GTX 980 Ti FTW
    I thought it was "NVidious."
     

  5. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
    nVicious, nVinity ... it's all the same false dream, projected into our rasterizer-addled minds by those damn synthetic elves who poisoned our honest programmers, desecrated our holy designers, abused our company finances and then left all that fairy dust in the drawer. I ain't have no nothing to do about all that. The colors are all lies! The pixels are mere illusions!!! Can't you SEE IT??? Open your mind and look past those ghostly screens. There is nothing there!
     
  6. Enterprise24

    Enterprise24 Member

    Messages:
    18
    Likes Received:
    2
    GPU:
    1080Ti FTW3 S2716DG
    I just tried Nvidia X Server on Linux [first-time Linux user because of Nvidia!!!]. In Windows, when you change gamma via software such as an ICC profile or the gamma setting in the Nvidia drivers, you get color banding.

    I have my usual ICC profile, which by itself can reduce banding by 80 percent or so with full dynamic range, but in exchange I got slight black crush [costing 3 squares in the Lagom.nl black level test]. I then eliminated banding entirely, without black crush, using the limited dynamic range + vibrance trick I referred to before. The downside is that I must stay on 391.35 forever, since it no longer works with newer drivers. There is one other very small downside: black looks slightly grey-ish [think black on IPS], but this is worth much more than banding + black crush.

    Now, with Linux, I just open Nvidia X Server Settings, set dithering to enabled, set the method to temporal, select my ICC profile, leave dynamic range at full, then open my favourite picture "powered by dawn engine" and BOOM. No color banding, no black crush, no slightly grey blacks.

    Linux is still hard for me as a first-time user. I just hope this option will show up in the GeForce drivers....

    PS. I also tried the static 2x2 and dynamic 2x2 methods. All work great and I can't tell the difference between static / dynamic / temporal. I also checked black crush and gradients on Lagom.nl and found no problems.
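
    For reference, the same settings can be applied from a script through the nvidia-settings command line. A minimal sketch, assuming the attribute names and values reported by "nvidia-settings -q all" (the "DPY-1" display specifier is a placeholder for your own):

    Code:
    # Sketch: enable temporal dithering via the nvidia-settings CLI.
    # Assumptions: attribute names/values as reported by "nvidia-settings -q all";
    # "DPY-1" is a placeholder for the actual display specifier.
    import subprocess

    assignments = {
        "Dithering": 1,      # 0 = auto, 1 = enabled, 2 = disabled
        "DitheringMode": 3,  # 0 = auto, 1 = dynamic 2x2, 2 = static 2x2, 3 = temporal
    }
    for attr, value in assignments.items():
        subprocess.run(
            ["nvidia-settings", "--assign", f"[dpy:DPY-1]/{attr}={value}"],
            check=True,  # raise if nvidia-settings reports an error
        )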
     
    BahamutxD likes this.
  7. SpookySkeleton

    SpookySkeleton Member Guru

    Messages:
    116
    Likes Received:
    16
    GPU:
    GTX 1080
    Can they just not fix it, or do they not want to?
     
  8. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    15,625
    Likes Received:
    243
    GPU:
    EVGA GTX 1080 Ti SC2
    Quick question: Do AMD offer a colour dithering option in their Windows driver?
     
  9. Astyanax

    Astyanax Master Guru

    Messages:
    727
    Likes Received:
    151
    GPU:
    GTX 1080ti
    No, it's always on, and hence a source of migraines and horrible for artists who want their stuff to look right.
     
  10. kosta20071

    kosta20071 Member

    Messages:
    44
    Likes Received:
    16
    GPU:
    Quadro 4000m
    Too bad there is no such option. But if you set RGB/YCbCr 4:4:4 8-bit and turn on Windows HDR, you can see in the Windows display settings that it says "8-bit with dithering". I guess Windows adds the dithering, but it only shows "... with dithering" when using 8-bit, as described above.

    [Screenshot: Windows display settings showing "8-bit with dithering"]
     

  11. Cyberdyne

    Cyberdyne Ancient Guru

    Messages:
    2,965
    Likes Received:
    21
    GPU:
    RTX 2070 XC Ultra
    HDR uses 10-bit color, so on an 8-bit panel it makes sense that it would use dithering to fake 10-bit. Similar to 6-bit panels using dithering to fake 8-bit.
    That might be a nice side benefit of using HDR, but that's not what's being talked about here. In a game without HDR support, banding will still be present on your monitor.
    Whether that also helps you at the Windows desktop, I'm not sure. I doubt it, though; from what I've heard, HDR functionality when simply at the desktop is a nightmare, so I'm guessing you'll have other, more pressing issues to contend with beyond dithering at that point.
    It's sort of the difference between a hardware and a software implementation: what you are talking about is hardware-side (to add HDR support), and what is being asked for here is global and software-side.
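
    A toy illustration of that trick (illustrative arithmetic, not the driver's or panel's actual algorithm): an 8-bit panel can fake a 10-bit level by spreading the remainder over a 2x2 block of pixels, so the block averages out to the target value:

    Code:
    # Toy 2x2 spatial dithering: approximate a 10-bit level on an 8-bit panel.
    # Illustrative only; the real driver/panel logic is not public.
    def spatial_2x2(level_10bit):
        base, extra = divmod(level_10bit, 4)  # 10-bit = 8-bit code + 2 extra bits
        return [base + 1] * extra + [base] * (4 - extra)  # 2x2 block, row-major

    print(spatial_2x2(513))  # -> [129, 128, 128, 128]; block average 128.25 = 513/4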
     
  12. Enterprise24

    Enterprise24 Member

    Messages:
    18
    Likes Received:
    2
    GPU:
    1080Ti FTW3 S2716DG
    This is interesting... Instead of asking for dithering, maybe we should ask for an 11-bit LUT in the GeForce drivers.
    https://www.reddit.com/r/nvidia/comments/a234yq/any_news_about_adding_dithering_support_to_the/


    Then why did the dithering option in Linux improve things dramatically?

    Nvidia's Linux drivers allow support for an 11-bit LUT, which is then dithered down to the 8/10-bit display. Their Windows drivers clamp to 8-bit, so there's no dithering. You get an image that is perfectly matched to the display, for better or for worse.

    And why is it that when you connect an AMD card to the same monitor you rarely see any banding?

    AMD's consumer drivers allow for higher bit-depth support, and will dither down, similar to Nvidia's Linux drivers. Nvidia's 10-bit support is atrocious, and it's a crap shoot whether it will even detect and let you use this option. My wife's Apple Cinema Display properly shows up as 8-bit on an RX 480, and it is truly an 8-bit-only panel, but it shows up as 10-bit on a GTX 1060. I've had actual 10-bit panels show up as 10-bit (or even 12-bit due to their LUTs) on AMD, but 8-bit on Nvidia.
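
    A rough sketch of the difference described above (toy math, not the actual driver pipeline): a LUT clamped to 8 bits merges adjacent dark codes into bands, while the same curve kept at high precision and dithered on output preserves distinct average levels:

    Code:
    # Contrast an 8-bit-clamped LUT with a high-precision LUT plus output dithering.
    # Toy math only; the gamma value and bit depths are illustrative.
    import random

    gamma = 0.85
    clamped = [round(255 * (i / 255) ** (1 / gamma)) for i in range(32)]  # 8-bit LUT
    precise = [255 * (i / 255) ** (1 / gamma) for i in range(32)]         # high-precision LUT

    def dither(x):
        # stochastic rounding: averages back to x over many pixels/frames
        return int(x) + (random.random() < x - int(x))

    def avg(x, n=4000):
        return round(sum(dither(x) for _ in range(n)) / n, 1)

    print("distinct 8-bit LUT codes:   ", len(set(clamped)))            # banded
    print("distinct dithered averages: ", len({avg(x) for x in precise}))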
     
  13. Chastity

    Chastity Maha Guru

    Messages:
    1,326
    Likes Received:
    259
    GPU:
    Nitro 390/GTX1070M
    Just to point out: a couple of members in this thread, and another, noted that AMD always dithers, and IIRC someone said that this was a travesty for any PC artists endeavoring to get proper color matching. The AMD drivers do support a dithering toggle in the registry, to enable or disable dithering:

    Code:
    Windows Registry Editor Version 5.00
    
    [HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
    "DP_DisableDither"=dword:00000001
    Just an FYI to clarify the options. I use this to disable dithering on my 10-bit display.
     
  14. kosta20071

    kosta20071 Member

    Messages:
    44
    Likes Received:
    16
    GPU:
    Quadro 4000m
    In the Nvidia drivers (inf files) there are some registry keys concerning dithering (usually they are not created when installing):
    DitherAlgo6
    DitherAlgo8
    DitherToBpcAlgo
    NewDefaultDitherPattern

    Maybe worth checking different combinations. Usually DitherAlgo8/6 appears with the value "3" (DWORD32) in certain inf files.

    Computer\HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Video\{2D2A7876-F8BF-11E8-A4F6-848B33DFC0AC}\0000
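
    If anyone wants to experiment, here is a sketch of writing those values from Python. Assumptions: the {GUID} subkey is machine-specific (look up your own under ...\Control\Video), the value names come from the driver inf files and their exact semantics are undocumented, and the process must be elevated:

    Code:
    # Sketch: write the dithering DWORDs from the post with Python's winreg.
    # The {GUID} is the one from the post and is machine-specific; find your own
    # subkey under ...\Control\Video. Run from an elevated (admin) process.
    import winreg

    KEY = (r"SYSTEM\ControlSet001\Control\Video"
           r"\{2D2A7876-F8BF-11E8-A4F6-848B33DFC0AC}\0000")

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY, 0, winreg.KEY_SET_VALUE) as k:
        for name in ("DitherAlgo6", "DitherAlgo8"):
            winreg.SetValueEx(k, name, 0, winreg.REG_DWORD, 3)  # "3" as in the inf files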
     
  15. Guzz

    Guzz Member Guru

    Messages:
    125
    Likes Received:
    11
    GPU:
    GTX 970

  16. kosta20071

    kosta20071 Member

    Messages:
    44
    Likes Received:
    16
    GPU:
    Quadro 4000m
    WOW, I think it works! The problem is that you cannot create this registry value because of permissions, or because the path is in use by the system. So I had to:
    1. Open c:\windows\regedit.exe with the tool Give Me Power (GMP):
    https://www.wagnardsoft.com/forums/viewtopic.php?f=6&t=6

    2. Add the binary registry value "DitherRegistryKey" and write the data manually.

    I added the 10-bit dithering and checked whether dithering was working with this 10-bit gradient test pattern:
    https://www.avsforum.com/forum/139-...it-gradient-test-patterns-2.html#post40415330

    I switched to YCbCr 10-bit 4:2:2
    AND IT WAS SMOOTH AS NEVER BEFORE!!! It actually seems to work!
     
    Sajittarius likes this.
  17. n3v3rm1nd

    n3v3rm1nd Member

    Messages:
    49
    Likes Received:
    3
    GPU:
    Aus Strix RTX 2070
    I could add it without any issues, weird. But it really works. I used it on my IPS panel, where I thought it would be useless, but I have less banding. I'm using the temporal setting. Anyone know what each dithering setting does and what the differences are?
     
  18. M3TALG3AR3X

    M3TALG3AR3X Member

    Messages:
    38
    Likes Received:
    0
    GPU:
    Gigabyte GTX 980Ti Xtreme
    Could you provide a screenshot of your regedit with the binary value open? I can't make it work; maybe I'm doing something wrong.
    I have tried 8- and 10-bit and nothing.

    Thank you
     
  19. Vidik

    Vidik Master Guru

    Messages:
    441
    Likes Received:
    60
    GPU:
    MSI 1070 Gaming Z
    Better try it now before Nvidia "fixes" it :rolleyes:
     
  20. Whiplashwang

    Whiplashwang Ancient Guru

    Messages:
    2,221
    Likes Received:
    79
    GPU:
    MSI GTX 1080Ti
    I tried it. Unfortunately, I didn't see any difference. I tried multiple settings and reset my driver each time with CRU. I was testing with the 10-bit gradient MP4 posted above and it looked the same before and after.
     
