Color Gradient Banding Driving Me Crazy

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Size_Mick, May 28, 2019.

  1. DerSchniffles

    DerSchniffles Ancient Guru

    Messages:
    1,665
    Likes Received:
    148
    GPU:
    MSI 3080Ti
     Oh, this topic. Man, I have been trying to find a solution FOREVER. It would seem that some people are just more sensitive to it than others, much like screen tearing and input lag, but it bothers me a lot. Also, be aware that calibrating can cause MORE banding if the software has to adjust your gamma a great deal. I use DisplayCAL, and the "Report on uncalibrated monitor" feature has been invaluable for reducing banding. My monitor has multiple gamma presets, which are very far off from their stated values. I found that the 2.2 gamma setting on my monitor actually yields 1.9 gamma, which looked very, very washed out. My monitor performs best at its 2.4 gamma setting, which is the "gaming" preset in the OSD (right above the 2.5 option); according to DisplayCAL, that preset measures 2.39 gamma uncalibrated, which means the software had very little gamma adjustment left to make during calibration. The more adjusting a software calibration has to do, the more banding it can introduce, so keep that in mind. A properly calibrated monitor using the correct parameters is paramount in perfecting the picture - zing!
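     To make the "more adjustment = more banding" point concrete, here is a rough back-of-the-envelope sketch (not DisplayCAL's actual pipeline, just 8-bit rounding through a gamma correction; the gamma numbers mirror the presets mentioned above):

     Code:
     # Rough illustration only: remapping an 8-bit ramp through a gamma
     # correction and rounding back to 8 bits collapses distinct levels.
     import numpy as np

     levels = np.arange(256)

     def remap(gamma_native, gamma_target):
         # correction a calibration LUT would have to apply
         corrected = (levels / 255.0) ** (gamma_target / gamma_native)
         return np.round(corrected * 255).astype(int)

     small_fix = remap(2.39, 2.4)   # monitor already close to the target
     big_fix   = remap(1.9, 2.2)    # washed-out preset needs a big bend

     print("unique output levels, small correction:", len(np.unique(small_fix)))
     print("unique output levels, large correction:", len(np.unique(big_fix)))
     # The large correction maps several input levels onto shared outputs,
     # i.e. visible banding, unless the GPU dithers its LUT output.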
     
    -Tj- and BetA like this.
  2. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
     Well, the HDR10 banding issue appears to be related to the latest Windows 10 build (1903) and affects both nVidia and AMD in pretty much the same way. Even though the visible end result is not entirely the same, the root cause probably is: 10-bit output seems to be converted to 8-bit for some reason, so no real native 10-bit output is possible (AMD probably looks better because its dithering is always on, while the nVidia driver sees no reason to turn dithering on, so it looks worse).
    https://forum.doom9.org/showpost.php?p=1875734&postcount=310
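     If the root cause really is 10-bit getting squeezed down to 8-bit, dithering is exactly what hides it. A toy sketch of the difference (purely illustrative; this is not what either driver actually does internally):

     Code:
     # Quantize a smooth 10-bit ramp to 8 bits, with and without dithering.
     import numpy as np

     grad10 = np.linspace(0, 1023, 3840)          # smooth 10-bit horizontal ramp

     truncated = (grad10 // 4).astype(np.uint8)   # plain drop to 8 bits: hard steps

     rng = np.random.default_rng(0)
     noise = rng.uniform(-2, 2, grad10.shape)     # ~half an 8-bit LSB of noise
     dithered = np.clip(np.round((grad10 + noise) / 4), 0, 255).astype(np.uint8)

     steps = np.diff(np.flatnonzero(np.diff(truncated)))
     print("band width without dithering:", np.unique(steps), "pixels")
     # Every band is ~15 px wide -> a visible staircase. The dithered version
     # keeps the same average levels but breaks the staircase up into noise.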
     
    BetA likes this.
  3. BetA

    BetA Ancient Guru

    Messages:
    4,537
    Likes Received:
    518
    GPU:
    G1-GTX980@1400Mhz
    Thanks for coming back and sharing the INFO..
    :D
     
  4. Size_Mick

    Size_Mick Master Guru

    Messages:
    630
    Likes Received:
    463
    GPU:
    Asus GTX 1070 8GB
    Holy mother of god, I just wanted to get rid of color banding. WHY IS THIS SO COMPLICATED!!!! I think we need to start some sort of campaign to get manufacturers and developers to start paying more attention to this themselves, so we don't have to d1ck around with sh1t like this all day. Actually, nevermind, this whole thing is starting to give me a headache. Thanks for all the help though, guys. Maybe after a few days I'll give this more attention and try to delve into the problem further.
     

  5. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    How about, instead of bitching, you actually respond to the questions asked of you in this thread?
     
  6. Cave Waverider

    Cave Waverider Ancient Guru

    Messages:
    1,883
    Likes Received:
    667
    GPU:
    ASUS RTX 4090 TUF
    I can't even set 10 bpc output in Nvidia Control panel with 1903 anymore. The option is set to 8 bpc and greyed out. Sad!

    Update: Looks like I can set it again using 435.27 drivers, whereas it didn't work for 430.64.

    And yes, I have horrible banding with HDR enabled, even though I have set the Desktop to 10 bit HDR.
     
    Last edited: Dec 27, 2019
  7. Chastity

    Chastity Ancient Guru

    Messages:
    3,744
    Likes Received:
    1,668
    GPU:
    Nitro 5700XT/6800M
     I can still set it here; my monitor is connected via a mini-DP to DP cable.
     
  8. gerardfraser

    gerardfraser Guest

    Messages:
    3,343
    Likes Received:
    764
    GPU:
    R9 290 Crossfire
     Colour banding has been around forever, especially on lower-bit panels, and it depends on the panel type too. So even changing cards may not make a difference, because low-quality panels and bad source material can be the problem on their own (a crappy 6-bit panel, or low-quality source material).

     I suggest borrowing an AMD card and seeing if it makes a difference. AMD is simply better when it comes to this banding issue and largely eliminates the problem.

     Nvidia is not good at dealing with colour gradient transitions.
     Have fun with your campaign, but you can join the movement asking Nvidia to implement in their Windows driver the dithering controls it already ships on Linux (a rough idea of what those Linux knobs look like is sketched below); Nvidia has also said it will not do so for the Windows environment.
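     For reference, the Linux-side controls look roughly like this (sketch only; the attribute names are the ones the Linux nvidia-settings tool documents, and the connector name here is hypothetical, so verify against your own setup):

     Code:
     # Sketch: enabling the NVIDIA dithering controls exposed on Linux.
     import subprocess

     display = "DP-0"  # hypothetical connector name; list yours with `nvidia-settings -q dpys`
     for attr, value in [("Dithering", "1"),        # 0 = auto, 1 = enabled, 2 = disabled
                         ("DitheringMode", "1"),    # e.g. dynamic 2x2
                         ("DitheringDepth", "2")]:  # e.g. 8 bpc
         subprocess.run(["nvidia-settings", "--assign",
                         f"[dpy:{display}]/{attr}={value}"], check=True)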


    Works fine here on 1903 with Nvidia 10Bit 430.86 Driver

    https://i.**********/8cWPm0Yq/10bit.png
     
    joe187 likes this.
  9. joe187

    joe187 Master Guru

    Messages:
    495
    Likes Received:
    22
    GPU:
    EVGA RTX 3070ti FTW
     Yeah, there was a whole thread about this over on the Nvidia support forums that I would check on now and then, but eventually it just gave me a headache too. IIRC, people really wanted that dithering option on Windows because the Linux driver has it, and nVidia basically said, nope, not gonna do it!
     
  10. Cave Waverider

    Cave Waverider Ancient Guru

    Messages:
    1,883
    Likes Received:
    667
    GPU:
    ASUS RTX 4090 TUF
    Hm, odd, it didn't work for me on HDMI for both the TV and monitor with 430.64. I switched to a clean (DDU) install of 435.27 drivers and it's working fine on those. Then again, I don't think I did a fresh video card driver install after "upgrading" from 1809 to 1903 and the 430.64 drivers carried over, so perhaps something got messed up during the process. So nevermind, I guess the HDR banding issue isn't related then.
     
    Last edited: Jun 2, 2019

  11. artina90

    artina90 Member Guru

    Messages:
    148
    Likes Received:
    58
    GPU:
    RTX 3080Ti
     On my 27UK650 I fixed the banding by uninstalling the ICC profile that came with the monitor drivers and doing a full reboot.
     I also noticed that Night Light is broken; I suspect there might be something fishy going on between the NV driver and the "Display Enhancement Service".
     I know that disabling this service breaks Night Light, but now even if I re-enable it, Night Light doesn't work.
     
  12. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
     Yes, there seems to be a connection to color management (with the crazy bad banding that looks like the precision drops from 8-bit to 6-bit without dithering, not the regular "normal" 8-bit banding which is always present in some games by design).
     Disabling the LUT Loader service made it go away on my desktop PC (nVidia VGA, and one of the displays had an ICM profile): https://forum.doom9.org/showpost.php?p=1877096&postcount=400
     Although I also saw this trigger with no ICC profiles at all on my AMD laptop, where it just randomly stopped happening between driver re-installs (without any kind of tinkering other than running DDU).
     I never use Night Light (neither the built-in one nor f.lux).
     So it's still not a clear-cut case. Anyhow, I have disabled the LUT Loader on both machines for the time being.
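     If anyone wants to rule the loaded LUT out without uninstalling profiles or stopping services, a quick and dirty test is to push an identity gamma ramp yourself. This is just a sketch using the standard Windows GDI call, not anything the driver or DisplayCAL documents as a fix:

     Code:
     # Windows-only sketch: load an identity (linear) gamma ramp, so banding
     # caused by a calibration LUT should disappear until the next LUT load.
     import ctypes

     ramp = (ctypes.c_ushort * 256 * 3)()
     for ch in range(3):
         for i in range(256):
             ramp[ch][i] = i * 257              # 0..65535 linear identity ramp

     hdc = ctypes.windll.user32.GetDC(None)     # device context for the whole screen
     ok = ctypes.windll.gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
     ctypes.windll.user32.ReleaseDC(None, hdc)
     print("identity ramp applied:", bool(ok))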
     
    Last edited: Jun 16, 2019
  13. Sajittarius

    Sajittarius Master Guru

    Messages:
    490
    Likes Received:
    76
    GPU:
    Gigabyte RTX 4090
     A lot of people are contributing here; I just want to add my 2 cents to help clarify a few things I noticed that may confuse people:

    10 bit color / Chroma Subsampling
     4K 10-bit 60Hz RGB over HDMI is not possible for Nvidia with HDMI 2.0. For RGB or YCbCr444, you will only get the option of 8-bit over HDMI at 4K@60Hz. To get 10-bit color at that resolution, you need to use YCbCr422 or YCbCr420 (mine offers 8/12-bit for 420 and 8/10/12-bit for 422).

     (DisplayPort is different; I won't say which one supports 4:4:4 because I am not sure.)
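     The HDMI 2.0 limit is really just arithmetic: 18 Gbps of TMDS bandwidth is not enough for 4K60 RGB at 10 bpc, while 4:2:2 packs 10/12-bit into the same per-pixel container. A rough check (assuming the standard 4400x2250 total timing for 3840x2160@60):

     Code:
     # Back-of-the-envelope check of why 4K60 10-bit RGB doesn't fit in HDMI 2.0.
     PIXEL_CLOCK_HZ = 4400 * 2250 * 60        # standard 4K60 timing -> 594 MHz
     HDMI20_TMDS_LIMIT = 18e9                 # 18 Gbps across the three TMDS lanes

     def tmds_rate(bits_per_pixel):
         # TMDS uses 8b/10b coding, so the wire rate is 10/8 of the video data rate
         return PIXEL_CLOCK_HZ * bits_per_pixel * 10 / 8

     print("RGB / 4:4:4,  8 bpc:", tmds_rate(24) / 1e9, "Gbps")  # ~17.8 -> fits
     print("RGB / 4:4:4, 10 bpc:", tmds_rate(30) / 1e9, "Gbps")  # ~22.3 -> too much
     print("YCbCr 4:2:2, 12 bpc:", tmds_rate(24) / 1e9, "Gbps")  # 4:2:2 stays at 24 bits/pixel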

     Another thing to check, especially with TVs, is whether you are actually getting the full RGB signal displayed without subsampling. On my Samsung KS8000, it will only show 4K 60Hz with full 4:4:4 chroma when the input is set to PC mode. Unfortunately, that screws up HDR handling (plus disables a bunch of contrast/color options), so I generally use Game mode, which ends up converting 4:4:4 signals to 4:2:2. There is a test pattern for this; if you google "chromares png" you can find it (it's fuchsia/purple looking, hard to miss). Here is a link to it from the madVR forum as well:

    https://forum.doom9.org/showthread.php?p=1673855
     Super important: if you view this image with ANY scaling, it will display wrong. I use IrfanView and make sure it is set to 100% zoom. Browsers almost invariably screw this up because they scale things without telling you (especially at 4K; maybe less so if you are at 1080p with 100% DPI).

    MadVR
     Also, in the display options in madVR there are settings to tell madVR the bit depth of your display (might help, but yeah, you may still be limited by Windows/Nvidia color handling and the lack of dithering). The newer builds of madVR (I am using 0.92.17) also have much better HDR support than older builds (I upgraded and HDR movies suddenly started working a lot better, even when not in fullscreen on Windows 1903), so if you are not on the latest madVR I highly recommend it.

    8-bit HDR/Win10 HDR in general
     In the newer Windows (I want to say 1809 or 1903), they added 8-bit HDR. It is basically 4K 60Hz 4:4:4 (either RGB or YCbCr444 in the Nvidia options), and the 10-bit HDR color is displayed as 8-bit with dithering. It even shows in Windows settings under Display -> Advanced display settings -> "Bit depth", where it will say "8-bit with dithering". Previously, if you had your output set to 8-bit, Windows 10 would not enable HDR; it would only enable when using 10- or 12-bit color (like 4:2:2 subsampling with 10-bit color).

     Also, I still don't think Microsoft has HDR figured out in general when it comes to the Windows desktop (which is odd, because the Xbox One X does it pretty well). When switching to HDR, the desktop looks much less colorful, like they are switching to a BT.2020 gamut but not translating it correctly, which makes SDR content look terrible. It might be better on OLED screens, but it is pretty bad on my LED TV (almost like the limited/full range RGB setting got switched by accident).
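     On that last point, the limited/full range mismatch is easy to show with numbers: limited ("TV") range only uses code values 16-235, so interpreting one as the other either washes the picture out or crushes it. A tiny sketch:

     Code:
     # Why a limited/full range mismatch looks washed out (or crushed).
     def full_to_limited(v):        # re-encode a full-range value into limited range
         return round(16 + v * (235 - 16) / 255)

     def limited_to_full(v):        # expand a limited-range value to full range
         return round((v - 16) * 255 / (235 - 16))

     # Display expects full range but receives limited-encoded video:
     print(full_to_limited(0), full_to_limited(255))    # 16, 235 -> grey blacks, dim whites
     # Display expects limited range but receives full-range video:
     print(limited_to_full(0), limited_to_full(255))    # -19, 278 -> clipped shadows/highlights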
     
  14. BetA

    BetA Ancient Guru

    Messages:
    4,537
    Likes Received:
    518
    GPU:
    G1-GTX980@1400Mhz
    Regarding BANDING...

    from the new Nvidia Driver 431.36 (PDF)


     
    Sajittarius likes this.
  15. Sajittarius

    Sajittarius Master Guru

    Messages:
    490
    Likes Received:
    76
    GPU:
    Gigabyte RTX 4090
    nice, maybe this will also help fix the way the desktop looks weird when HDR is enabled, lol
     

  16. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
     When you enable HDR in Windows 10, it sets RGB limited and disables the Nvidia settings.
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,380
    GPU:
    GTX 1080ti
    DisplayCal has implemented a workaround for 1903 banding
     
  18. Sajittarius

    Sajittarius Master Guru

    Messages:
    490
    Likes Received:
    76
    GPU:
    Gigabyte RTX 4090
     It switches to RGB limited but does not disable RGB full; you can still enable HDR and then set RGB Full if you want (at least I can on my PC in Win10 1903, and it also worked in 1809). Dynamic contrast also still works in the Nvidia control panel.

    Screenshot from my pc:
    https://imgur.com/Gbi9Bc5

     If you enable HDR and then set it to Full RGB afterwards, stuff looks way less washed out (probably because Windows is switching to BT.2020 and the full RGB is helping to counter the washed-out look, though I think it's really just exaggerating whites and blacks to achieve that effect).

     Edit: I forgot to mention I am on the 435.27 drivers; maybe they have better HDR support in the 435 branch?
     
    Last edited: Jul 27, 2019
  19. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC
     This week I discovered (after using the wrong settings for almost 2 years) that changing the color format from RGB to YCbCr444 took away all the color banding and gave me a perfect full-range HDR+ image in games and a perfect Windows desktop experience. So the default RGB output from Nvidia is bad for my Samsung 4K TV. Of course, the TV is set to PC mode in its own settings.
     
