Color Gradient Banding Driving Me Crazy

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Size_Mick, May 28, 2019.

  1. Size_Mick

    Size_Mick Master Guru

    Messages:
    630
    Likes Received:
    463
    GPU:
    Asus GTX 1070 8GB
    Not sure where this question really belongs, so I chose this section on a guess. Since I'm not really sure about the causes or what to do about them, I figured I'd post here.

    It seems that I just can't escape color banding. It's in my video games (some, not all) and in the videos I watch (Hulu, Prime, Netflix). I thought they solved this sh1t back in the 3dfx Voodoo days. WHAT IS WRONG WITH THE UNIVERSE???

    I know I can monkey around with post-processing in my media players, with mixed results. But is there anything I can do about games? Thanks for any advice. I'm running whatever driver came out in the last few months.
     
    joe187 likes this.
  2. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    It would help if we knew what monitor you're using. That's for starters.
     
  3. BetA

    BetA Ancient Guru

    Messages:
    4,526
    Likes Received:
    479
    GPU:
    G1-GTX980@1400Mhz
    For exactly this issue I use ReShade.
    It has a deband shader, which works really, really well, kinda like madshi's madVR for videos.

    Here's ReShade and some info on it:
    https://reshade.me/

    And here is the Deband shader (with preview pics inside :)
    https://reshade.me/forum/shader-presentation/768-deband?limitstart=0


    edit:
    Screens: two before/after comparisons, original vs. ReShade + Deband shader (screenshots).

    ReShade works with almost any game: DX8 (with a special DLL), DX9, DX10, DX11, OpenGL, and now DX12 as well.
    The Deband shader has virtually NO performance impact at all, at least on my end.
    And it's easy to use too :)


    Greetz



    PS:
    For videos you could always use madVR for the best quality, but if that's too much hassle, then a good solution for debanding in videos is to use MPC-HC or MPC-BE with one shader enabled. I can give you my deband shader: just activate it in MPC-HC or MPC-BE and enjoy good debanding with minimal or no detail loss.

    You can download it here:
    https://anonfile.com/sbC2f9tena/_Debanding080-16-48_hlsl

    You can change the settings with an editor of your choice, but they should be spot on.
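
    For anyone curious what a deband shader like this actually does under the hood, here is a rough sketch of the usual idea in Python/numpy. It is not the actual ReShade or MPC-HC shader code, just the general "compare against blurred neighbours, flatten only the nearly-flat areas, then add a touch of grain" approach; the function name and the radius/threshold/grain parameters are illustrative.

        # Rough sketch (not the actual shader) of typical threshold-based debanding:
        # smooth only where the neighbourhood is nearly flat, then add a little
        # grain to hide whatever stepping is left.
        import numpy as np

        def deband(img, radius=16, threshold=4 / 255, grain=1 / 255, seed=0):
            """img: float32 array in [0, 1], shape (H, W) or (H, W, C)."""
            rng = np.random.default_rng(seed)
            h, w = img.shape[:2]
            ys, xs = np.mgrid[0:h, 0:w]
            # Average of four samples offset by `radius` (a cheap cross blur).
            avg = np.zeros_like(img, dtype=np.float32)
            for dy, dx in ((radius, 0), (-radius, 0), (0, radius), (0, -radius)):
                yy = np.clip(ys + dy, 0, h - 1)
                xx = np.clip(xs + dx, 0, w - 1)
                avg += img[yy, xx]
            avg *= 0.25
            # Replace only pixels whose neighbourhood is nearly flat, so real
            # edges and texture are left untouched.
            diff = np.abs(avg - img)
            if img.ndim == 3:
                flat = diff.max(axis=2, keepdims=True) < threshold
            else:
                flat = diff < threshold
            out = np.where(flat, avg, img)
            # A little grain breaks up whatever banding is still visible.
            out += (rng.random(img.shape, dtype=np.float32) - 0.5) * 2 * grain
            return np.clip(out, 0.0, 1.0)

    Turning the grain amount down to zero gives the "no added noise" behaviour discussed a few posts below.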
     
    Last edited: May 28, 2019
    emusa, -Tj-, Size_Mick and 1 other person like this.
  4. CrazyGenio

    CrazyGenio Master Guru

    Messages:
    455
    Likes Received:
    39
    GPU:
    rtx 3090

  5. joe187

    joe187 Master Guru

    Messages:
    494
    Likes Received:
    22
    GPU:
    EVGA RTX 3070ti FTW
    I feel you, this also has bugged me for a long time in some titles. Agreed, it's 2019 ffs, this shouldn't even be a thing.
     
  6. jaju123

    jaju123 Guest

    Messages:
    355
    Likes Received:
    3
    GPU:
    2x AMD R9 290 Crossfire
    I feel like the only way this will be solved is when everyone has 10-bit proper HDR displays and the same 10-bit colour enabled in games, etc.
     
  7. janos666

    janos666 Ancient Guru

    Messages:
    1,648
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    10+ bit SDR became available around late 2008 on some consumer displays (the likes of high-end HDTVs). And I am not exactly sure which was the first consumer GPU+API to allow it, but it's a mandatory feature of DX11, which debuted in late 2009 (so roughly around the same time, some 10-11 years ago). It's just that virtually nobody ever cared to utilize it.

    And, ahhh... they still manufacture IPS-like (thus supposedly "one of the better quality") LCD panels with 6 bit + cheap internal dithering (the last "premium grade" notebook I bought has one of those, but there are far worse notebook displays than this), so yeah...

    And manufacturers seem to be relentless about pushing the resolution above all else these days while the interfaces lag behind. (IMO, something like Thunderbolt or SFP+ [some pre-existing, fairly generic bi-directional data interface with standardized optical cable solutions] should have replaced HDMI and DP by now [with optical cables as the suggested default choice for AV applications above 2 meters].) We would probably be looking at ~80Gbps in 2020 (and could have had ~40Gbps since 2016 or so, up to 100 meters with affordable optical cables instead of over-hyped, over-priced HDMI copper magic cables which die around 10 meters, plus uninterrupted CEC, plenty of bandwidth for sound in both directions, a data channel for USB 3.0 hubs on the display side, etc).
    -> As a result, being starved for bandwidth, precision suffered and the legacy leftover of chroma subsampling (YCC below 4:4:4) dug itself even deeper into "modern" digital media than ever.
    At least the 60->120Hz leap (as a new default) managed to happen around the end of the UHD/4K era...

    @BetA
    I really don't like how much "grain" that filter adds to the otherwise deliberately "crystal clean" look of that second game (Mirror's Edge?). It clearly does not "recover" anything (which would be technically impossible anyway; I'm talking about how it appears) but merely tries to mask the banding with grain/noise. Now it's clearly grainy (after starting out completely clean) while the banding is still mostly there.
    "Perlin noise" (and similar) is common practice with movies as well, especially when the director wants the 1990s look with top-notch 2010s cameras, or when they try to hide mediocre CGI quality and how it blends in with live footage.
     
    Last edited: May 28, 2019
  8. BetA

    BetA Ancient Guru

    Messages:
    4,526
    Likes Received:
    479
    GPU:
    G1-GTX980@1400Mhz
    You can configure the grain option too, it all depends on how you like it...
    So you could just decrease the grain and it's gone... ;)
     
  9. vf

    vf Ancient Guru

    Messages:
    2,184
    Likes Received:
    306
    GPU:
    ATi Radeon™
    Crysis has terrible banding.
     
  10. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    Low precision without any solution to blend it = banding. What do you expect, eh?
    There have long been solutions developers can use to clean up banding when rendering at lower precision and outputting 8-bit SDR, but they seldom apply them to all effects and buffers.
    Some TVs and displays have built-in debanding. I have one Sony TV with an excellent adjustable debanding effect, though you have to be careful, as you can scrub away too much low-frequency information and lose detail in dark areas.

    With videos it's just poor encoding, and get used to it, since everyone has proclaimed that the future is streaming and physical media is dead. Streaming means they are always starved for data and thus video bitrate, even at 4K.
    You can get excellent results with 8-bit video, but that's down to how it's encoded. 10-bit is of course better and has been used in "the scene" for video encodes of things like anime for a long time now.
    madVR, as mentioned, has a pretty good debanding effect for different situations. I use it often.


    Then there are the displays; that's a can of worms too, especially the temporal dithering used in 6-bit panels. My main monitor, the HP25 ER, is great aside from the fact that I'm pretty sure it is actually 6-bit+FRC, as dithering can be pretty apparent in motion on blue and orange colors, and the same colors shown static always have some kind of visible artifact.
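
    To make that first point concrete, here is a tiny, purely illustrative numpy sketch (the gradient range and values are arbitrary): the same smooth float gradient quantized to 8 bits once by plain rounding and once with half an LSB of noise added first. The rounded version collapses into flat runs of a single code (the visible bands), while the dithered one flips between neighbouring codes that the eye averages back into the in-between shade.

        # Same smooth gradient, quantized to 8 bits with and without dither.
        import numpy as np

        w = 4096
        ramp = np.linspace(0.20, 0.22, w)                 # shallow ramp spanning ~5 8-bit steps

        plain = np.round(ramp * 255).astype(np.uint8)     # hard steps = banding
        rng = np.random.default_rng(0)
        noise = rng.random(w) - 0.5                       # +/- 0.5 LSB of dither
        dithered = np.round(ramp * 255 + noise).astype(np.uint8)

        # In the middle of one band the plain version is a constant run of one code,
        # while the dithered version mixes the two neighbouring codes.
        print(plain[1000:1016])      # e.g. all 52s
        print(dithered[1000:1016])   # typically a mix of 52s and 53s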
     

  11. NeoandGeo

    NeoandGeo Guest

    Messages:
    745
    Likes Received:
    9
    GPU:
    Geforce GTX 970 @1420
    OP: This thing is driving me crazy!

    Us: What exact thing are you talking about? Give us an example of what you're referring to.

    OP: Disappears.

    For all we know OP could be talking about actual color banding, chroma issues or any color issue. The fact that they say it happens in everything involving media/games also throws a wrench into what exactly they mean.

    What monitor/TV is in use? Do RGB/YCbCr #:#:# options come into play? Has it always happened? Did this start after installing a specific version of Windows? Pictures or screenshots demonstrating the issue? What post-processing options seem to alleviate the issue for you?

    ContextPlz
     
    Last edited: May 29, 2019
  12. janos666

    janos666 Ancient Guru

    Messages:
    1,648
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    By the way, I don't mean to hijack the thread, but I definitely would not open a separate "yet another banding thread" either, so I will use this one. :cool:

    I started to suspect some quality problems with MPC-HC + madVR, especially for HDR10 playback (something barely noticeable in 8-bit SDR probably appears much more significant in 10-bit HDR). So I looked at a synthetic gray-scale ramp HDR10 1000 nit test pattern. It looked rather strange, so I checked the same test video using the TV's built-in player, and that looks silky smooth (within realistic expectations). The PC picture looks the same with NVAPI HDR and Windows 10 HDR (whether fullscreen windowed or exclusive, where applicable).

    Now I'm left wondering if this is yet another nVidia HDR driver problem (we had plenty of those ever since I bought my first HDR display ~2.5 years ago) or a TV firmware issue (it could be a bug affecting the HDMI input but not the internal sources).

    The PC currently runs Win 1903 (CU .145) and 430.86 (one would probably assume a "studio ready" version has no such blatant problems - unless it's not a generic HDR problem but something madVR specific).

    The difference is apparent even on fairly low quality 8bit pictures taken with a smartphone camera (so it's obviously more apparent with the naked eye):
    https://prohardver.hu/dl/upc/2019-05/35956_img_20190529_205041.jpg
    https://prohardver.hu/dl/upc/2019-05/35956_img_20190529_205000.jpg
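
    For anyone who wants to run a similar check without hunting for test videos, here is a quick way to generate a grey-ramp still image with Python and Pillow. This is not the HDR10 1000 nit test pattern used above, just a simple SDR-style ramp; the file names and the 0-0.25 range are arbitrary, and whether you actually get more than 8 bits on screen depends on the whole output chain (application, OS, driver, link format, panel).

        # Generate a shallow grey ramp over the darker part of the range,
        # where banding tends to show up first.
        import numpy as np
        from PIL import Image

        W, H = 3840, 400
        ramp = np.linspace(0.0, 0.25, W, dtype=np.float32)
        row16 = np.round(ramp * 65535).astype(np.uint16)
        img16 = np.tile(row16, (H, 1))

        # 16-bit greyscale PNG (assumes a little-endian machine for the raw bytes).
        Image.frombytes("I;16", (W, H), img16.tobytes()).save("ramp_16bit.png")
        # Plain 8-bit version for comparison.
        Image.fromarray((img16 >> 8).astype(np.uint8), mode="L").save("ramp_8bit.png")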
     
    Last edited: May 29, 2019
  13. Valerys

    Valerys Master Guru

    Messages:
    395
    Likes Received:
    18
    GPU:
    Gigabyte RTX2080S
    Is the TV panel native 10-bit or 8-bit+FRC? I have the same banding issue when using HDR on my TV. Also, you can't realistically expect native 10-bit content on a TV at 4K unless you're fine with 30Hz; it's a hardware limitation of HDMI 2.0, and the software has to take that 10-bit image, downgrade it to 8-bit and then do some dithering to fit the bandwidth.
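
    The rough arithmetic behind that limit, as a sketch using the nominal CTA 4K60 timing and the HDMI 2.0 link numbers:

        # Back-of-the-envelope check of RGB 4:4:4 bit depths over HDMI 2.0 at 4K60.
        PIXEL_CLOCK_4K60 = 594e6          # Hz, 3840x2160@60 including blanking
        HDMI20_PAYLOAD = 18e9 * 8 / 10    # 18 Gbit/s TMDS, 8b/10b coding -> 14.4 Gbit/s

        for bpc in (8, 10, 12):
            rgb_rate = PIXEL_CLOCK_4K60 * 3 * bpc          # 3 channels, bpc bits each
            verdict = "fits" if rgb_rate <= HDMI20_PAYLOAD else "does NOT fit"
            print(f"{bpc:2d} bpc RGB 4:4:4: {rgb_rate / 1e9:5.2f} Gbit/s -> {verdict}")

    8 bpc squeaks in at roughly 14.3 Gbit/s, while 10 and 12 bpc need chroma subsampling (4:2:2/4:2:0) or a lower refresh rate.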
     
  14. janos666

    janos666 Ancient Guru

    Messages:
    1,648
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    It's a 10bit WOLED panel (LG 55C8). The low level analog-digital factory calibration is something like 11000 Kelvin white (the RGB subpixel set is adjusted to the nominal native White subpixel color temperature as the basis point) with gamma 2.2 (yeah, that's a little strange but that's what it is), then digitally calibrated to 6500K and gamma 2.2 (with an RGB 1DLUT) and the HDR10 format is mapped to this with real-time processing based on the metadata.
    So, yes and no. It's 10bit but I guess a lot of values are wasted during the digital mapping, so some internal dithering is probably necessary to avoid visible banding on 10bit sources (and dithering is probably mandatory for them to support the 12bit DolbyVision format). The native refresh rate is 120Hz, so seamless temporal dithering should be possible for 60Hz input (at least with BFI turned Off but it's Off right now). But I can clearly see obvious static dithering on the very dark shades (from a close distance on homogeneous patterns).
    But don't forget the same material looks fine with the internal player (using the same picture mode and sub-settings), so it's not the panel (or the processing between source and display panel).

    It's a 24fps test video, so the VGA card was set to 12bit Full Range RGB (so no lossy processing should be done by the GPU/driver). But I also tried using 10bit YCC 4:2:2, just in case the TV doesn't like the RGB input so much (or converts between limited/full levels in bad quality). That was even worse (comparable but notably worse). I also tried setting madVR to Limited RGB to "bypass" the limited RGB to the TV inside Full RGB (driver set to Full RGB, TV set to Limited) which also proved futile.
    So, it's not a format incompatibility or internal format/level conversion issue. At least not on this scale. It's something else.

    That's how I came to the conclusion that it's either a bug in the TV firmware which affects the HDMI input but not the internal sources or it's the PC (the nVidia driver, the latest Win10 changes, or madVR when combined with these...?).
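
    For reference, the temporal dithering idea mentioned a couple of paragraphs up boils down to something like the sketch below. It is purely illustrative (not how the TV's FRC is actually implemented): with two panel refreshes per 60Hz source frame, alternating between neighbouring codes averages out, over time, to a level in between them.

        # Alternate the rounding nudge per displayed frame so that the time
        # average lands between two output codes.
        import numpy as np

        def temporal_quantize(value, bits, frame_index):
            """Quantize value in [0, 1] to `bits`, nudging the rounding per frame."""
            levels = (1 << bits) - 1
            offset = 0.25 if frame_index % 2 == 0 else -0.25   # alternating half-step nudge
            return int(np.clip(np.round(value * levels + offset), 0, levels))

        target = 94.5 / 255                     # sits exactly between two 8-bit codes
        codes = [temporal_quantize(target, 8, i) for i in range(6)]
        print(codes)                            # [95, 94, 95, 94, 95, 94]
        print(sum(codes) / len(codes) / 255, "vs", target)   # time average hits the target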
     
  15. BetA

    BetA Ancient Guru

    Messages:
    4,526
    Likes Received:
    479
    GPU:
    G1-GTX980@1400Mhz
    This might help.
    It explains why it's happening and what can be done about it, and why 8-bit for movies is sometimes better than 10-bit.
    It's from the madVR forum, madshi himself...

    https://forum.doom9.org/showpost.php?p=1271418&postcount=4

    https://forum.doom9.org/showpost.php?p=1271416&postcount=2

    It helped me a lot to understand it and set up my system the right way.


    It also might help to watch out for the DRIVERS, some are good, some are plain bad.

    AMD, Intel and Nvidia driver issues and last recommended version
    https://forum.doom9.org/showthread.php?t=176013

    hope that helps, greetz
     

  16. janos666

    janos666 Ancient Guru

    Messages:
    1,648
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    I am pretty sure I read those posts... well, yeah, about 10 years ago. :p
    Did you even notice I am talking about HDR10 videos and not SDR?

    Ah, I wasn't aware of that thread. Thanks. But I knew (from a different source) that HDR metadata should be fine since 430.39. And regardless, the gradient should be banding-free with any metadata.

    So, this is a new kind of driver bug if it is in fact a driver bug. I can't tell just yet, that's why I brought this up here.
     
    Last edited: May 29, 2019
    BetA likes this.
  17. BetA

    BetA Ancient Guru

    Messages:
    4,526
    Likes Received:
    479
    GPU:
    G1-GTX980@1400Mhz
    Argh, sorry, I misread then...

    Anyway, I edited my post: since there are HDR driver issues, I linked a thread for it (that might help).
    Sorry, it's late here and I had a looong day, haha...

    AMD, Intel and Nvidia driver issues and last recommended version
    https://forum.doom9.org/showthread.php?t=176013


    Greetz
     
    janos666 likes this.
  18. NeoandGeo

    NeoandGeo Guest

    Messages:
    745
    Likes Received:
    9
    GPU:
    Geforce GTX 970 @1420
    Wow, be careful with the snake oil in that thread. The following:

    • 430.64 2019-05-09 reverts the output dynamic range to Limited when TV is turned off/on need more info and confirmation of other users!

    Does not happen for me, and it looks like a single user who bungles their driver install can make an entire release look unusable. What a crap site.
     
  19. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    What games are using 10-bit textures? None that I know of, so it comes down to how games implement 8-bit dithering.
     
  20. janos666

    janos666 Ancient Guru

    Messages:
    1,648
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    But they do a lot of processing on those textures. The same object or wall texture can appear in near-black shadows, bright sunlight or dim red light... A bunch of things should happen on >=16-bit surfaces/buffers.
    And sure, even 10-bit output should be dithered rather than rounded/truncated.
    Still, it's a good point. I never really thought about whether HDR10 and Dolby Vision games these days ship with >=10-bit textures or not.
     
