
HELP: How to create Custom Resolution with Chroma Subsampling for HDR?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by MegaFalloutFan, Feb 20, 2019.

  1. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    430
    Likes Received:
    37
    GPU:
    RTX 2080Ti 11Gb
    So I've been using my 55-inch 4K OLED as my PC monitor for the last 3 years [no, I don't have any burn-in].
    To play HDR games you need to enable a 10/12-bit chroma subsampling mode [I don't know about LCD TVs, but OLED supports 12-bit too]; if you don't, and keep RGB, it will do dithered 8-bit HDR.
    I can do this at any supported resolution, but when I create a custom resolution such as 21:9 3840x1600, RGB is the only mode I get.
    Basically I want to play games that support HDR, like Metro Exodus and the upcoming Tomb Raider, in 21:9 with HDR. As a bonus I can enable ray tracing without using DLSS and still get 60 fps on my 2080 Ti, since 3840x1600 is easier to render than full native 4K.
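    To put rough numbers on the "easier than native 4K" part, here's a quick pixel-count comparison (my own back-of-the-envelope figures; actual frame-rate gains depend on the game):

```python
# Rough pixel-count comparison: 21:9 3840x1600 vs. native 4K 3840x2160.
# Illustration only -- real performance scaling depends on the game and settings.
ultrawide = 3840 * 1600   # 6,144,000 pixels per frame
native_4k = 3840 * 2160   # 8,294,400 pixels per frame

print(f"3840x1600 renders {ultrawide / native_4k:.0%} of the pixels of native 4K")
# -> roughly 74%, which is why 60 fps with ray tracing is more attainable there
```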
     
  2. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,118
    Likes Received:
    502
    GPU:
    GTX1080Ti
    It is not going to work on your 8-bit UHD input on your set, period. There are also issues with Windows' implementation of HDR over HDMI. Simply put, you have a maximum of 8-bit 4:4:4 at UHD 4K. HDR requires extra bits; HDR10 requires 10 bits.
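    For reference, the bandwidth math behind that claim, assuming an HDMI 2.0 link (18 Gbps line rate, roughly 14.4 Gbps usable after 8b/10b coding) and the standard CTA-861 4K60 timing (594 MHz pixel clock). This is a back-of-the-envelope sketch, not the driver's actual mode-validation logic:

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check for 3840x2160 @ 60 Hz.
# Assumptions: CTA-861 timing (4400x2250 total -> 594 MHz pixel clock) and
# ~14.4 Gbps of usable TMDS data rate (18 Gbps line rate, 8b/10b coding).
PIXEL_CLOCK_HZ = 594_000_000
USABLE_GBPS = 14.4

# Bits transmitted per pixel for each pixel format / bit depth.
# Note: HDMI carries YCbCr 4:2:2 in a fixed 24-bit container up to 12 bpc.
formats = {
    "RGB / 4:4:4, 8-bit":  3 * 8,
    "RGB / 4:4:4, 10-bit": 3 * 10,
    "YCbCr 4:2:2, 12-bit": 24,   # fixed-size 4:2:2 container
    "YCbCr 4:2:0, 10-bit": 15,   # 1.5 samples/pixel * 10 bits
}

for name, bpp in formats.items():
    gbps = PIXEL_CLOCK_HZ * bpp / 1e9
    verdict = "fits" if gbps <= USABLE_GBPS else "exceeds HDMI 2.0"
    print(f"{name:22s} {gbps:5.2f} Gbps -> {verdict}")
```

    Which is why 4K60 HDR over HDMI 2.0 ends up as either 8-bit RGB (dithered) or 10/12-bit with chroma subsampling.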
     
  3. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    430
    Likes Received:
    37
    GPU:
    RTX 2080Ti 11Gb
    Re-read my post and then reply :)
     
  4. KoKlusz

    KoKlusz Active Member

    Messages:
    92
    Likes Received:
    0
    GPU:
    GeForce GTX 970
    Using YCbCr 4:2:2 disables GPU resolution scaling, so you are stuck with only what your display officially supports.

    I believe you used to be able to get around that by creating a custom DSR resolution, but it gave users too much control, so Nvidia broke it.
     

  5. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    430
    Likes Received:
    37
    GPU:
    RTX 2080Ti 11Gb
    Hi,
    I don't need scaling, I need a custom resolution. I disable scaling because I'm using a 55-inch OLED as my monitor; since OLED has no backlight, I can create any custom resolution like 21:9 and it will just have black borders on the top and bottom.
    What I need is for my custom resolution, 3840x1600, to have 4:2:2/12-bit color so I can enable HDR.
    Right now I only have RGB/8-bit.
     
  6. KoKlusz

    KoKlusz Active Member

    Messages:
    92
    Likes Received:
    0
    GPU:
    GeForce GTX 970
    You need GPU scaling to be able to use custom resolutions, even if they don't require any actual scaling. With 3840x1600, the driver either stretches the image or adds a letterbox, and then sends that image to the display at the closest supported resolution, in your case 3840x2160. Without GPU scaling, 3840x1600 is sent as 3840x1600, which the display will reject.

    That said, you should be able to use HDR with RGB 8-bit since the Windows 1709 update, unless you really want to avoid dithering. But you won't really gain much using 4:2:2 12-bit versus RGB 8-bit dithered.
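    The letterbox case is just centering the smaller frame in the panel's native grid. A minimal sketch of what the scaler effectively does (not actual driver code):

```python
# Sketch of the GPU-scaling letterbox step: center a 3840x1600 custom
# resolution inside the display's native 3840x2160 mode. Not driver code.
def letterbox(src_w, src_h, dst_w, dst_h):
    """Return the black-bar sizes when src is centered in dst without stretching."""
    pad_x = (dst_w - src_w) // 2
    pad_y = (dst_h - src_h) // 2
    return pad_x, pad_y

side_bars, top_bottom_bars = letterbox(3840, 1600, 3840, 2160)
print(f"{side_bars} px bars left/right, {top_bottom_bars} px bars top/bottom")
# -> 0 px left/right, 280 px top/bottom; the TV still receives a full 3840x2160 signal
```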
     
  7. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    430
    Likes Received:
    37
    GPU:
    RTX 2080Ti 11Gb
    I'm using a TV. If you've ever connected an NVIDIA card to a TV [any TV], the 'Use GPU Scaling' option [or whatever it's called] is gone.
    In the drop-down menu where it says 'Perform Scaling On', my only option is Display.
    Where it says 'Select a scaling mode', I always choose the 'No scaling' option.

    That's how it works; I get a perfect 21:9 image centered on my TV.

    That's how HDR works according to the HDR standard. BTW, 4K Blu-ray HDR movies are encoded at an even lower 4:2:0/10-bit [if you've ever seen one and checked its metadata] and they work fine; HDR uses 4:2:2 at 10-bit or 12-bit [if your TV supports it].
    8-bit dithered is not part of the HDR standard and looks horrible.
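    For anyone unsure what those chroma formats actually carry, here's a rough per-frame sample count for a UHD frame (my own illustration, not taken from any spec):

```python
# Chroma samples per 3840x2160 frame for the common subsampling schemes.
# Rough illustration of what 4:2:2 / 4:2:0 discard relative to full 4:4:4.
W, H = 3840, 2160
luma = W * H  # the luma (Y) plane is always full resolution

chroma_per_plane = {
    "4:4:4": W * H,               # full-resolution Cb and Cr
    "4:2:2": (W // 2) * H,        # chroma halved horizontally
    "4:2:0": (W // 2) * (H // 2), # chroma halved in both directions
}

for name, chroma in chroma_per_plane.items():
    total = luma + 2 * chroma     # Y plane + Cb plane + Cr plane
    print(f"{name}: {2 * chroma:,} chroma samples/frame "
          f"({total / (3 * luma):.0%} of the data of 4:4:4 at the same bit depth)")
```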
     
  8. KoKlusz

    KoKlusz Active Member

    Messages:
    92
    Likes Received:
    0
    GPU:
    GeForce GTX 970
    Okay, that's weird. I've got my 970 connected to a Samsung KS7000, and not only do I have the option for GPU scaling, it's the only way to get any resolutions beyond the standard video ones (2160p/1080p/720p/576i). But that only works with RGB.

    My guess is that when you use RGB, either the drivers are still doing some adjustments behind the TV's back, or the TV itself recognizes that the signal is coming from a PC and allows non-standard resolutions, but with 4:2:2 it thinks the signal is coming from a BD player and trips out. Anyway, have you tried Custom Resolution Utility (CRU)?
    The HDR standard was created with BD players in mind, so it doesn't exactly apply to consoles/PC. On PC it's actually recommended to always use full-range RGB whenever possible, since Windows itself always operates in that format, and choosing any other option in the control panel causes a driver-side conversion, which hurts image quality. 8-bit with quality dithering (like the kind madVR provides) is indistinguishable from 10 or 12 bits. Now, Nvidia's drivers aren't doing the greatest job in the world, but I wouldn't call it horrible either (although it's possible that OLED makes it more pronounced).

    There's also the fact that not every game that supports HDR supports the full HDR10 standard, meaning no wide color gamut or bit depth above 8 bits. For example, in RE2 I can see a (slight) difference in banding between 8 and 10 bit, but Shadow of the Tomb Raider looks identical no matter which bit depth I choose.
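    On the dithering point, a toy experiment shows why a well-dithered 8-bit signal can stand in for 10-bit: the quantization error becomes fine noise that averages back to the original level. This is only a sketch with random noise, nothing like madVR's or NVIDIA's actual (temporal) dithering:

```python
# Toy demo: converting a 10-bit level that falls between two 8-bit codes.
# Purely illustrative -- not how madVR or the NVIDIA driver actually dither.
import numpy as np

rng = np.random.default_rng(0)

# Flat 10-bit patch at level 510; the ideal 8-bit value would be 510/4 = 127.5.
patch10 = np.full((1080, 1920), 510.0)

# Straight conversion: every pixel collapses to the same slightly-wrong code,
# which is what produces visible banding in smooth gradients.
truncated = np.clip(np.round(patch10 / 4), 0, 255)

# Dithered conversion: add +-0.5-code noise before rounding, so codes 127 and
# 128 alternate pixel to pixel and the area average lands back on 127.5.
dithered = np.clip(np.round(patch10 / 4 + rng.uniform(-0.5, 0.5, patch10.shape)), 0, 255)

print("truncated mean 8-bit code:", truncated.mean())            # 128.0 -- a hard step
print("dithered  mean 8-bit code:", round(dithered.mean(), 3))   # ~127.5, the original level
```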
     
