So this 4:4:4 problem in a nutshell (HDMI)

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Mda400, Mar 13, 2013.

  1. Some Dillweed

    Some Dillweed Member

    Messages:
    24
    Likes Received:
    1
    GPU:
    RTX 3080 Ti
    I know the EDID override doesn't fix your overall issue, but I'm just wondering: how do you know that it's causing washout? What settings were you using after applying the override and did you do some basic calibration tests for proper brightness, contrast, etc.? I'm curious because I only have a model from the next year's line (as I understand it, the LK450 series is pretty similar to the LD450), and haven't exactly noticed what I'd call washout. Black level seems to be as dark as it was, whites didn't get any brighter or more clipped compared to 4:2:2 mode, and it seems like I have proper colours in a lot of cases now. I don't have a calibration device to fix certain gamma and grayscale issues, though.

    You might want to try checking out a few threads on AVS Forum, like the 4:4:4 thread: http://www.avsforum.com/t/1381724/official-4-4-4-chroma-subsampling-thread. And, I'm not sure what help he's actually offering, but there's some guy named Tulli in an ATi/AMD EDID override thread on AVS who seems to be helping people with HTPC and receiver issues, including some with Nvidia cards. You might want to give that thread a look and maybe ask there for help: http://www.avsforum.com/t/1091403/edid-override-thread.
     
  2. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,672
    GPU:
    Aorus 3090 Xtreme
If you can correctly set the black level for 0-255 RGB (when black was previously anything from 0-16), it will result in a much brighter picture.
To counter this, turn the brightness down, a lot if necessary.
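For anyone wondering what that remap looks like numerically, here's a quick sketch in plain Python using the standard video-range constants (16 = video black, 235 = video white). This is just the arithmetic, not what any particular driver literally does:

```python
# Limited-range (16-235) vs full-range (0-255) mapping for one 8-bit channel.

def limited_to_full(v):
    """Expand a 16-235 code value to 0-255."""
    v = min(max(v, 16), 235)            # clip blacker-than-black / whiter-than-white
    return round((v - 16) * 255 / 219)  # 219 = 235 - 16, the limited-range span

def full_to_limited(v):
    """Compress a 0-255 code value into 16-235."""
    return round(v * 219 / 255) + 16

print(limited_to_full(16))   # video black  -> 0 (true black)
print(limited_to_full(235))  # video white  -> 255
print(full_to_limited(0))    # PC black     -> 16

# The mismatch people describe as "washed out": if a limited signal is shown
# without the remap, black arrives as code 16 and displays as dark grey;
# if a full signal is treated as limited, 0-16 and 235-255 get clipped.
```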
     
  3. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
I calibrate my HDTV using the lagom.nl/lcd-test/ images. When I used the EDID trick with my existing HDMI calibration, the first black "step" on that site's Black Level test was VERY visible (which it should not be), and the background around it was grey, as if I were using 16-235. I even disabled the tool that adds registry keys to enable full RGB by default; since those keys are for HDMI, I know for sure I am set to 0-255. But the signal is still incorrectly mapped, and I have to switch to HDMI Black Level LOW to get a reasonable contrast ratio, then calibrate with my graphics card's colour controls to get it right, which I never had to do before applying the override.

You DON'T want to use HDMI Black Level LOW with a PC connection. Black Level LOW is for expanding 16-235 content from a source like a Blu-ray player. Having the TV "enhance" the extra 0-16 and 235-255 levels when using any type of computer causes input delay. It's just that not all consumer devices allow switching from a limited 16-235 signal to a full 0-255 signal over HDMI, and most HDTVs do not offer a black level option; in that case, the TV expects limited 16-235 by default. The latest Apple TV is one of the few modern consumer devices to offer such an option, as do current-gen game consoles.

It was set to 0-255 before, and I know this because I used the NVfullrange_Toggle tool (or whatever it's called) that someone posted on this forum to enable it by default on startup. Turning down the brightness reduces contrast if it's set too far from the properly calibrated point. So if I were to turn down the brightness with my current calibration, under the override it would be like using 16-235 again, reducing contrast from an already incorrectly mapped 0-255 signal.

In summary, my TV has that 0-255 window, but it gets treated like 16-235 by the PC, which is reporting a DVI connection. This is what I expect from an override, though: you're forcing a format that could potentially be incompatible with your TV.
     
    Last edited: Mar 19, 2013
  4. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,672
    GPU:
    Aorus 3090 Xtreme
No it isn't.
Turn the contrast up; that's what the control is for, in case you need more.
     

  5. maco07

    maco07 Active Member

    Messages:
    96
    Likes Received:
    0
    GPU:
    7970 3GB Boost
How do you check whether your display is showing full RGB 4:4:4?

If the magenta word looks blurry, then full RGB is not working (check it on a notebook display to see how it should look).

    http://img.photobucket.com/albums/v293/nuker43/inputlag/TintBlueRGB.png?t=1287963155

And, for those who have doubts about washed-out colours when changing colour space, take this into account. You have to set your TV as follows:

HDMI Level Low: for inputs with 16-235 colour space
HDMI Level Normal: for inputs with 0-255 colour space
     
  6. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
OK... WHAT I'M TELLING YOU is that with the EDID override and HDMI Black Level HIGH on my TV, the already full-range 0-255 DVI signal gets washed out/incorrectly mapped on my TV.

If I turn the contrast up and the brightness down to balance it, it's not nearly as bright as without the EDID override. Switching to HDMI Black Level LOW can fix the contrast issue, but it requires me to change not only the TV's colour controls but the graphics card's colour controls at the same time to balance out the crushed blacks and clipped whites.

And like I said before, HDMI Black Level LOW causes input delay because your TV is processing the 0-16 and 235-255 levels meant for limited 16-235 VIDEO content. PC content is 0-255, which is what HDMI Black Level HIGH is for. In this case, I cannot use HIGH because the already 0-255 signal is washed out at HIGH (which does NOT happen without the override).

To get optimal contrast (my TV's 0-100 contrast slider cranked to 100 on Black Level HIGH is still not "bright" enough under the override), switching to LOW is a start, but it causes input delay and forces me to fiddle unnecessarily with the NVCP colour controls. I do NOT want to use the NVCP controls because I play my Xbox 360 on the same HDMI port (through a receiver), so I have to calibrate universally for both sources using the TV instead, not rely on the NVCP controls for PC use.

I am simply stating this for others who try this override and run into the same problems.
     
    Last edited: Mar 19, 2013
  7. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,672
    GPU:
    Aorus 3090 Xtreme
This is supposed to happen; it does the same on my TV.
It's possible something odd is happening, but most likely it is not incorrectly mapped. It's two things:
1) Black is now too bright and needs turning down.
2) You aren't used to seeing the full complement of colours.


    Once you have made the change use this to do a basic display calibration:
    http://www.lagom.nl/lcd-test/
    Go through each page and make sure it is working as described.
     
  8. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
OK, so you really haven't understood the last post I typed...

I understand the two points you gave above, but I cannot apply them to compensate for the override issue, because of how my current calibration affects other things. This is the calibration I made WITHOUT the EDID override, and it is the correct calibration for my Xbox 360 as well.

A casual gamer would not care about the input lag of Black Level LOW, but I am not a casual gamer, and I aim to squeeze every bit of performance I can out of my hardware, including the display. So even though I "can" switch to Black Level LOW and solve the override issue, I WILL NOT, because it induces a large amount of input lag that I can detect. But yes, I use Lagom.nl for calibrating my TV.
     
    Last edited: Mar 19, 2013
  9. maco07

    maco07 Active Member

    Messages:
    96
    Likes Received:
    0
    GPU:
    7970 3GB Boost
Which exact brand and model is your display?
     
  10. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    LG 32LD450. 32" 1080p 60hz CCFL LCD (2010 model).
     

  11. advil000

    advil000 Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    Gigabyte 970 G1 Gaming
    I would like to post my exact experiences as to what was necessary to get my 47" Vizio 47XVT to accept 4:4:4 with an Nvidia 670 2GB card.

    There were THREE distinct steps necessary to finally get this working.

    1) A DVI to HDMI Cable. I had to use the DVI out on my 670 not the HDMI.

2) The EDID override. There are two methods commonly posted: one is to modify nv_disp.inf, and the other is a registry change that doesn't require reinstalling the video driver. In my case, on Windows 8, the REGISTRY override was the ONLY one that worked, and it worked perfectly. The TV now detects as a generic DVI monitor via the DVI-to-HDMI cable. You can verify this in the NV control panel. If the TV doesn't detect as a DVI device, it's not going to work in 4:4:4.
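As I understand it, what marks a display as an HDMI/TV device rather than a plain DVI monitor is the CEA-861 extension block in its EDID, which is why the override works by presenting an EDID without one. Here's a rough sketch (plain Python; the filename is hypothetical, and it assumes you have a raw binary EDID dump from any EDID export tool) that checks whether a dump still advertises that block:

```python
# Check whether a raw EDID dump contains a CEA-861 extension block.
# EDID base block: 128 bytes, starts with a fixed 8-byte header;
# byte 126 holds the number of 128-byte extension blocks that follow.
# An extension block whose first byte (tag) is 0x02 is CEA-861.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def has_cea_extension(edid: bytes) -> bool:
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID base block")
    ext_count = edid[126]
    for i in range(ext_count):
        block = edid[128 + i * 128 : 128 + (i + 1) * 128]
        if len(block) == 128 and block[0] == 0x02:  # tag 0x02 = CEA-861
            return True
    return False

# Hypothetical usage with a dump file:
# with open("my_tv_edid.bin", "rb") as f:
#     print(has_cea_extension(f.read()))
```

If this returns False for the EDID you are overriding with, the card should see the display as a generic DVI monitor, matching what the NV control panel shows after the override.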

3) Creating a custom resolution in the NV Control Panel. I simply created a custom 1920x1080 resolution at 61 Hz (so it doesn't conflict or get confused with the default 1920x1080 @ 60 Hz).

I have to admit, the real icing on the cake was having to create that custom resolution. It means that Nvidia's default resolution choices are somehow internally linked to outputting 4:2:2, which makes no sense at all.

For anyone wondering how much difference this makes: it is a NIGHT AND DAY quality difference when viewing text on your desktop. I used the "quick and dirty" test image, and it was absolutely obvious when the red and magenta text suddenly became crystal clear, just like on the Dell DVI monitor next to my TV. The difference in all uses is DRAMATIC. I used to think that sharpness setting 3 on my TV was "neutral" and anything below it was negative. Wrong. There is now only a TINY visual difference between sharpness 0 and 10 when viewing text. It's that much better.

This is a major, major issue when using a TV as a monitor. I wish I had known how to do this for the last 4 years I've owned this set, but with the TV makers not writing EDIDs correctly and the video card makers content to force 4:2:2 output whenever they even suspect the display device is a TV... we are all forced to live with quality substantially below what our TVs are capable of displaying. INSANE.
     
    Last edited: Mar 22, 2013
  12. Some Dillweed

    Some Dillweed Member

    Messages:
    24
    Likes Received:
    1
    GPU:
    RTX 3080 Ti
    Huh. Weird. All I had to do for my 42LK450 were the first two (using nv_disp), with no custom resolution required.
     
  13. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
Well, 4:4:4 can be carried over a native HDMI connection, and even at higher bits per component thanks to Deep Color and xvYCC.

The problem is how "traditional" PC connections are handled by drivers. DVI and VGA were made for the PC platform, while HDMI is essentially a bridge between home consumer electronics (DVD, Blu-ray, cable STB, etc.). DisplayPort is supposed to be HDMI's counterpart for the PC platform.

So DisplayPort "complements" HDMI in that sense. Even though both can carry the same audio and 4:4:4 video, they are aimed at their respective markets.
     
  14. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,464
    Likes Received:
    91
    GPU:
    Nvidia RTX 4080 FE
That link is dead... are there instructions on how to do this anywhere else? My monitor is HDMI-only and I'm using a DVI->HDMI cable.

I'm stuck with RGB, which looks horrible, or YCbCr444.
     
  15. maco07

    maco07 Active Member

    Messages:
    96
    Likes Received:
    0
    GPU:
    7970 3GB Boost
    Link is working, try again!
     

  16. maco07

    maco07 Active Member

    Messages:
    96
    Likes Received:
    0
    GPU:
    7970 3GB Boost
You can achieve the same result by doing this:

     
  17. pagusas

    pagusas Guest

    Messages:
    110
    Likes Received:
    0
    GPU:
    MSI GTX570
Another user, who posts on NeoGAF and here I believe, made this: http://blog.metaclassofnil.com/?p=83

It fixed my issues with the drivers outputting 4:2:2 and limited brightness (16-235).

It's a toggle, so you don't have to do any EDID registry work; it does it for you. But I know your issue is also about audio, and I haven't a clue about that.
     
  18. retiredat44

    retiredat44 Master Guru

    Messages:
    337
    Likes Received:
    11
    GPU:
    Nvidia 2080-ti
    I use Nvidia PNY GTX460 with HDMI to my Samsung 46" LCD HDTV and it automatically sets everything. Everything works fine in and out of games too.

    FYI
     
  19. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
This is not about the full-range issue (I use that utility for that, and it works perfectly), but about whether every pixel on your monitor gets full colour information (4:4:4), rather than the chroma (colour) channels being sent at half horizontal resolution while only the luma channel stays at full resolution (4:2:2).

HDMI is capable of 4:4:4 and SHOULD be capable of doing it with audio (the specification has enough bandwidth), but video cards treat HDMI as an HDTV connection by default, limited-range 16-235 and 4:2:2, because that's what most TVs can handle right now. All flat-panel HDTVs should be capable of full-range RGB over HDMI, but not 4:4:4, unless you use VGA, apply the EDID override to treat HDMI as DVI (HDMI without audio), or have a TV that can do both 4:4:4 and audio at the same time (only some Sonys and Samsungs).
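To see why fine magenta text smears under 4:2:2, here's a rough simulation in plain Python. It uses the standard full-range BT.601/JFIF RGB-YCbCr formulas and averages each horizontal pair of chroma samples, which is the gist of 4:2:2; it's a sketch of the effect, not what the driver literally does:

```python
# Simulate 4:2:2 on a 1-pixel-wide magenta/black stripe pattern:
# luma (Y) stays per-pixel, but Cb/Cr are averaged over each horizontal
# pair, so the colour information smears across both pixels.

def rgb_to_ycbcr(r, g, b):  # BT.601/JFIF, full range
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clip = lambda v: max(0, min(255, round(v)))
    return clip(r), clip(g), clip(b)

def subsample_422(row):
    ycc = [rgb_to_ycbcr(*px) for px in row]
    out = []
    for i in range(0, len(ycc), 2):       # one shared Cb/Cr per 2 pixels
        (y0, cb0, cr0), (y1, cb1, cr1) = ycc[i], ycc[i + 1]
        cb, cr = (cb0 + cb1) / 2, (cr0 + cr1) / 2
        out += [ycbcr_to_rgb(y0, cb, cr), ycbcr_to_rgb(y1, cb, cr)]
    return out

row = [(255, 0, 255), (0, 0, 0)] * 4      # magenta/black stripes
print(subsample_422(row))                 # magenta loses saturation,
                                          # black picks up a purple tint
```

The brightness pattern survives (Y is untouched), but the sharp magenta/black colour edges are gone, which is exactly the blurry-text symptom the 4:4:4 test images reveal.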
     
    Last edited: Apr 1, 2013
  20. TheSarge

    TheSarge Guest

    Messages:
    812
    Likes Received:
    17
    GPU:
    EVGA RTX 3080 TI FT
Whether you can output sound through a DVI-to-HDMI path depends on whether the video card is being fed audio or not. Even as far back as the old GTX 285 cards, there was a way to feed the card an audio signal. In the 285's case, despite there being no native HDMI port on the card, it did have a 2-pin S/PDIF audio-in header on its top edge. Nowadays, the HDMI audio signal rides to the card on a glorious, glorious PCIe lane while advertisers and HDMI evangelists sing its praises, but back in the day the lowly 2-pin S/PDIF cable did the job just as well.

Not that I ever use HDMI audio. I run my audio out through the optical S/PDIF (TOSLINK) line.
     
