So this 4:4:4 problem in a nutshell (HDMI)

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Mda400, Mar 13, 2013.

  1. Mda400

    Mda400 Master Guru

    Messages:
    971
    Likes Received:
    129
    GPU:
    3080Ti 1950/21.4G
    In summary, what I meant is that I calibrate for a PC and I am content with how the Xbox 360 looks after calibrating with a PC in mind. This is because sharpness will be pixel-perfect (for a TRUE 1080p image, not an upscaled 720p one), contrast and brightness will be correct (unlike the Xbox 360, which has flagged improper RGB levels over HDMI since the Dec. 4th, 2011 update), and color saturation can be set as high as possible without bleeding (using color bars to calibrate).

    Some TVs' gamma setting is not an actual gamma correction feature but an "enhancement," just like Noise Reduction and Dynamic Contrast; LG and Sony sets are examples. On Sonys you can actually turn gamma OFF (which makes it clear it's an enhancement and not a correction feature). On an LG there is no off setting, but input delay gets worse as you move to 2.2 and 2.4. The LOW/1.9 gamma setting yields the lowest input lag while still letting me properly calibrate white saturation and black level (it also yields the best contrast ratio on my screen, once I turn down the backlight of course...).

    It was not too high, because once I switched to the LOW black level setting it still wasn't contrasted enough to reach an acceptable range. Again, the EDID override basically makes the GPU disregard what your display reports it supports and forces the full output for that type of cable. HDMI is basically DVI with audio, so the override disables audio and forces full 4:4:4 just like a normal DVI connection would. But in doing so, it maps the signal incorrectly for my display, since the display no longer tells the GPU how to handle the signal.

    You can't. 60Hz is where PC-generated content is common (consoles fall into the PC category too), and since the PC is the only electronics device that comes remotely close to having the bandwidth and computing power to output 4:4:4 in real time, you won't see it on a device that is designed to display content below that refresh rate.

    For example, 24hz is commonly used in what? Movies/Video.

    The cinematic standard we have all seen for decades in theaters and on DVD/Blu-ray is where that refresh rate is common. That is why, when Blu-ray players even GIVE the option between 24Hz and 60Hz, it is for you to choose between a better representation of color (the 4:2:0 from Blu-ray/DVD presented in an accurate 4:4:4 window at 60Hz) or smooth, judder-free playback of the 4:2:0 source in a 4:2:2 window at 24Hz (though not as accurate and sharp as 4:4:4, the difference is hardly noticeable, so use 24Hz and be happy with the smooth video).

    So that is why your TV only switches to "PC mode" when it detects a 60Hz (50Hz for PAL regions) refresh rate being sent to it.
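
    To put rough numbers on the chroma formats mentioned above, here is a back-of-the-envelope Python sketch (assumed values: 1080p60 at 8 bits per component, active pixels only, so real HDMI link rates with blanking and TMDS encoding overhead are higher):

    # Rough payload comparison for 1080p60 at 8 bits per component.
    # Active pixels only; an actual HDMI link also carries blanking and
    # TMDS encoding overhead, so these numbers understate the link rate.
    width, height, fps, bit_depth = 1920, 1080, 60, 8

    bits_per_pixel = {
        "4:4:4": 3.0 * bit_depth,   # full chroma resolution
        "4:2:2": 2.0 * bit_depth,   # chroma halved horizontally
        "4:2:0": 1.5 * bit_depth,   # chroma halved both ways
    }

    for fmt, bpp in bits_per_pixel.items():
        gbps = width * height * fps * bpp / 1e9
        print(f"{fmt}: {gbps:.2f} Gbit/s of active video")

    The point is only the relative payload: 4:2:2 carries half the chroma samples of 4:4:4, and 4:2:0 carries a quarter.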
     
    Last edited: Apr 8, 2013
  2. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,282
    Likes Received:
    210
    GPU:
    MSI RTX 2080
    Same here, which is why I prefer cinematic games at 30FPS. It's not 24FPS, but it's still better than 60FPS for cinematic games.



    ALSO: All video (film, anime, TV, etc.) content is 4:2:0 or 4:2:2 anyway, so you shouldn't be losing anything by using 24Hz mode, because most of the loss is already there. And when you watch stuff on your BD player, it's probably using the same 4:2:2 mode to display that kind of content.
     
    Last edited: Apr 8, 2013
  3. flamey

    flamey New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    Msi GTX570
    Indeed, a bit of a trade-off is required until someone actually releases a TV capable of 4:4:4 at 24Hz. There is some benefit in chroma upscaling to 4:4:4 with a good algorithm (as Blu-ray players upscale to 4:2:2 anyway); here is a good example. However, the trade-off is using 60Hz and therefore enabling smooth motion in madVR.
     
  4. Mda400

    Mda400 Master Guru

    Messages:
    971
    Likes Received:
    129
    GPU:
    3080Ti 1950/21.4G
    I have just submitted a Driver feedback report on Nvidia's site about this issue with not allowing 4:4:4 video and LPCM audio at the same time over HDMI, so hopefully this problem gets cleared up in the future.

    I just don't see why they haven't supported it from the start, or why they have done nothing to fix it. On another forum, they said Nvidia didn't see this chroma subsampling issue as a "problem". But how can you limit a 4:4:4-capable interface such as HDMI (which has the bandwidth for audio at the same time) and not expect to get flak for it?
     

  5. maco07

    maco07 Active Member

    Messages:
    96
    Likes Received:
    0
    GPU:
    7970 3GB Boost
    I can send audio and 4:4:4 video through HDMI without issue, but I know this is because my LCD supports it. I used to have an LG 42LW4500 and it didn't support audio and 4:4:4 at all. It isn't 100% Nvidia's fault, I think.
     
  6. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,282
    Likes Received:
    210
    GPU:
    MSI RTX 2080
    Yes, it's half the TV manufacturers' fault, with the way they design their TVs and EDIDs.
     
  7. flamey

    flamey New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    Msi GTX570
    Same as maco07: I have both 4:4:4 and audio over HDMI, but I have one of the few displays that supports this without any hacks (it's a Samsung).

    I assume using the hacks to get around the EDID creates an issue with the audio.
     
  8. Extraordinary

    Extraordinary Ancient Guru

    Messages:
    19,562
    Likes Received:
    1,629
    GPU:
    ROG Strix 1080 OC
    Is this what you're talking about?

    [IMG]

    Never really looked into what it was about before.
     
    Last edited: Apr 18, 2013
  9. Mda400

    Mda400 Master Guru

    Messages:
    971
    Likes Received:
    129
    GPU:
    3080Ti 1950/21.4G
    Unlike some, I'm not hooking directly into the TV. I'm going from PC to HDMI receiver, then to the TV. In this case, the receiver's EDID is being read: the receiver takes the audio portion and the TV receives the video portion via passthrough (the receiver does NOT touch the video stream at all). I have tried going just PC to HDTV and I still only get 4:2:2 with audio, so it's not a limitation of the receiver passing 4:4:4 with audio.

    It's possible for them to implement a way to force off reading the EDID, just like the Xbox 360 does with its Display Discovery setting (even though it always still reads SOME portion of the EDID).

    I still think this is a driver issue BECAUSE if I uninstall ALL Nvidia High Definition Audio drivers (4 of them) and prevent the generic Microsoft ones from being installed, the output is still detected as HDMI and limited to 4:2:2. The EDID override forces your display's EDID string into a driver installation, so THEN the GPU knows EXACTLY what it's dealing with.

    Doing so, however, disables audio, because there's no portion of the video driver's .inf that controls the audio bits; that is provided by the separate HDMI audio driver.

    This isn't an issue for them like overvoltage was, where they had to lock it down for GTX 600+ users because replacements cost them a lot of money. This is about giving users full control over how content is displayed.

    If you set an unsupported resolution on a digital display, what happens? It just doesn't work, because in the digital world a bit is either ON or OFF. If the bits don't match the source, they won't flip; no harm, no foul.

    With analog it could be different, but setting an unsupported resolution on a CRT just makes it show an "out of range" message. Since analog carries a risk of voltage spikes, THAT could be dangerous, but even VGA reads an EDID over DDC (the control channel for the VGA interface), and the GPU sets itself correctly over VGA for the display it's connected to.

    In summary, this is why I am asking them to provide a simple on/off setting (roughly where the HDMI audio option was) under "Change Resolution" in the NV control panel. It's totally doable, because the Xbox 360 does it perfectly (even though I'm not sure it outputs 4:4:4) by letting users disable Display Discovery; it even NOTES that if you disable Display Discovery and then choose settings not supported by your display, output quality may be degraded (meaning no picture, or 2-channel audio instead of 5.1 Dolby Digital). They could make it available only for HDMI too, since that's what the Xbox 360 does, and I don't see how this would harm any components...
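
    For anyone curious what EDID the OS actually has on file for a display (the string an .inf-based override ends up replacing), here is a minimal sketch, assuming Windows and Python; it only walks the standard Enum\DISPLAY registry location, nothing vendor-specific:

    # Dump the EDID blocks Windows has stored for each connected display.
    import winreg

    ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):          # monitor IDs
            mon_id = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, mon_id) as mon:
                for j in range(winreg.QueryInfoKey(mon)[0]):      # instances
                    inst = winreg.EnumKey(mon, j)
                    try:
                        with winreg.OpenKey(mon, inst + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                    except OSError:
                        continue
                    if len(edid) < 128:
                        continue
                    # Byte 126 of the base block = number of extension blocks;
                    # a CEA-861 extension is what carries HDMI audio/YCbCr caps.
                    print(f"{mon_id}\\{inst}: {len(edid)} bytes, "
                          f"{edid[126]} extension block(s)")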
     
    Last edited: Apr 18, 2013
  10. Mda400

    Mda400 Master Guru

    Messages:
    971
    Likes Received:
    129
    GPU:
    3080Ti 1950/21.4G
    (continued)

    If my TV supports 4:4:4 over VGA (I just tried it yesterday), it supports it over my HDMI 1/DVI port when that input is renamed to "PC".

    If I run Moninfo and look at my receiver's real-time EDID reading, it shows support for YCbCr 4:4:4 (and if it supports YCbCr 4:4:4, it supports RGB 4:4:4), as well as 6-channel LPCM up to 192kHz.

    So what I'm saying, guys/girls, is that if they allowed the simple option of not reading the entire EDID (since it sounds like how it's built varies by manufacturer), it would let us power users set things accordingly. I always know exactly what my components support, and EVERYONE should when shopping for new electronics...
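
    As a concrete illustration of the kind of information Moninfo is reporting, here is a minimal Python sketch (not Moninfo's or Nvidia's code; the file name is just an example for a raw EDID dump): the flag byte of the CEA-861 extension block is where a sink advertises YCbCr 4:4:4, YCbCr 4:2:2, and basic audio support.

    # Check a raw EDID dump for the CEA-861 extension flags (rev 2+).
    def cea_caps(path="receiver_edid.bin"):      # example file name
        data = open(path, "rb").read()
        n_ext = data[126]                        # number of extension blocks
        for i in range(n_ext):
            block = data[128 * (i + 1): 128 * (i + 2)]
            if block and block[0] == 0x02:       # CEA-861 extension tag
                flags = block[3]
                return {
                    "basic_audio": bool(flags & 0x40),
                    "ycbcr_444":   bool(flags & 0x20),
                    "ycbcr_422":   bool(flags & 0x10),
                }
        return None  # no CEA extension found (pure DVI-style EDID)

    print(cea_caps())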



    YES, this is mostly what I'm talking about (I have a laptop with an AMD GPU and HDMI), but I have tried it with my laptop and I'm still limited to 4:2:2 when tested. Reading that EDID messes it all up for both GPU manufacturers. Create an option to disable reading certain portions of the EDID (exactly like the Xbox 360 does) and let us more knowledgeable users configure it ourselves.

    For the inexperienced who wouldn't know what the feature does, I'd hope they'd have the sense not to touch it (THE BIG RED BUTTON), even though it wouldn't harm their displays.

    New computer users would just be greeted with a black screen, or 4:4:4 into a 4:2:2 display (a loss of color info anyway), and would just have to wait for it to revert back to the original settings.
     
    Last edited: Apr 18, 2013

  11. Extraordinary

    Extraordinary Ancient Guru

    Messages:
    19,562
    Likes Received:
    1,629
    GPU:
    ROG Strix 1080 OC
    I've got ITC processing enabled if that makes any difference
     
  12. Mda400

    Mda400 Master Guru

    Messages:
    971
    Likes Received:
    129
    GPU:
    3080Ti 1950/21.4G
    I'll look into that tonight. I don't think I've ever seen that feature in the control panel, but if it's there and fixes the issue for my laptop, then it's Nvidia's issue.

    Right now, I've got the Mini HDMI-to-HDMI cable going to my TV, with the EDID override enabled for it (to achieve 4:4:4).

    THEN I've got a separate DVI/HDMI connector with an HDMI cable going to my HDMI receiver (for 5.1 LPCM audio), because the EDID override is only enabled on the HDMI port, which lets me turn on audio for the DVI port separately.

    This "extra" workaround, gets me 4:4:4 and audio over cleaner digital connections (before I was using VGA to get 4:4:4 and using the Mini HDMI-to-HDMI to to my receiver for audio, which VGA with clock and phase set, still looks like crap on a flat-panel). But I still want to get to the bottom of this problem of using a single HDMI connection.
     
    Last edited: Apr 19, 2013
  13. Extraordinary

    Extraordinary Ancient Guru

    Messages:
    19,562
    Likes Received:
    1,629
    GPU:
    ROG Strix 1080 OC
    If it helps, this is the location, obviously only available when the HDMI display is enabled (I use my HDMI TV as my extended display, DVI for normal monitor)


    [IMG]
     
  14. Supertribble

    Supertribble Master Guru

    Messages:
    869
    Likes Received:
    118
    GPU:
    1080Ti/RadVII/2070S
    I wish those options were available in the Nvidia driver. :bang:
     
  15. elMARIACHI

    elMARIACHI Member

    Messages:
    30
    Likes Received:
    0
    GPU:
    3x Sapphire R290x
    After successfully completing the EDID override, I seem to have somewhat got this working. I've been referring to the tinted-blue RGB calibration picture to see if 4:4:4 is indeed functioning properly, and finally the text "RED" and "MAGENTA" is now clearly visible, just like on my DVI monitor. However, I had to create a custom resolution of 60.550Hz for this to take effect, which results in a fuzzy screen. 60.550Hz is the lowest setting I can choose that properly displays 4:4:4 while not exceeding my display's maximum pixel clock of 150MHz; 61Hz, for example, surpasses that ceiling. My picture quality looks much worse now than it did before. I also applied the 0-255 fix and spent quite a bit of time playing with picture settings such as Contrast/Brightness/Sharpness to see if that would somehow fix the problem. I've now spent days trying to get my 42" Vizio VU42L picture perfect. Does anyone have any advice as to what I can do from this point... besides, of course, getting a new TV? ;-p
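
    For what it's worth, that 150MHz ceiling can be sanity-checked with a quick pixel-clock calculation. A rough Python sketch follows; the 2200x1125 totals assume the standard CEA-861 1080p timing, which may not match the custom timing actually in use:

    # Check whether a custom 1920x1080 mode stays under a pixel-clock ceiling.
    H_TOTAL, V_TOTAL = 2200, 1125        # standard CEA-861 1080p totals (assumed)
    MAX_PIXEL_CLOCK_MHZ = 150.0          # the limit reported above

    for refresh_hz in (60.0, 60.550, 61.0):
        clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
        ok = "OK" if clock_mhz <= MAX_PIXEL_CLOCK_MHZ else "too high"
        print(f"{refresh_hz:.3f} Hz -> {clock_mhz:.2f} MHz ({ok})")

    With those totals, 61Hz lands just over 150MHz while 60.550Hz squeaks under, which matches what is described above.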
     

  16. maco07

    maco07 Active Member

    Messages:
    96
    Likes Received:
    0
    GPU:
    7970 3GB Boost
    1) Turn your HDMI Level setting to "NORMAL" (it is LOW by default). This tells your TV that the signal is 0-255 instead of 16-235 (see the sketch after this list).

    2) Calibrate the white point to 6500K (Warm2 is the closest on most TVs)
    3) Calibrate Brightness, Contrast and Sharpness using the Lagom site: http://www.lagom.nl/lcd-test/
    4) Enjoy
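
    As a rough illustration of what that levels setting implies, here is a minimal Python sketch (the numbers are the standard limited-range video levels, not anything specific to this TV): it maps limited-range 16-235 values onto the full 0-255 range a PC normally outputs. If the GPU and TV disagree on which range is in use, blacks end up either crushed or washed out.

    # Map a limited-range (16-235) video value to full range (0-255).
    def limited_to_full(v):
        return max(0, min(255, round((v - 16) * 255 / 219)))

    for v in (16, 128, 235):
        print(f"limited {v:3d} -> full {limited_to_full(v)}")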
     
  17. Mda400

    Mda400 Master Guru

    Messages:
    971
    Likes Received:
    129
    GPU:
    3080Ti 1950/21.4G
    It sounds like by using a refresh rate higher than 60Hz, you are getting garbled sharpness on your screen. So I would not use that custom-resolution trick to force 4:4:4 (the same thing has happened to me using a DVI cable on an actual monitor, just for ****s and giggles).

    Go here for the proper ways to override your EDID: LCD Televisions with 4:4:4

    After either of those tricks to override your EDID, just restart and it should work without having to make a custom refresh rate...



    Also, to everyone else: I think the actual reason graphics cards downgrade the chroma output to 4:2:2 is that the connected display reports support for 24Hz.

    24Hz is only used with movie material for that cinematic motion feel, and sources like Blu-ray or DVD only carry 4:2:0 chroma. The current HDMI revisions cannot send native 4:2:0, so 4:2:2 is the next best format, and thus everything else gets downgraded to 4:2:2 to stay compatible with the 24Hz option.

    Once I did the EDID override and my TV showed up as a DVI display, I no longer had any refresh rate other than 60Hz, and that's where 4:4:4 works on TVs that only go into "PC mode" with a 60Hz signal. So the newer TVs that can accept 4:4:4 and audio simultaneously over HDMI most likely have an updated way of distinguishing PC resolutions/refresh rates (DVI) from consumer-electronics resolutions/refresh rates (HDMI, with possibilities like 24Hz and naming like 720p instead of 1280x720) in their EDID.
     
    Last edited: Apr 22, 2013
  18. Extraordinary

    Extraordinary Ancient Guru

    Messages:
    19,562
    Likes Received:
    1,629
    GPU:
    ROG Strix 1080 OC
    That site says my 16:9 1920x1080 LCD is an unusual resolution and aspect ratio for a monitor?! How old is that site?
     
  19. Mda400

    Mda400 Master Guru

    Messages:
    971
    Likes Received:
    129
    GPU:
    3080Ti 1950/21.4G
    It hasn't been updated since 2008, so don't worry. If you Ctrl+mouse-wheel zoom until it says 1280x720, it will say something about the format being an HDTV (not PC monitor) standard. So it can distinguish the older HDTV resolution "standards", but not the "Full HD" standard of today's TVs.
     
  20. Extraordinary

    Extraordinary Ancient Guru

    Messages:
    19,562
    Likes Received:
    1,629
    GPU:
    ROG Strix 1080 OC
    Ah ok, cheers :)
     
