Guru3D.com Forums

So this 4:4:4 problem in a nutshell (HDMI)
  (#1)
Mda400
Master Guru
 
Mda400's Avatar
 
Videocard: GTX 480 SLI 878/1756/2187
Processor: Core 2 Quad Q9550 @ 4 ghz
Mainboard: EVGA 790i SLI FTW PWM
Memory: 8 GB Vengeance 2000mhz
Soundcard: LG LHB326 Receiver (HDMI)
PSU: Ultra X4 1050w (76A 12v)
Default So this 4:4:4 problem in a nutshell (HDMI) - 03-13-2013, 02:45 | posts: 375 | Location: Minnesota

Last night I got curious and did the EDID override trick to enable 4:4:4, just for kicks. Of course it works, but with no audio (I have an LG HDMI Blu-ray combo receiver).

My TV is an LG 32LD450, an affordable 32" 1080p set from 2010. It's also known as a series from LG that can do 4:4:4 chroma for use as a PC display. You enable this "support" by renaming the HDMI 1 input (labeled HDMI/DVI 1, with DVI being the hint) to "PC".

Now, this acts as just a "window": it doesn't "create" the 4:4:4 sampling that I thought my PC was already doing fine. My GTX 480 is hooked up with the mini-HDMI 1.3 cable that came with the card, and I have it set to RGB in the NVCP.

Of course, this does nothing: the TV still thinks the PC is a TV device, goes "OH HEY, a PC... I mean, TV 4:2:2 source!", and downgrades the output to 4:2:2. With the EDID override trick, you are basically permanently telling your graphics card that the display is a "monitor" and not a TV. Making a custom resolution under PC doesn't work either.

Now, what I THINK is the whole issue with getting 4:4:4 support and audio over an HDMI-to-HDMI cable (not DVI/HDMI-to-HDMI, which tricks the display) is this: a TV that supports 24 Hz as a refresh rate (24 Hz is commonly used in cinematography, where 4:2:0/4:2:2 is prominent) will have all other refresh rates downgraded to 4:2:2 once the GPU reads the TV as 24 Hz capable. With HDMI extensions, manufacturers structure their display EDIDs in different ways. So the very few LG, Samsung, or Sony displays that do support 4:4:4 plus audio over HDMI are the ones whose manufacturers acknowledged this little flaw.
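To make the chroma part concrete, here's a toy sketch (Python; made-up sample values, not what any real TV scaler does) of why thin red detail smears once horizontal chroma resolution is halved to 4:2:2:

```python
# Toy model of 4:2:2 chroma subsampling on a single scanline.
# 4:2:2 keeps full luma but halves horizontal chroma resolution,
# which is why thin red text on gray smears at the edges.

def subsample_422(chroma):
    """Average each horizontal pair of chroma samples (the 2:2 part)."""
    return [(chroma[i] + chroma[i + 1]) / 2 for i in range(0, len(chroma), 2)]

def upsample(chroma_half):
    """Reconstruct by repeating each stored sample (nearest neighbour)."""
    out = []
    for c in chroma_half:
        out += [c, c]
    return out

# One scanline: a 2-pixel "red" stroke (Cr = 200) on neutral gray (Cr = 128),
# deliberately straddling a sample-pair boundary, as text usually does.
line = [128, 128, 128, 200, 200, 128, 128, 128]
rebuilt = upsample(subsample_422(line))

print(line)
print(rebuilt)  # the sharp 128 -> 200 edges become intermediate 164s
```

If the stroke happened to align exactly with a chroma pair it would survive intact, which is part of why the blur looks inconsistent across a line of text.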

GeForce owners who know about this problem have told other answer-seeking owners to submit a hardware bug report to Nvidia. This is where I don't think it's Nvidia's fault: it's basically the standards as laid down, and how EDIDs vary from display to display. HDMI without a DVI adapter can natively do full-chroma RGB or YCbCr.
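For reference, the capability flags in question live in the CEA/CTA-861 extension block of the EDID (the second 128-byte block on a TV), and they're easy to inspect. A minimal sketch (Python; the sample bytes are hypothetical, not dumped from a real LG set):

```python
# Inspect the support flags in a CEA/CTA-861 EDID extension block.
# Byte 0 is the tag (0x02), byte 1 the revision, byte 2 the offset to
# the detailed timing descriptors, and byte 3 packs the capability flags:
#   bit 7: underscan    bit 6: basic audio
#   bit 5: YCbCr 4:4:4  bit 4: YCbCr 4:2:2

def cea_flags(ext_block):
    if ext_block[0] != 0x02:
        raise ValueError("not a CEA extension block")
    b = ext_block[3]
    return {
        "basic_audio": bool(b & 0x40),
        "ycbcr_444": bool(b & 0x20),
        "ycbcr_422": bool(b & 0x10),
    }

# Hypothetical extension header: audio + 4:2:2 advertised, no 4:4:4 --
# roughly the situation described in this thread.
fake_ext = bytes([0x02, 0x03, 0x20, 0x50]) + bytes(124)
print(cea_flags(fake_ext))
```

Dumping your TV's real EDID (e.g. with Moninfo) and checking byte 3 of the extension block shows exactly what the graphics driver is being told.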

So that is my assumption, and it sucks if it's true. But the REAL question is: if I get a DVI/HDMI-to-HDMI CABLE or a DVI/HDMI CONVERTER (with a separate HDMI cable), does the GTX 400 series and up (since audio is now carried over PCIe and not S/PDIF) provide LPCM 5.1 over a DVI connector on the card the way the mini-HDMI can?

The NVCP has three different fields for digital audio (DVI, DVI, HDMI, since those are the connections on my card). Currently it shows LG-BDHT in the HDMI field, since I am hooked to my HDMI receiver through my GPU's mini-HDMI connection. Again, this does NOT provide the 4:4:4 support that my TV CAN do. This is why I need to go DVI/HDMI and find out whether LPCM 5.1 is supported over those DVI ports.

THANKS FOR READING THE LOOOOOOONG POST!

Last edited by Mda400; 05-02-2013 at 19:52.
   
  (#2)
JJayzX
Master Guru
 
JJayzX's Avatar
 
Videocard: Evga GTX 660 FTW Sig2 Sli
Processor: i7 4770k
Mainboard: Gigabyte Z87X-UD3H
Memory: RipjawsX 8gb DDR3 2133
Soundcard: Creative Sound Blaster Z
PSU: Corsair GS700
Default 03-14-2013, 02:13 | posts: 365 | Location: RI, USA

DVI is HDMI minus sound, so unless you can still connect the HDMI cable to get sound you might be out of luck.
   
  (#3)
maco07
Member Guru
 
Videocard: 7970 3GB Boost
Processor: FX-8350
Mainboard: M5A88-V EVO
Memory: DDR3 8 GB
Soundcard: Xonar D2X
PSU: Silent Pro 700w
Default 03-14-2013, 17:05 | posts: 94 | Location: Argentina

Quote:
Originally Posted by JJayzX View Post
DVI is HDMI minus sound, so unless you can still connect the HDMI cable to get sound you might be out of luck.
Not always. I used a DVI-HDMI cable in the past and sound worked perfectly.

Mda400: I had an LG 42LW4500 and had the same problem as you. The EDID solution was the only way to fix it, but you lose sound over HDMI.
   
  (#4)
JaylumX
Master Guru
 
JaylumX's Avatar
 
Videocard: HD 7950 @ 1150/1600
Processor: i5 760 @ 3.8
Mainboard: ASUS P7P55D-E
Memory: 12GB Corsair 9-9-9-24
Soundcard: Xonar HDAV 1.3/AV40
PSU: Corsair 750W PSU
Default 03-14-2013, 17:27 | posts: 490 | Location: Somewhere Else

Create a custom resolution in the NVCP and raise the refresh rate by 1 Hz, e.g. 1920x1080@61Hz. That will force the correct colours as well as keep your audio.
   
  (#5)
maco07
Member Guru
 
Videocard: 7970 3GB Boost
Processor: FX-8350
Mainboard: M5A88-V EVO
Memory: DDR3 8 GB
Soundcard: Xonar D2X
PSU: Silent Pro 700w
Default 03-14-2013, 17:29 | posts: 94 | Location: Argentina

Quote:
Originally Posted by JaylumX View Post
Create a custom resolution in the NVCP and raise the refresh rate by 1 Hz, e.g. 1920x1080@61Hz. That will force the correct colours as well as keep your audio.
It works if you create a custom resolution only 0.001 Hz higher, e.g. 1920x1080@60.001Hz.

Using 61 Hz can cause some tearing.
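For a sense of scale: the pixel clock for a mode is total pixels per line (including blanking) times total lines times refresh rate. A quick sketch assuming the standard CEA-861 1080p timing of 2200x1125 total (a custom resolution may generate different CVT/GTF timings, so treat the exact figures as illustrative):

```python
# Pixel clock for a video mode = total horizontal pixels (incl. blanking)
# * total lines * refresh rate.  Standard CEA-861 1080p timing is
# 2200 x 1125 total at 60 Hz -> 148.5 MHz.

H_TOTAL, V_TOTAL = 2200, 1125

def pixel_clock_hz(refresh_hz):
    return H_TOTAL * V_TOTAL * refresh_hz

for hz in (60, 60.001, 61):
    print(f"{hz} Hz -> {pixel_clock_hz(hz) / 1e6:.4f} MHz")

# 60.001 Hz shifts the clock by only ~2.5 kHz, while 61 Hz is a
# 2.475 MHz jump -- enough that some displays show tearing/artifacts.
```

Either way the driver stops treating the mode as the stock TV timing, which is why the tiny 0.001 Hz bump is the gentler trick.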
   
  (#6)
Mufflore
Ancient Guru
 
Mufflore's Avatar
 
Videocard: KFA2 Anarchy 580@930/4650
Processor: 2500K @ 4.5GHz - blew it!
Mainboard: Gigabyte P67 UD4 B3
Memory: 8G Kngston 2.2GHz CL11 1T
Soundcard: Minimax+ & Dexa Opamps !!
PSU: Corsair Pro AX750
Default 03-14-2013, 17:31 | posts: 9,769 | Location: UK

Quote:
Originally Posted by maco07 View Post
Not always. I used a DVI-HDMI cable in the past and sound worked perfectly.
Yeah, dual-link DVI ports on gfx cards carry digital audio if used for HDMI.
The DVI-to-HDMI leads you buy these days should link the audio connections, but you can buy older ones that don't, FYI.
   
  (#7)
maco07
Member Guru
 
Videocard: 7970 3GB Boost
Processor: FX-8350
Mainboard: M5A88-V EVO
Memory: DDR3 8 GB
Soundcard: Xonar D2X
PSU: Silent Pro 700w
Default 03-14-2013, 17:51 | posts: 94 | Location: Argentina

Quote:
Originally Posted by Mufflore View Post
Yeah, dual-link DVI ports on gfx cards carry digital audio if used for HDMI.
The DVI-to-HDMI leads you buy these days should link the audio connections, but you can buy older ones that don't, FYI.
you are right
   
  (#8)
Some Dillweed
Newbie
 
Videocard: GTX 770
Processor: i5 2500K@4.5
Mainboard: P8P67 Pro
Memory: 8GB G.Skill Sniper
Soundcard: Titanium HD
PSU: Corsair AX750
Default 03-14-2013, 22:13 | posts: 18

Quote:
Originally Posted by JaylumX View Post
Create a custom resolution in the NVCP and make it 1mhz higher - e.g 1920x1080@61hz. That will force the correct colours as well as keep your audio
Quote:
Originally Posted by maco07 View Post
it works if you create a custom resolution of 0.001 Hz, eg 1920x1080@60.001Hz

Using 61Hz can cause some tearing.
This doesn't necessarily fix the 4:4:4 chroma issue, though; it didn't for me. The only way to enable it was to use my DVI-HDMI cable and do an EDID override (like Mda400 did), adding the required lines to nv_disp.inf so that the card treats the display as a DVI full-RGB monitor instead of an HDMI display, which of course disables the audio signal. I have yet to find a workaround that allows both 4:4:4 and HDMI audio, besides just using your mobo/soundcard audio, which is what I already do.
   
  (#9)
Prophet
Master Guru
 
Prophet's Avatar
 
Videocard: Msi 680 Gtx Twin Frozr
Processor: Intel Sb@4.7
Mainboard: Asus P8Z68V Progen3
Memory: 12 Gb Kingston
Soundcard: Asus Essence STX|Akg k701
PSU: Corsair 1200w
Default 03-15-2013, 00:26 | posts: 575 | Location: Heaven

Quote:
Originally Posted by Mda400 View Post
Last night I got curious and did the EDID override trick to enable 4:4:4, just for kicks. Of course it works, but with no audio (I have an LG HDMI Blu-ray combo receiver).

My TV is an LG 32LD450, an affordable 32" 1080p set from 2010. It's also known as a series from LG that can do 4:4:4 chroma for use as a PC display. You enable this "support" by renaming the HDMI 1 input (labeled HDMI/DVI 1, with DVI being the hint) to "PC".

Now, this acts as just a "window": it doesn't "create" the 4:4:4 sampling that I thought my PC was already doing fine. My GTX 480 is hooked up with the mini-HDMI 1.3 cable that came with the card, and I have it set to RGB in the NVCP.

Of course, this does nothing: the TV still thinks the PC is a TV device, goes "OH HEY, a PC... I mean, TV 4:2:2 source!", and downgrades the output to 4:2:2. With the EDID override trick, you are basically permanently telling your graphics card that the display is a "monitor" and not a TV. Making a custom resolution under PC doesn't work either.

Now, what I THINK is the whole issue with getting 4:4:4 support and audio over an HDMI-to-HDMI cable (not DVI/HDMI-to-HDMI, which tricks the display) is that TV manufacturers don't make display profile drivers (the ones that make Windows report your display as "LG TV" instead of "Generic PnP Monitor"), so the graphics driver doesn't know that the display is capable of both sound AND full chroma.

GeForce owners who know about this problem have told other answer-seeking owners to submit a hardware bug report to Nvidia. This is where I don't think it's Nvidia's fault: it's basically the standards as laid down, and how Windows handles color depth. HDMI without a DVI adapter can natively do full-chroma RGB or YCbCr, but a PC is more complex and needs to be guided with "drivers", these being what I think are the display profile drivers.

Nvidia gives you the basic color format options (RGB or YCbCr444) in the control panel, but then relies on Windows having an accurate display profile for your TV, which, again, most manufacturers do not supply.

So that is my assumption, and it sucks if it's true. But the REAL question is: if I get a DVI/HDMI-to-HDMI CABLE or a DVI/HDMI CONVERTER (with a separate HDMI cable), does the GTX 400 series and up (since audio is now carried over PCIe and not S/PDIF) provide LPCM 5.1 over a DVI connector on the card the way the mini-HDMI can?

The NVCP has three different fields for digital audio (DVI, DVI, HDMI, since those are the connections on my card). Currently it shows LG-BDHT in the HDMI field, since I am hooked to my HDMI receiver through my GPU's mini-HDMI connection. Again, this does NOT provide the 4:4:4 support that my TV CAN do. This is why I need to go DVI/HDMI and find out whether LPCM 5.1 is supported over those DVI ports.

THANKS FOR READING THE LOOOOOOONG POST!
How to do the edid trick?
   
  (#10)
Some Dillweed
Newbie
 
Videocard: GTX 770
Processor: i5 2500K@4.5
Mainboard: P8P67 Pro
Memory: 8GB G.Skill Sniper
Soundcard: Titanium HD
PSU: Corsair AX750
Default 03-15-2013, 04:10 | posts: 18

This is the method that worked for me: http://files.bortweb.com/how_to_fix_...en_or_text.htm

For the Windows 7 drivers, I just added the line under [nv_commonBase_addreg__01] and [nv_commonBase_addreg__02]. Once the drivers were installed and I rebooted, 4:4:4 was working properly.
   
  (#11)
Prophet
Master Guru
 
Prophet's Avatar
 
Videocard: Msi 680 Gtx Twin Frozr
Processor: Intel Sb@4.7
Mainboard: Asus P8Z68V Progen3
Memory: 12 Gb Kingston
Soundcard: Asus Essence STX|Akg k701
PSU: Corsair 1200w
Default 03-15-2013, 08:48 | posts: 575 | Location: Heaven

Quote:
Originally Posted by Some Dillweed View Post
This is the method that worked for me: http://files.bortweb.com/how_to_fix_...en_or_text.htm

For the Windows 7 drivers, I just added the line under [nv_commonBase_addreg__01] and [nv_commonBase_addreg__02]. Once the drivers were installed and I rebooted, 4:4:4 was working properly.
So it should look like this?

I don't know if it's actually working; I still get sound.
   
  (#12)
Some Dillweed
Newbie
 
Videocard: GTX 770
Processor: i5 2500K@4.5
Mainboard: P8P67 Pro
Memory: 8GB G.Skill Sniper
Soundcard: Titanium HD
PSU: Corsair AX750
Default 03-15-2013, 10:57 | posts: 18

Are you using a DVI-HDMI cable? I'm pretty sure using a DVI port on the card is the only way to get it to work. If it's working correctly, it will say "DVI - PC Display" for the connector on the "Change Resolution" screen, and won't give you color options in the desktop color settings. Like this:


   
  (#13)
Prophet
Master Guru
 
Prophet's Avatar
 
Videocard: Msi 680 Gtx Twin Frozr
Processor: Intel Sb@4.7
Mainboard: Asus P8Z68V Progen3
Memory: 12 Gb Kingston
Soundcard: Asus Essence STX|Akg k701
PSU: Corsair 1200w
Default 03-15-2013, 12:47 | posts: 575 | Location: Heaven

Aha, OK, no. HDMI only. Thanks for your time, mate.
   
  (#14)
Mda400
Master Guru
 
Mda400's Avatar
 
Videocard: GTX 480 SLI 878/1756/2187
Processor: Core 2 Quad Q9550 @ 4 ghz
Mainboard: EVGA 790i SLI FTW PWM
Memory: 8 GB Vengeance 2000mhz
Soundcard: LG LHB326 Receiver (HDMI)
PSU: Ultra X4 1050w (76A 12v)
Default 03-16-2013, 10:49 | posts: 375 | Location: Minnesota

So I got the DVI/HDMI adapter, and I do get full surround sound to my receiver, but the pixel format is still 4:2:2. So I am stumped. I tried disabling audio and using the DVI/HDMI adapter as a video-only connection, but it is still seen as HDMI and not DVI.

How the HELL am I supposed to get 4:4:4 from my card without using VGA or the EDID override? It's SUPPOSED to give me 4:4:4 if I use the PC label for my HDMI 1 input and an HDMI/DVI connection, which are essentially the same thing, but I've tried it with native HDMI and DVI/HDMI adapter connections and the red text is STILL f***in blurry. This also happens with an HDMI laptop with an AMD GPU in it: I select 4:4:4 Full RGB PC Standard and my TV displays it as 4:2:2. It works over VGA. Why not HDMI/DVI?
   
  (#15)
Some Dillweed
Newbie
 
Videocard: GTX 770
Processor: i5 2500K@4.5
Mainboard: P8P67 Pro
Memory: 8GB G.Skill Sniper
Soundcard: Titanium HD
PSU: Corsair AX750
Default 03-16-2013, 16:30 | posts: 18

If you check the 4:4:4 thread on AVS Forum, it seems like it's an issue with the way the TVs handle the signals. From the list of TVs that have been tested by people there, you apparently can't get both 4:4:4 chroma and an audio signal at the same time on the LG TVs, and only one Samsung and a couple of Sony models are listed as being able to pass 4:4:4 and audio together over HDMI-HDMI.
   
  (#16)
Xtreme512
Master Guru
 
Xtreme512's Avatar
 
Videocard: ZOTAC GTX 670 2 Gb
Processor: CORE i5 3570K 4.5 GHz
Mainboard: GIGABYTE Z77-D3H
Memory: G.SKILL SNIPER 8 GB CL9
Soundcard: X-FI PLATINUM + G500
PSU: OCZ FATAL1TY 550W
Default 03-16-2013, 21:19 | posts: 460 | Location: Nicosia

How do I check whether it gives the best colors/blacks, or 4:4:4? I don't know what that is.

By the way, I'm using a dual-link DVI-DVI digital cable.
   
  (#17)
Mda400
Master Guru
 
Mda400's Avatar
 
Videocard: GTX 480 SLI 878/1756/2187
Processor: Core 2 Quad Q9550 @ 4 ghz
Mainboard: EVGA 790i SLI FTW PWM
Memory: 8 GB Vengeance 2000mhz
Soundcard: LG LHB326 Receiver (HDMI)
PSU: Ultra X4 1050w (76A 12v)
Default 03-16-2013, 21:41 | posts: 375 | Location: Minnesota

Alright, looks like I'm using a DVI/VGA adapter to get 4:4:4 over VGA, and the mini-HDMI to HDMI cable for sound. Thanks for the input, guys.
   
  (#18)
maco07
Member Guru
 
Videocard: 7970 3GB Boost
Processor: FX-8350
Mainboard: M5A88-V EVO
Memory: DDR3 8 GB
Soundcard: Xonar D2X
PSU: Silent Pro 700w
Default 03-17-2013, 18:36 | posts: 94 | Location: Argentina

Did you try this? http://forums.guru3d.com/showthread....&highlight=rgb
   
  (#19)
Some Dillweed
Newbie
 
Videocard: GTX 770
Processor: i5 2500K@4.5
Mainboard: P8P67 Pro
Memory: 8GB G.Skill Sniper
Soundcard: Titanium HD
PSU: Corsair AX750
Default 03-17-2013, 21:22 | posts: 18

Full RGB and 4:4:4 chroma are different things. The workarounds that simply enable full RGB on Nvidia cards don't necessarily enable 4:4:4. A lot of people need to use DVI-HDMI cables, and others (like Mda400 and me) need to go a step further and use an EDID override to disable the audio signal sent with the video and have the TV read as a DVI-connected display.
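A sketch of why the two are different axes: the RGB-to-YCbCr conversion below (full-range BT.601, JPEG-style coefficients; actual HDMI links typically use limited-range BT.601/709 scaling, so this is illustrative only) gives every pixel its own Cb/Cr pair at 4:4:4, while full vs limited range is a separate scaling step entirely:

```python
# Full-range BT.601 (JPEG-style) RGB -> YCbCr.  4:4:4 means every pixel
# keeps its own Cb/Cr pair; subsampling to 4:2:2 would discard half of
# them horizontally.  Black-level range is an independent choice on top.

def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    clamp = lambda v: max(0, min(255, int(v + 0.5)))
    return clamp(y), clamp(cb), clamp(cr)

print(rgb_to_ycbcr(255, 0, 0))      # pure red: chroma far from neutral 128
print(rgb_to_ycbcr(128, 128, 128))  # neutral gray: Cb = Cr = 128
```

Saturated colors like pure red live almost entirely in the chroma channels, which is why they suffer most when those channels are subsampled, regardless of whether the range is full or limited.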
   
  (#20)
Mda400
Master Guru
 
Mda400's Avatar
 
Videocard: GTX 480 SLI 878/1756/2187
Processor: Core 2 Quad Q9550 @ 4 ghz
Mainboard: EVGA 790i SLI FTW PWM
Memory: 8 GB Vengeance 2000mhz
Soundcard: LG LHB326 Receiver (HDMI)
PSU: Ultra X4 1050w (76A 12v)
Default 03-18-2013, 09:48 | posts: 375 | Location: Minnesota

Quote:
Originally Posted by Some Dillweed View Post
Full RGB and 4:4:4 chroma are different things. The workarounds to simply enable Full RGB on Nvidia cards don't necessarily enable 4:4:4. A lot of people need to use DVI-HDMI cables, and others (like Mda400 and I) need to go a step further and use an EDID override to disable the audio signal sent with the video and have the TV read as a DVI-connected display.
I wish I could use the EDID override trick, but there are two things stopping me.

First, I NEED sound over HDMI: I sold the sound card that could output a 5.1 (Dolby Digital Live) signal to my receiver (encoding it also takes CPU cycles), so I have no way of getting sound from my PC other than HDMI.

Second, the EDID override trick causes my TV to incorrectly map the 0-255 full-RGB levels output by my PC. The black level setting on my TV causes washout on HIGH, even with full-range RGB (which you use when your device can output 0-255, and which reduces input delay IMMENSELY), and on LOW the contrast is severely crushed, forcing me to correct it with my graphics card's color controls (which, when using a TV, you should never have to touch).

Last edited by Mda400; 03-18-2013 at 09:51.
   
  (#21)
Some Dillweed
Newbie
 
Videocard: GTX 770
Processor: i5 2500K@4.5
Mainboard: P8P67 Pro
Memory: 8GB G.Skill Sniper
Soundcard: Titanium HD
PSU: Corsair AX750
Default 03-18-2013, 21:18 | posts: 18

I know the EDID override doesn't fix your overall issue, but I'm just wondering: how do you know that it's causing washout? What settings were you using after applying the override and did you do some basic calibration tests for proper brightness, contrast, etc.? I'm curious because I only have a model from the next year's line (as I understand it, the LK450 series is pretty similar to the LD450), and haven't exactly noticed what I'd call washout. Black level seems to be as dark as it was, whites didn't get any brighter or more clipped compared to 4:2:2 mode, and it seems like I have proper colours in a lot of cases now. I don't have a calibration device to fix certain gamma and grayscale issues, though.

You might want to try checking out a few threads on AVS Forum, like the 4:4:4 thread: http://www.avsforum.com/t/1381724/of...ampling-thread. And, I'm not sure what help he's actually offering, but there's some guy named Tulli in an ATi/AMD EDID override thread on AVS who seems to be helping people with HTPC and receiver issues, including some with Nvidia cards. You might want to give that thread a look and maybe ask there for help: http://www.avsforum.com/t/1091403/edid-override-thread.
   
  (#22)
Mufflore
Ancient Guru
 
Mufflore's Avatar
 
Videocard: KFA2 Anarchy 580@930/4650
Processor: 2500K @ 4.5GHz - blew it!
Mainboard: Gigabyte P67 UD4 B3
Memory: 8G Kngston 2.2GHz CL11 1T
Soundcard: Minimax+ & Dexa Opamps !!
PSU: Corsair Pro AX750
Default 03-18-2013, 21:23 | posts: 9,769 | Location: UK

If you can set the black level correctly to 0-255 RGB (if black was previously set for 0 - 16 = black), it will result in a much brighter picture.
To counter this, turn brightness down, a lot if necessary.
   
  (#23)
Mda400
Master Guru
 
Mda400's Avatar
 
Videocard: GTX 480 SLI 878/1756/2187
Processor: Core 2 Quad Q9550 @ 4 ghz
Mainboard: EVGA 790i SLI FTW PWM
Memory: 8 GB Vengeance 2000mhz
Soundcard: LG LHB326 Receiver (HDMI)
PSU: Ultra X4 1050w (76A 12v)
Default 03-19-2013, 13:47 | posts: 375 | Location: Minnesota

Quote:
Originally Posted by Some Dillweed View Post
I know the EDID override doesn't fix your overall issue, but I'm just wondering: how do you know that it's causing washout? What settings were you using after applying the override and did you do some basic calibration tests for proper brightness, contrast, etc.? I'm curious because I only have a model from the next year's line (as I understand it, the LK450 series is pretty similar to the LD450), and haven't exactly noticed what I'd call washout. Black level seems to be as dark as it was, whites didn't get any brighter or more clipped compared to 4:2:2 mode, and it seems like I have proper colours in a lot of cases now. I don't have a calibration device to fix certain gamma and grayscale issues, though.

You might want to try checking out a few threads on AVS Forum, like the 4:4:4 thread: http://www.avsforum.com/t/1381724/of...ampling-thread. And, I'm not sure what help he's actually offering, but there's some guy named Tulli in an ATi/AMD EDID override thread on AVS who seems to be helping people with HTPC and receiver issues, including some with Nvidia cards. You might want to give that thread a look and maybe ask there for help: http://www.avsforum.com/t/1091403/edid-override-thread.
I calibrate my HDTV using the lagom.nl/lcd-test/ images. When I used the EDID trick with my existing HDMI calibration, I went to the Black Level test on that site and the first black "step" was VERY visible (it should not be), with the background around it gray, as if I were using 16-235. I even disabled the tool that adds registry keys to enable full RGB by default; since those keys are for HDMI, I know for sure I am set to 0-255. But again, it's incorrectly mapped, and I have to switch the HDMI black level to LOW to get a reasonable contrast ratio, then calibrate with my graphics card's color controls to get it perfect, which I never had to do before applying the override.

You DON'T want to use HDMI black level LOW with a PC connection. Black level LOW is for expanding 16-235 content from a source like a Blu-ray player, and having the TV "enhance" the extra 0-16 and 235-255 levels when using any type of computer causes input delay. It's just that not all consumer devices offer the option to switch from a limited 16-235 signal to a full 0-255 signal over HDMI, and most HDTVs do not offer a black level option; in that case, the TV expects limited 16-235 by default. The latest Apple TV is one of the few modern consumer devices to offer such an option, as do current-gen game consoles.
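The underlying math is just a linear scale between the two ranges. A sketch (Python; the standard 219-step luma range, not LG's actual processing):

```python
# Video "limited" range puts reference black at 16 and reference white
# at 235 (219 steps); PC "full" range uses 0-255.  If one end of the
# chain scales and the other doesn't, you get washout or crushed blacks.

def full_to_limited(v):
    """PC 0-255 -> video 16-235."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Video 16-235 -> PC 0-255, clipped at the ends."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

assert full_to_limited(0) == 16 and full_to_limited(255) == 235
assert limited_to_full(16) == 0 and limited_to_full(235) == 255

# A limited-range signal shown without expansion leaves black at 16 and
# white at 235: grayish blacks and dim whites ("washout").  A full-range
# signal wrongly expanded clips shadows and highlights instead:
print(limited_to_full(10))   # shadow detail below 16 clips to 0 (crush)
print(limited_to_full(128))  # mid-gray 128 gets stretched to 130
```

So a display applying the 16-235 expansion to an already-full-range 0-255 signal (or vice versa) produces exactly the crushed or washed-out picture described above.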

Quote:
Originally Posted by Mufflore View Post
If you can set the black level correctly to 0-255 RGB (if black was previously set for 0 - 16 = black), it will result in a much brighter picture.
To counter this, turn brightness down, a lot if necessary.
It was set to 0-255 before, and I know this because I used the NVfullrange_Toggle tool (or whatever it's called) that someone posted on this forum to enable it by default on startup. Turning down the brightness reduces contrast if set too far from the calibrated point. So if I were to turn down the brightness with my current calibration, under the override it would be like using 16-235 again: reducing contrast from an already incorrectly mapped 0-255 signal.

In summary, my TV offers that 0-255 window, but the signal gets treated like 16-235 because the PC is reporting a DVI connection. That's what I'd expect from an override, though: you're forcing a format that may be incompatible with your TV.

Last edited by Mda400; 03-19-2013 at 14:06.
   
  (#24)
Mufflore
Ancient Guru
 
Mufflore's Avatar
 
Videocard: KFA2 Anarchy 580@930/4650
Processor: 2500K @ 4.5GHz - blew it!
Mainboard: Gigabyte P67 UD4 B3
Memory: 8G Kngston 2.2GHz CL11 1T
Soundcard: Minimax+ & Dexa Opamps !!
PSU: Corsair Pro AX750
Default 03-19-2013, 17:31 | posts: 9,769 | Location: UK

Quote:
Originally Posted by Mda400 View Post
It was set for 0-255 before and I know this because I used the NVfullrange_Toggle (or whatever its called) tool that someone posted on this forum, to enable it by default on startup. Turning down the brightness reduces contrast if set too far from the perfect calibrated point. So if i were to turn down brightness with my current calibration, under the override it would be like me using 16-235 again and reducing contrast from an already incorrectly mapped 0-255 signal.
No, it isn't.
Turn the contrast up; that's what the control is for, in case you need more.
   
  (#25)
maco07
Member Guru
 
Videocard: 7970 3GB Boost
Processor: FX-8350
Mainboard: M5A88-V EVO
Memory: DDR3 8 GB
Soundcard: Xonar D2X
PSU: Silent Pro 700w
Default 03-19-2013, 18:49 | posts: 94 | Location: Argentina

How do you check whether your display is showing full RGB 4:4:4?

If you see the magenta word blurry, then you do NOT have full RGB working (check it on a notebook display to see it correctly):

http://img.photobucket.com/albums/v2...g?t=1287963155

And for those who have doubts about washed-out colors when changing the color space, take this into account. You have to set your TV:

HDMI level LOW: for inputs with 16-235 color space
HDMI level NORMAL: for inputs with 0-255 color space
   