Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Astyanax, May 20, 2020.
Okay, so a few more months then; I guess I'll just wait.
Stuttering in Assassin's Creed 4 with PhysX enabled
No, this renders into the application's presentation, not on top of it.
There might be a misunderstanding of what we all mean. HDR definitely doesn't get disabled for me from overlays or on-window controls, when I play an HDR game.
In games that have no HDR while Windows HDR is enabled, Windows temporarily switches to HDR while the controls are on, then switches back when the controls go away. The Witcher 3 is an example. Nothing really "breaks"; it's just arranged in a dumb way.
That something is available does not change the fact that it is an unreleased BETA OS. Surely we can at least stick to the facts. It has not left the Insider ring any more than the public being able to get the Fast ring means it's not beta and has 'left the insider ring'. It's still a preview. You must be bored to invent an argument.
Setting it to 4:2:2 looks utterly awful on the desktop. You ARE having issues with it, because you have to use NONSTANDARD settings to hack HDR into somewhat working. It's a mess on Windows.
I installed 450.99. My laptop has a G-SYNC display, but after updating:
- the variable refresh rate option in Display settings is gone
- the notification center brightness slider is missing
- the G-SYNC setup page in NVIDIA Control Panel is missing too
Is this normal?
Can anyone else try a 4K60 HDR video on YouTube and watch your mouse pointer when the YouTube play bar hides? Do you get a square overlay around the pointer?
My only fix was to roll back to 442.19 just to be sure, and that fixed the mouse overlay square problem on HDR YouTube videos.
I have this with an LG 27UK650 + RTX 2080 Ti.
It happens only while HDR is active, SDR is fine.
It has left the Insider ring if people who aren't part of the Insider program can download it through Windows Update. https://fossbytes.com/install-windows-10-2004-may-2020-update/
There was never an argument, you must be bored to turn it into one.
Yes, someone else sees it!!
If any NVIDIA devs are watching, there is a bug for you. Maybe a Chrome / NVIDIA driver / Win 10 / HDR / VP9 combo?
Yup, this occurs for me as well; I noticed it over the weekend. I hadn't watched an HDR video on YouTube in a while, so I just blamed Chrome. Interesting.
I'm suspicious of this being a Windows issue atm; there are pre-existing reports on the Feedback Hub of lines and boxes on HDR content that disappear when you move the SDR brightness slider.
That depends on the TV, I guess. I don't set that for my monitor, but for a 65" OLED TV.
There is nothing "non standard" about YCbCr 4:2:2 12-bit. In fact, Dolby Vision only works with 12-bit color, which you cannot get with any other format over HDMI. Televisions will usually process everything internally in 4:2:2 anyway. There is zero banding with these settings, and they carry full color information for both SDR and HDR.
I can't speak to what any other operating system does over HDMI. The PS4 Pro does the same: it sets 4:2:2 12-bit, Limited.
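For what it's worth, the link-bandwidth math backs this up. A rough sketch (my own numbers, not from anyone's post: I'm assuming the standard CTA-861 4K60 timing with its 594 MHz pixel clock, HDMI 2.0's 600 MHz TMDS character-rate ceiling, and the HDMI rule that 4:2:2 always travels in a fixed 12-bit container at 1x clock):

```python
# Back-of-the-envelope: why 12-bit at 4K60 fits over HDMI 2.0 only as YCbCr 4:2:2.
# Assumptions: CTA-861 4K60 timing (594 MHz pixel clock incl. blanking),
# HDMI 2.0 TMDS character-rate limit of 600 MHz.

PIXEL_CLOCK_4K60 = 594e6  # Hz, total timing including blanking
HDMI20_TMDS_LIMIT = 600e6  # Hz

def tmds_clock(pixel_clock, bits_per_component, subsampling):
    """TMDS character rate needed for a given pixel format over HDMI.

    RGB / 4:4:4 deeper than 8-bit scales the clock by bpc/8, while
    YCbCr 4:2:2 rides in a fixed 12-bit container at 1x pixel clock.
    """
    if subsampling == "4:2:2":
        return pixel_clock  # fixed container, no clock penalty
    return pixel_clock * bits_per_component / 8

for fmt, bpc in [("4:4:4", 12), ("4:2:2", 12)]:
    clk = tmds_clock(PIXEL_CLOCK_4K60, bpc, fmt)
    print(fmt, f"{clk / 1e6:.0f} MHz", "fits" if clk <= HDMI20_TMDS_LIMIT else "too fast")
```

So 12-bit RGB/4:4:4 at 4K60 would need an 891 MHz TMDS clock, which HDMI 2.0 can't do, while 4:2:2 stays at 594 MHz and squeaks under the limit.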
It's an old standard indeed; this chroma subsampling technique dates back to the very first color televisions (those small CRT boxes that replaced the "black and white" [actually grayscale] CRTs and had to cope with analog bandwidth [real Hz of width in the over-the-air frequency spectrum]).
There are better compression techniques now (like DSC in DP 1.4 and HDMI 2.1) and also far, FAR superior interface options. For example, copper HDMI cables could have been replaced with optical, or hybrid (optical for the one-way video stream and copper for two-way AUX data like Ethernet, USB, CEC, eARC, etc.). I would personally vote for duplex LC-LC fiber, reusing one of the most popular connectors from the server world. Those optical cables are cheaper than an average HDMI cable (let alone on long runs, where copper gets really expensive because the fairly cheap copper cables simply stop working).
Or both DP and HDMI could be replaced with Thunderbolt (which already has standards for both copper and fiber cables and is deemed "user friendly"). Etc., etc. We are sitting on 20+ year old tech with 8K screens now (don't forget HDMI is effectively single-link DVI on steroids).
Actually, DV allows both 10-bit (online streaming services) and 12-bit (optical discs). Their preferred way is wrapping the YCC-like 4:2:0 12-bit content into RGB 8-bit (when their chip is present on the display side to decode it). The other option is transferring it pre-decoded as YCC 4:2:2 12-bit (for movies where the video is encoded in the native DV format; games obviously just encode their original raw RGB into a different DV transfer format, and Battlefield 1 is pretty much the only game I remember using this option instead of the RGB wrapping method even when the TV has the Dolby chip [and I remember it having issues when I tried]). Most displays use the DV chip; pretty much only Sony goes for the software-only option. But it's stupid as it stands anyway: DV is mainly good for Dolby to collect fees from manufacturers and content creators (the studio equipment you need to create DV material carries ridiculous prices) and to create headaches for "power users" who hate being locked behind bars (proprietary closed standards and fees).
Indeed, but sometimes text still remains sharper with RGB input, because this can happen:
original RGB in GPU memory -> converted to YCC 4:2:2 by the GPU's engine -> converted back to RGB by the TV's processor for some processing -> converted to YCC 4:2:2 again by the TV for other processing.
So this can actually mean extra lossy conversion steps, and that's before mentioning that the GPU and the TV might use different logic for these conversions (how the chroma resolution is scaled, and exactly which YCC format they select "automagically"), so it's even further from "virtually lossless".
So "full color information" is only retained if you measure it relative to the internal 4:2:2 step (going from 4:4:4 to 4:2:2 throws half of the chroma samples away) AND if everything goes well enough (for example, the GPU converts from RGB to YCC with adequate precision; it's never truly lossless, since the conversion yields floating-point numbers that get rounded or dithered to relatively low-precision integers).
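That rounding loss is easy to demonstrate. A quick sketch of my own (not any spec implementation: full-range BT.709 coefficients, naive rounding to 8-bit integers, no dithering, and no chroma subsampling involved at all yet):

```python
# Mine, not a reference implementation: full-range BT.709 conversion with
# naive rounding to 8-bit integers. Even without 4:2:2 decimation, a single
# RGB -> YCbCr -> RGB round trip is already not bit-exact.

def rgb_to_ycbcr(r, g, b):
    # BT.709 luma coefficients; Cb/Cr scaled to full range with +128 offset
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556 + 128
    cr = (r - y) / 1.5748 + 128
    return round(y), round(cb), round(cr)  # quantize to 8-bit integers

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.5748 * (cr - 128)
    b = y + 1.8556 * (cb - 128)
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return round(r), round(g), round(b)

# Count colors on a coarse grid that don't survive one round trip
broken = sum(
    ycbcr_to_rgb(*rgb_to_ycbcr(r, g, b)) != (r, g, b)
    for r in range(0, 256, 17)
    for g in range(0, 256, 17)
    for b in range(0, 256, 17)
)
print(broken, "of", 16 ** 3, "sampled colors changed")
```

Saturated colors are the worst off (pure green, for instance, comes back with a component out of range before clamping); a real pipeline dithers to hide this, but the information is still gone.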
Although, I concur, it looks good enough with the current VGA cards and LG TVs.
Not perfect but good enough, enjoyable for games, sure.
Why are you arguing with me? You must be really bored to blame ME for your REPLY to my post. You're a very inept troll. The fact that you can access something does not change what it is. The fact that you can access an illegal product does not then make that product legal. I mean, do I need to dumb the concept down further? (Is that even possible?)
You're a very angry person and also demonstrably wrong. That must be a bitter pill to swallow. Everyone can now download Windows 10 using the official media creation tool from the official Microsoft Windows 10 download page that (since the 27th of May) has been serving the 2004 May update as the official current build. You could just say sorry for calling me out incorrectly and move on, or keep throwing your toys out of the pram and entertaining the good users of the Guru3D forums. Your call I guess, but if I were a betting man....
The bet I took with the C9 was that with the next gen of GPUs I could finally have 4K120 RGB Full 12-bit.
Although I'm certain there will be gotchas due to the internal processors, and then people will whine, these TVs will still look a mile better than any conventional one anyway.
According to him, he must be bored (very often, as I gather from reading this forum).
Hopefully we'll see 451.xx soon. When is the next significant release that might warrant a Game Ready driver?