An interesting observation was made today: a crafty, quiet change in the text on NVIDIA's website seems to indicate that NVIDIA has downgraded the requirements for its top-of-the-line G-Sync U... Did NVIDIA silently downgrade the G-Sync Ultimate HDR specification?
I don't think it's a question - they definitely did. That said, I don't think anyone buying a G-Sync Ultimate display is going to do so without knowing the HDR level, but it's not like that matters, because the HDR level doesn't even mean anything (see below). Also, the 1000-nit brightness requirement is kind of dumb when you have OLED displays that won't hit 1000 nits but obviously do HDR extremely well. https://displayhdr.org/general/not-all-hdr-is-created-equal/
I agree with you that there needs to be a better, more meaningful standard for conveying HDR support to consumers. If you create a certification standard like this, it should be against the law to change it mid-stream. If you need to replace it, or it's not granular enough, then you find another name. The only argument against calling it something else now that it's changed would be that they put all this marketing work and spin into the "G-Sync Ultimate" name - in which case, yeah, that's exactly why you should be forced to call it something else. The only thing worse than an opaque, useless standard is an opaque, useless, malleable standard. Once again Nvidia manages to frack everyone.
After experiencing HDR for the first time on my 4K TV, and reading a lot about the problems the technology has (and its mediocre performance on PC monitors), I gotta say it looks really great when it's working properly and care has been invested in the implementation. I'll be buying my first HDR 4K monitor in about three to four years, when the technology is properly supported and implemented. Right now not even 4K is mainstream, but I think it will be in about 2.5 to 3 years. My money is on VESA; Dolby Vision will fail...
Yep, I just bought the AW3821DW and it's listed as G-Sync Ultimate even though it only does DisplayHDR 600. I was pretty confused, because the requirement used to be DisplayHDR 1000 with FALD - not anymore, apparently.
You're agreeing in one sentence that HDR is a mess and current HDR standards are useless, then in the next blaming Nvidia for not keeping an unrealistic standard that barely anybody is willing to meet - which is what made it useless. It's not such a simple black-and-white situation. A standard is rarely, if ever, based on merit when it comes to for-profit products and companies; it's simply the number of hands raised by the interested parties in any particular industry. "Yes, we can do this, it won't take too much R&D and won't eat into our profits" - there, now you've got a standard. If that group overshot their expectations of what they could do and now cannot meet them, then the standard is useless, cannot be met, and has to be changed. This is also how LTE became a thing.
Does it matter though? Who cares about Nvidia's stamp? I'm sure my two G-Sync Compatible LG OLED TVs are superior to any mere NVIDIA-badged "Ultimate" PC monitor. Anything NVIDIA-related is overrated, as always. PS: I also enjoy my humble Odyssey G7 monitor.