Did NVIDIA silently downgrade G-Sync Ultimate HDR specification?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 18, 2021.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,325
    Likes Received:
    18,408
    GPU:
    AMD | NVIDIA
    GSDragoon likes this.
  2. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    I don't think it's a question - they definitely did.

    That being said, I don't think anyone buying a G-Sync Ultimate display is going to do so without knowing its HDR level, but it's not like that matters, because the HDR level doesn't even mean anything (see below). Also, the 1000-nit brightness requirement is kind of dumb when you have OLED displays that won't hit 1000 nits but obviously do HDR extremely well.

    https://displayhdr.org/general/not-all-hdr-is-created-equal/
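
    For what it's worth, the luminance a display claims is something you can read out yourself instead of trusting the badge. Below is a minimal illustrative sketch (my own, not from the article; it assumes Windows 10 or later and a C++ toolchain) that queries IDXGIOutput6::GetDesc1 and prints the peak, full-frame, and minimum luminance each connected display advertises to the OS.

    // hdr_query.cpp - illustrative sketch: print the HDR luminance data each
    // connected display reports to Windows via DXGI (IDXGIOutput6::GetDesc1).
    // Build (MSVC): cl /EHsc hdr_query.cpp
    #include <windows.h>
    #include <dxgi1_6.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
            ComPtr<IDXGIOutput> output;
            for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
                ComPtr<IDXGIOutput6> output6;
                if (FAILED(output.As(&output6))) continue;  // needs a recent Windows 10 build

                DXGI_OUTPUT_DESC1 desc{};
                if (FAILED(output6->GetDesc1(&desc))) continue;

                // ColorSpace says whether the desktop is currently in HDR (BT.2020 / PQ) mode;
                // the luminance fields are what the display itself advertises via EDID/driver data.
                const bool hdrActive =
                    desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
                wprintf(L"%ls  HDR active: %ls\n", desc.DeviceName, hdrActive ? L"yes" : L"no");
                wprintf(L"  peak luminance:       %.0f nits\n", desc.MaxLuminance);
                wprintf(L"  full-frame luminance: %.0f nits\n", desc.MaxFullFrameLuminance);
                wprintf(L"  min luminance:        %.4f nits\n", desc.MinLuminance);
            }
        }
        return 0;
    }

    Keep in mind these numbers describe what the panel claims to the OS, not what it measures under a real HDR test pattern, which is exactly why the certification tiers get muddy.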

     
  3. EspHack

    EspHack Ancient Guru

    Messages:
    2,794
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB
    They probably got one sample per quarter with such high standards; let's see what happens with mini-LED.
     
    OrdinaryOregano likes this.
  4. Reardan

    Reardan Master Guru

    Messages:
    632
    Likes Received:
    209
    GPU:
    GTX 3080
    I agree with you that there generally needs to be a better, more meaningful standard for conveying HDR support to consumers.

    If you create a certification standard such as this, it should be against the law for you to change it mid-stream. If you need to replace it, or it's not granular enough, then you find another name. The only argument against calling it something else now that it's changed is that they put all this marketing work and spin into the "G-Sync Ultimate" name, in which case, yeah, that's exactly why you should be forced to call it something else.

    Like the only thing worse than an opaque, useless standard is an opaque, useless, malleable standard. Once again Nvidia manages to frack everyone.
     
    carnivore likes this.

  5. MachinaEx

    MachinaEx Member

    Messages:
    28
    Likes Received:
    7
    GPU:
    GeForce 1660 Super
    After experiencing HDR for the first time on my 4K TV, and reading a lot about the problems the technology has (and its mediocre performance on PC monitors), I gotta say it looks really great when it is working properly and care has been invested in the implementation. I will be buying my first HDR 4K monitor in about three to four years, when this technology will be properly supported and implemented. Right now not even 4K is mainstream, but I think it will be in about 2.5 to 3 years. My money is on VESA; Dolby Vision will fail...
     
  6. DerSchniffles

    DerSchniffles Ancient Guru

    Messages:
    1,665
    Likes Received:
    148
    GPU:
    MSI 3080Ti
    Yep, I just bought the AW3821DW and it's listed as G-Sync Ultimate even though it only does DisplayHDR 600. I was pretty confused, because the certification used to require DisplayHDR 1000 with FALD; not anymore, apparently.
     
  7. OrdinaryOregano

    OrdinaryOregano Guest

    Messages:
    433
    Likes Received:
    6
    GPU:
    MSI 1080 Gaming X
    You're agreeing in one sentence that HDR is a mess and that current HDR standards are useless, and then in the next sentence blaming Nvidia for not keeping an unrealistic standard that barely anybody is willing to meet, which is exactly what made it useless. It's not such a simple black-and-white situation.

    A standard is rarely, if ever, based on merit when it comes to for-profit products and companies; it's simply a count of the hands raised by the interested parties in a given industry. "Yes, we can do this; it won't take too much R&D and won't eat into our profits." There, now you've got a standard. If that group overshot what it could actually deliver and now cannot meet its own requirements, then the standard is useless, cannot be met, and has to be changed. This is also how LTE became a thing.
     
  8. NCC1701D

    NCC1701D Master Guru

    Messages:
    269
    Likes Received:
    172
    GPU:
    RTX 4090 FE
    That's a really nice monitor regardless. I have the prior model and it's been a joy to game on.
     
  9. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    Does it matter, though? Who cares about Nvidia's stamp? I am sure my two G-Sync Compatible LG OLED TVs are superior to any mere Ultimate-badged Nvidia PC monitor. Anything Nvidia-related is overrated, as always. PS: I also enjoy my weak Odyssey G7 monitor.
     
