Did NVIDIA silently downgrade G-Sync Ultimate HDR specification?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 18, 2021.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,308
    Likes Received:
    8,851
    GPU:
    AMD | NVIDIA
    GSDragoon likes this.
  2. Denial

    Denial Ancient Guru

    Messages:
    13,234
    Likes Received:
    2,725
    GPU:
    EVGA RTX 3080
    I don't think it's a question - they definitely did.

    That being said, I don't think anyone buying a G-Sync Ultimate display is going to do so without knowing its HDR level, but that hardly matters because the HDR level doesn't mean much anyway (see below). Also, the 1000-nit brightness requirement is kind of dumb when OLED displays that won't hit 1000 nits obviously do HDR extremely well.

    https://displayhdr.org/general/not-all-hdr-is-created-equal/

     
  3. EspHack

    EspHack Ancient Guru

    Messages:
    2,640
    Likes Received:
    101
    GPU:
    ATI/HD5770/1GB
    They probably got one sample per quarter with such high standards; let's see what happens with miniLED.
     
    OrdinaryOregano likes this.
  4. Reardan

    Reardan Master Guru

    Messages:
    369
    Likes Received:
    56
    GPU:
    GTX 3080
    I agree with you that there generally needs to be a better, more meaningful, standard for conveying HDR support to consumers.

    If you create a certification standard like this, it should be against the law to change it mid-stream. If you need to replace it, or it's not granular enough, then you find another name. The only argument against calling it something else now that it's changed is that they put all this marketing work and spin into the "G-Sync Ultimate" name, in which case, yeah, that's exactly why you should be forced to call it something else.

    Like the only thing worse than an opaque, useless standard is an opaque, useless, malleable standard. Once again Nvidia manages to frack everyone.
     
    carnivore likes this.

  5. MachinaEx

    MachinaEx New Member

    Messages:
    6
    Likes Received:
    2
    GPU:
    GeForce 1660 Super
    After experiencing HDR for the first time on my 4K TV, and reading a lot about the problems the technology has (and its mediocre performance on PC monitors), I gotta say it looks really great when it is working properly and care has been invested in the implementation. I will be buying my first HDR 4K monitor in about three to four years, when the technology is properly supported and implemented. Right now not even 4K is mainstream, though I think it will be in about 2.5 to 3 years. My money is on VESA; Dolby Vision will fail...
     
  6. DerSchniffles

    DerSchniffles Ancient Guru

    Messages:
    1,627
    Likes Received:
    112
    GPU:
    Zotac GTX1080ti
    Yep, I just bought the AW3821dw and it's listed as G-Sync Ultimate even though it only does DisplayHDR 600. I was pretty confused because the requirement used to be DisplayHDR 1000 with FALD; not anymore, apparently.
     
  7. OrdinaryOregano

    OrdinaryOregano Master Guru

    Messages:
    434
    Likes Received:
    5
    GPU:
    MSI 1080 Gaming X
    You're agreeing in one sentence that HDR is a mess and that current HDR standards are useless, and then in another blaming Nvidia for not keeping an unrealistic standard that barely anybody is willing to meet, which is exactly what made it useless. It's not such a simple black-and-white situation.

    A standard is rarely, if ever, based on merit when it comes to for-profit products and companies; it's simply the number of hands raised by the interested parties of a particular industry. "Yes, we can do this, it won't take too much R&D and won't eat into our profits" and there, now you've got a standard. If that group overshot its expectations of what it could deliver and now cannot meet them, then the standard is useless and has to be changed. This is also how LTE became a thing.
     
  8. NCC1701D

    NCC1701D Master Guru

    Messages:
    236
    Likes Received:
    141
    GPU:
    RTX 2080 Ti
    That's a really nice monitor regardless. I have the prior model and it's been a joy to game on.
     
  9. itpro

    itpro Master Guru

    Messages:
    849
    Likes Received:
    453
    GPU:
    Radeon Technologies
    Does it matter, though? Who cares about Nvidia's stamp? I'm sure my two G-Sync Compatible LG OLED TVs are superior to any mere Nvidia-badged "Ultimate" PC monitor. Anything Nvidia-related is overrated, as always. PS: I also enjoy my weak Odyssey G7 monitor.
     