Will Nvidia Admit someday that they broke Dolby Vision HDR on 4xx drivers?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by 256k, Aug 7, 2019.

  1. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    At last, we can half-agree on something here. :)
     I wish to boycott Dolby Vision as a whole in all consumer applications (not just games but movies as well) due to its licensing structure and all the quirks that come along with it. It's also very sub-optimal for video games due to its fixed ICtCp YCC 4:2:0 format, whereas HDR10 can handle 4:2:2, 4:4:4 or even native full-range RGB. This might not bother everyone, but we are throwing away information the GPU already worked hard to calculate (and most TVs can handle at least 4:2:2, or even 4:4:4 in special cases, so it's an unnecessary limit, especially with HDMI 2.1).
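
     Just to put rough numbers on the chroma being thrown away, here's a quick Python sketch (the 3840x2160 frame and the per-frame counts are only illustrative arithmetic, not from any spec):

     WIDTH, HEIGHT = 3840, 2160            # illustrative UHD frame
     pixels = WIDTH * HEIGHT
     full_chroma = 2 * pixels              # two full-resolution chroma planes

     # Every format keeps one luma sample per pixel; subsampling only
     # thins out the two chroma planes.
     formats = {
         "4:4:4 / RGB": full_chroma,          # nothing discarded
         "4:2:2":       full_chroma // 2,     # chroma halved horizontally
         "4:2:0":       full_chroma // 4,     # chroma halved both ways
     }

     for name, kept in formats.items():
         share = kept / full_chroma
         print(f"{name}: {kept:,} chroma samples per frame ({share:.0%} of full)")

     So a fixed 4:2:0 pipe keeps only a quarter of the chroma the GPU actually rendered, which is exactly the wasted work I mean.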


     But... speaking of Frostbite and BioWare games, something seems to be wrong with the image quality of their HDR10 implementation. Dark scenes look a lot more alike between SDR and DV, while HDR10 looks a lot brighter (no real shadow detail, just bright greys with some black pixels). So it's not that DV itself is better, but these games do tend to look better in DV than in HDR10, simply because the DV mode is done correctly (image-quality wise) while something is wrong with the HDR10 mode. Again, it's not the standards themselves, just how Frostbite/BioWare used them. A dark scene should look almost exactly the same in all three modes, but that's not how it works out in practice.
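
     For reference, both HDR10 and Dolby Vision are built on the SMPTE ST 2084 (PQ) curve, so a given shadow luminance has one well-defined code value. Here's a minimal Python sketch of the PQ EOTF (the constants are the published ST 2084 ones; the test code values are just arbitrary shadow-to-midtone samples I picked):

     # A normalized PQ code value maps to one absolute luminance, so a
     # correctly encoded dark scene has nowhere to "go brighter".
     M1, M2 = 2610 / 16384, 2523 / 4096 * 128
     C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

     def pq_to_nits(code):
         """Map a normalized PQ code value (0..1) to luminance in nits."""
         p = code ** (1 / M2)
         return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

     for code in (0.05, 0.10, 0.20, 0.50):
         print(f"PQ code {code:.2f} -> {pq_to_nits(code):8.3f} nits")

     Anything in the bottom ~10% of the PQ range should land well under a nit on screen, in any of the three modes. If HDR10 shadows instead come out as bright greys, the game's HDR10 encode/tone-mapping is off, not the format itself.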
     
