NVIDIA RTX Video HDR: Enhancing SDR Content with AI-Powered HDR Conversion

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 25, 2024.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,199
    Likes Received:
    18,151
    GPU:
    AMD | NVIDIA
    chispy, Valken and fantaskarsef like this.
  2. Marios145

    Marios145 Active Member

    Messages:
    59
    Likes Received:
    13
    This is a weird direction we're taking in graphics, video etc.: more and more filters and content conversion instead of focusing on producing real quality, real frames.
    Imagine telling someone from 2005 that in 2023, motion interpolation and upscaling in games are now valid reasons to select a GPU, and that we consider them "good enough".

    Back then, some blurry textures were enough to accuse a vendor of "cheating".

    Going from SDR to HDR properly is impossible (low shadow contrast as a "feature"??); at best you can apply a math curve.
    It takes huge compute power to detect what should be "brighter" and what should be "darker" from limited information (8-bit, sRGB) and extend that to 10/12-bit and P3/Rec.2020 color; it will just end up with artifacts or lower quality.

    Maybe they should have focused on a new standard for SDR similar to HDR (peak nits, display metadata) for auto-calibration assisted by DL/AI
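    The "math curve" mentioned above is what the TV world calls inverse tone mapping. A minimal global-curve sketch follows (Python/NumPy; the boost and peak-nits parameters are illustrative assumptions, not anything NVIDIA has published about RTX Video HDR):

```python
# Minimal inverse-tone-mapping sketch: expand 8-bit sRGB code values toward
# an HDR luminance range with a simple power curve. The `boost` and
# `peak_nits` parameters are illustrative assumptions only.
import numpy as np

def srgb_to_linear(u):
    """Undo the sRGB transfer function (u in [0, 1])."""
    return np.where(u <= 0.04045, u / 12.92, ((u + 0.055) / 1.055) ** 2.4)

def expand_to_hdr(pixels_8bit, peak_nits=1000.0, boost=1.2):
    """Map 8-bit SDR code values to absolute luminance in nits.

    boost > 1 stretches highlights; a trained model would instead decide
    per region what deserves to be brighter or darker.
    """
    u = pixels_8bit.astype(np.float64) / 255.0
    lin = srgb_to_linear(u)          # back to linear light
    return peak_nits * (lin ** boost)  # crude highlight expansion

frame = np.array([0, 128, 255], dtype=np.uint8)
print(expand_to_hdr(frame))  # black stays 0 nits, white maps to 1000 nits
```

    A single global curve like this is exactly why naive expansion looks flat or artifact-prone: it cannot know which 8-bit highlights were genuinely bright in the original scene, which is the part an AI model has to guess per region.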
     
    H83, pegasus1, Solfaur and 1 other person like this.
  3. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    9,596
    Likes Received:
    3,400
    GPU:
    NVIDIA RTX 4070 Ti
    It's because GPUs are no longer just Graphics Processing Units, with everything they can do these days.

    I wonder if in the near future a new name will be given to them.

    That said, NVIDIA released the beta of the Remix Toolkit a couple of days ago, which is the biggest thing to happen for gaming in decades. They haven't forgotten about visuals and gaming.
     
  4. AuerX

    AuerX Ancient Guru

    Messages:
    2,395
    Likes Received:
    2,171
    GPU:
    แกงมัสมั่น
    I don't see the point in comparing back then to now in a very fast-moving industry; change is inevitable and the direction constantly shifts as things progress.

    That said, your post was well thought out.

    But in the end, the only thing that really matters is what we see on our monitors; how we get there is irrelevant.
     

  5. Valken

    Valken Ancient Guru

    Messages:
    2,908
    Likes Received:
    894
    GPU:
    Forsa 1060 3GB Temp GPU
    I use Reshade with my video player and it works incredibly well. Using AMD Adaptive Sharpen on 1080p content adds bits of detail (or noise), but it looks great.

    I think it is a good option to have. If you don't like it, turn it off.
     
    XenthorX likes this.
  6. Nekrosleezer

    Nekrosleezer Master Guru

    Messages:
    219
    Likes Received:
    42
    GPU:
    RTX 4080 Suprim X
    It all started the day CUDA cores became programmable. Today we have more kinds of cores (like RT and Tensor cores) to do more things, and you aren't forced to use them if you don't like them. But telling the three main graphics companies (Intel, Nvidia and AMD) that this is not the right direction because it looks fake to you suggests you're the one still living in 2005.
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,924
    Likes Received:
    4,301
    GPU:
    HIS R9 290
    I don't see the issue with compute power if the SDR content is just a video and not a game. I'm sure even a 3050 would have enough processing power for this.
    You're right that you can't really pull detail out of nowhere, but Nvidia's AI projects are pretty damn good. DLSS, while not a valid excuse for improperly optimized games, does an exceptionally good job at filling in information where there is none. Their noise-canceling AI is similar, in that your voice could be badly drowned out by background noise and yet it manages to do an excellent job at reconstructing it. I don't think it's especially difficult for a well-trained AI to intelligently and dynamically increase the contrast ratio of certain objects.
     
    pegasus1 likes this.
  8. LimitbreakOr

    LimitbreakOr Master Guru

    Messages:
    620
    Likes Received:
    158
    GPU:
    RTX 4090
    I tried it and it looks very weird in most videos I've seen. In some videos the men looked like they were wearing lipstick (it wasn't a woke video). Everything generally looked kind of off-colour and weird.
     
    Rubalvar likes this.
  9. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,639
    Likes Received:
    2,623
    GPU:
    Aorus 3090 Xtreme
    I suppose you 'can' put lipstick on a 3050, but they don't have men inside... normally :confused:
     
    pegasus1 and AuerX like this.
  10. alanm

    alanm Ancient Guru

    Messages:
    12,197
    Likes Received:
    4,368
    GPU:
    RTX 4080
    I'm more interested in what it can do in 5 years from now rather than whatever imperfect state it is in the beginning.
     

  11. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,012
    Likes Received:
    3,377
    GPU:
    MSI 4090 Suprim X
    AutoHDR isn't anything new in itself; it's been in TVs for years and is quite necessary when the end user has an HDR-capable screen but no HDR content.

    True HDR-capable monitors will most likely go mainstream in 2024-2025.

    Clearly we need something on PC to leverage this hardware, given that the large majority of content out there is SDR.

    No doubt an AI-trained model can achieve interesting HDR results (that's how modern TVs do it anyway; I've got it on my Samsung TVs).

    Really want to test it out against native HDR content.
     
  12. van_dammesque

    van_dammesque Active Member

    Messages:
    54
    Likes Received:
    22
    GPU:
    Zotac 3070
    Can't get it to work; it always says 'inactive'.
    I'm using the K-Lite Mega codec pack and the latest NVIDIA drivers, if that helps.

    I remember fake HDR with Far Cry.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    14,197
    Likes Received:
    4,094
    GPU:
    EVGA RTX 3080
    It only works in web browsers.

    HDR in Far Cry/HL2: Lost Coast refers to a completely different thing.
     
    van_dammesque likes this.
  14. van_dammesque

    van_dammesque Active Member

    Messages:
    54
    Likes Received:
    22
    GPU:
    Zotac 3070
    Ahh right, thanks for the clarification.

    [QUOTE]HDR in Farcry/HL2 Lost Coast refer to a completely different thing.[/QUOTE]

    Of course they're different things, but similar in the sense of 'upscaling' content.
     
  15. Venix

    Venix Ancient Guru

    Messages:
    3,398
    Likes Received:
    1,918
    GPU:
    Rtx 4070 super
    Well, if we're honest, I prefer such AI applications to AI waifus......
     

  16. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,018
    Likes Received:
    3,424
    GPU:
    TUF 4090
    My video editing software uses around 70% GPU to 30% CPU when I render 4K content. Going from a 1080 Ti to a 6900 XT to a 4090 gave me a bigger uplift in rendering performance than going from a 2700K to a 3800X to a 5800X3D.
     
  17. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,018
    Likes Received:
    3,424
    GPU:
    TUF 4090
    I was very skeptical until I ran through all the graphical variations and options in CP2077. For me the fake-frames debate is settled in that game: the fake frames not only look better, they also perform better.
     
    geogan likes this.
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,924
    Likes Received:
    4,301
    GPU:
    HIS R9 290
    To my understanding, the faster the framerate, the more acceptable the fake frames become. It makes sense when you think about it: depending on how it's done, the AI either has more motion data to work with or there are simply fewer gaps of information between frames (or both). I'm sure if the native frame rate were in the 40s or lower, you might not share the same sentiment. It's the same sort of reason upscaling can sometimes look better than native if your starting resolution is 4K (or close to it): when the image is already very detailed, the AI doesn't really have to fill in any gaps, it just has to enhance the image further.

    Think of it like this:
    If you had a group of artists who never saw Mario before and you told them to make a detailed 3D rendition of Mario from NES, you're going to get pretty different results, because there's just not a lot of detail to start with. Tell them to do that from N64 and you're going to get a lot more consistency, but there may be some slight stylistic differences. With the Switch, there's just not really much room for improvement - basically just make him look less plasticky and perhaps increase the polygon count. You could even just make him look more anatomically realistic, since there's now enough detail.
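    The "fewer gaps of information" point above is easy to quantify: the window a generated frame has to cover is simply the base frame time. A quick sketch (plain arithmetic, nothing vendor-specific):

```python
# The gap an interpolated frame must fill equals the time between two real
# frames, so a higher base frame rate means less for the AI to invent.
def frame_gap_ms(base_fps):
    """Time between two real frames, i.e. the window a generated frame spans."""
    return 1000.0 / base_fps

for fps in (40, 60, 120):
    # 40 fps -> 25.0 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
    print(f"{fps:>3} fps base -> {frame_gap_ms(fps):.1f} ms gap per generated frame")
```

    At a 40 fps base the model bridges three times more unseen motion per generated frame than at 120 fps, which matches the intuition that frame generation works best when you are already fast.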
     
    LimitbreakOr and pegasus1 like this.
  19. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,018
    Likes Received:
    3,424
    GPU:
    TUF 4090
    I ran through all the different visual options from 60 fps locked to unlimited (4090 @ 4K), using both the in-game benchmark and normal gameplay, and I also took some screenshots.
    I settled on all visuals at maximum with FG etc. and DLSS at max quality. I also capped the frame rate at 75 to keep the frametime constant. For me it's visually better looking than native.
     
  20. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    3,201
    Likes Received:
    422
    GPU:
    ASUS 4090 TUF OG OC
    We are in the days of "the winners win more" with all the current GPU software tech. Look at DLSS/DLAA, DLSS FG, DLSS RR: they all work better when you are already winning. Somebody with a 4060 trying to play at 1440p 144 Hz Ultra settings will kill their image quality with each additional modern tech they enable, due to coming from a low base frame rate. The same situation with a 4080/4090 works out much better, due to having much more information for each stage to work with. DLSS Q looks better because you have more base frames, DLSS FG looks better because you have more base frames, RR looks better because you have more base frames, etc....

    I remember many years ago when you aimed for a certain frame rate and then, based on your hardware, were able to use things like MSAA/SGSSAA to make things look better. No real downsides (keep in mind the time and the tech ~20 years ago) other than performance being reduced which was fine if you had overhead. Now we go "okay, well you can't maintain 60 so you enable this thing which brings these compromises, then you turn this thing on which brings those compromises, then you put the cherry on top by trying to denoise and reconstruct that mess of pixels" and off you go! :p

    On topic, I've tried the RTX Video stuff on all 4 stages and while I can see a bit of a difference on stage 4 with 720p YouTube content, it just isn't worth the power draw. It felt mostly like trading the natural artifacts for manufactured artifacts so I'll keep it off for now. Perhaps it can develop over time like DLSS.
     
    schmidtbag, pegasus1 and H83 like this.
