AMD FidelityFX Super Resolution in Ultra Quality shows visible quality loss

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 3, 2021.

  1. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    This isn't a discussion, you aren't right.

    FSR without DML/Tensor image substitution is just an upscaler and will demonstrate the flaws that come with it.
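    To make the "just an upscaler" point concrete, here is a hypothetical sketch of what a purely spatial upscaler does: bilinear interpolation from a low-res grid to a high-res one. This is an illustrative stand-in, not AMD's actual FSR algorithm (which uses a more sophisticated edge-adaptive spatial filter); the point is that with no temporal data and no ML model, detail can only be interpolated, never recovered.

    ```python
    def bilinear_upscale(src, out_w, out_h):
        """Upscale a 2D grid of luma values (list of lists) to out_w x out_h."""
        src_h, src_w = len(src), len(src[0])
        dst = [[0.0] * out_w for _ in range(out_h)]
        for y in range(out_h):
            # Map the destination pixel back into source coordinates.
            sy = y * (src_h - 1) / (out_h - 1) if out_h > 1 else 0.0
            y0 = int(sy)
            y1 = min(y0 + 1, src_h - 1)
            fy = sy - y0
            for x in range(out_w):
                sx = x * (src_w - 1) / (out_w - 1) if out_w > 1 else 0.0
                x0 = int(sx)
                x1 = min(x0 + 1, src_w - 1)
                fx = sx - x0
                # Blend the four neighbouring source samples.
                top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
                bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
                dst[y][x] = top * (1 - fy) + bot * fy
        return dst
    ```

    Every output pixel is a weighted average of nearby input pixels, which is exactly why hard edges soften: the flaws come from the reconstruction having no extra information to draw on.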
     
    DeskStar likes this.
  2. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Obviously the context matters here. Can you run neural network inferencing in software on a CPU? Sure, but no one cares about that when it comes to applying it to a real-time video game. The entire operation needs to be done in milliseconds; CPUs, and even non-tensor GPU-accelerated inferencing, are orders of magnitude slower. So unless you demonstrate yourself, or show me some example of, a real-time application passing its graphics data through a CPU deep convolutional autoencoder, the argument is kind of dumb, right?
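    A back-of-the-envelope sketch of that frame-budget point. All the numbers below (layer size, throughput figures) are illustrative assumptions, not measurements of any real CPU or GPU, but they show why milliseconds is the operative word:

    ```python
    def conv_layer_macs(h, w, c_in, c_out, k):
        """Multiply-accumulates for one dense k x k convolution layer."""
        return h * w * c_in * c_out * k * k

    def layer_latency_ms(macs, throughput_gmacs):
        """Latency in milliseconds at a sustained GMAC/s throughput."""
        return macs / (throughput_gmacs * 1e9) * 1e3

    # One 3x3 conv at 1080p, 32 -> 32 channels: ~19.1 billion MACs.
    macs = conv_layer_macs(1080, 1920, 32, 32, 3)

    budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps

    cpu_ms = layer_latency_ms(macs, 100)       # assume ~100 GMAC/s on a CPU
    tensor_ms = layer_latency_ms(macs, 50000)  # assume ~50 TMAC/s on tensor HW
    ```

    Under these assumed throughputs, a single convolution layer alone blows the entire 60 fps frame budget on the CPU, while tensor-style hardware handles it in a fraction of a millisecond; a full autoencoder has many such layers, and the frame still has to be rendered.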
     
    yasamoka and DeskStar like this.
  3. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    What's getting through my head is increased FPS... at the cost of nothing, which is what I like.

    DLSS works for me. It allows higher resolutions while keeping the fidelity high.

    And I guess a "multi-billion" dollar company doesn't/hasn't known what they're doing since its inception.

    Not sure what you're on about here... the "sigh", as you put it, must mean you're upset over something?

    This is a topic of resolution and scaling. Just wondering why you would say it (AI upscaling) isn't needed if it obviously works.
     
  4. brunok6g

    brunok6g Guest

    Messages:
    4
    Likes Received:
    2
    GPU:
    sli titan x pascal
    FSR generates a lot of motion blur; I'd rather lower the resolution manually, so it doesn't make sense to me. For consoles, which are already blurry, it might work.
     

  5. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    Man.... Just stop. It's becoming so comical I'm having issues working over here....

    You must have some serious white papers or some special software sauce you're sitting on that none of us and the rest of the regular world truly knows about.
     
    cucaulay malkin likes this.
  6. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    also, is comparing the left and right side of a still frame really "nitpicking"?
    dafuq
    what else am I supposed to do?

    exactly my point
    you should have
    cause you omitted half the story
    stop crying like everyone is being rude to you when you're both wrong and condescending at the same time
     
    Last edited: Jun 3, 2021
  7. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,449
    Likes Received:
    3,128
    GPU:
    PNY RTX4090
    What people seem to be forgetting is the openness of this technology. AMD's master plan is to kill off DLSS and replace it with an open standard that supports everything from the past few generations (consoles, desktop, laptops, tablets, phones, etc).

    Console support for AMD is huge for this: a PC port will now most likely carry over support for FSR, so why would devs waste time and money implementing DLSS when only 2 generations of hardware support it? Sure, it gives better IQ thanks to ML, but time is money, and unless Nvidia codes it for the devs themselves or pays them to code it, devs will pass it up. If you can implement something that supports 5+ generations of PC hardware and 2 generations of console hardware, is said to be easier to implement, and still gives a great performance uplift, you as a dev would be crazy not to use it over the competition.

    Would you rather use DLSS and only support 2 generations of hardware or would you rather open up your game sales to all those RX580/GTX1060 users as well?

    I think AMD will use some form of ML hardware in maybe RDNA3 or 4 that will increase the IQ and the performance of FSR. This will lead to Nvidia having to repurpose those tensor cores to do the same thing and once again the industry will benefit from everyone being able to use and enjoy the feature.

    What I think the initial release of FSR is for is those people on older hardware, laptops and tablets especially. Are people also forgetting that AMD will now be powering the new Samsung smartphone as well!? FSR will do wonders for mobile gaming, and once again it means even broader hardware support.

    I hope the strategy works as monopolies are not something we need, especially right now with how the market is.

    It makes me laugh really; it's like the Digital Foundry videos, where they've made a career out of zooming into a game's image at 400% and comparing each pixel... like, who actually does this during normal gameplay?

    It's called marketing; both companies do it. Much like Nvidia did with G-Sync, using bespoke hardware to achieve basically the same outcome as the open standard VESA Adaptive-Sync, and we all know how well that went. Same with PhysX being locked to their hardware: the technology has basically just been implemented at the engine level and now everything can run it. Same with the GameWorks package: the majority of the effects, which crippled performance even on Nvidia's own hardware, are now more or less inside the top game engines and run on everything. The same will eventually happen with DLSS too: it will get replaced, and those tensor cores will be repurposed to improving FSR on Nvidia hardware. Or the tensor cores will go away, FSR will be adopted throughout the industry as the standard, and we will probably get better rasterization and RT performance instead, or smaller, more affordable chips.
     
  8. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    there has to be an incentive to enable this
     
    CPC_RedDawn likes this.
  9. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,449
    Likes Received:
    3,128
    GPU:
    PNY RTX4090
    to enable what? I mentioned a lot of stuff lol
     
  10. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    sorry
    fsr
    numbers alone don't mean anything
    the technology has to be good to begin with
    better to have something good in 2 games out of 10 than something you'd never enable in 9 out of 10
     
    CPC_RedDawn likes this.

  11. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,449
    Likes Received:
    3,128
    GPU:
    PNY RTX4090
    I get what you mean. I think the incentives would be:
    it's easy to implement,
    it gives a good performance uplift,
    and most of all it supports more hardware, leading to more players getting a smoother experience overall, and devs and publishers making more money.

    Sure, it has to have good IQ as well, but that is something we will have to wait and see on, and give it time; DLSS didn't come out of the gate looking all that good either. Looking at screenshots compressed by YouTube and coming to a conclusion is just idiotic. We need to see it in motion for ourselves, and this time we all get to test it out :)
     
    Richard Nutman likes this.
  12. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    it's not ideal, but it's far from idiotic

    how did you judge dlss 1.0 then? you're saying it wasn't good. based on what?

    cause it seems to me all the people in this thread making a fuss about comparing a video only ever saw dlss 1.0 in hardware unboxed videos.
    and if the video shows differences, are they gonna disappear in the game?
     
    Last edited: Jun 3, 2021
  13. pharma

    pharma Ancient Guru

    Messages:
    2,496
    Likes Received:
    1,197
    GPU:
    Asus Strix GTX 1080
     
  14. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    I wonder if Unreal Engine 5's improved temporal reconstruction/upsampling that they talked about in some of Digital Foundry's recent content would be a better comparison than DLSS 2.1 if AMD's solution doesn't use any form of AI data driven analysis and if AMD's solution lacks the accelerant of the Tensor cores. Do we know whether AMD's solution requires Temporal Anti-Aliasing and/or good motion vectors like DLSS does?

    If anything, perhaps it would be more similar to DLSS 1.9 as it was dubbed given that version did not run on the tensor cores according to DF's Control videos at the time.

    There is a lot you can probably do without the AI-driven approach, sure, but like Alex from the DF team said, I would be surprised if at launch AMD's approach matched the quality of DLSS 2.1, since it seems to have a few more disadvantages from the starting line given how generalized it is. Of course, it arguably doesn't need to match DLSS 2.1 in quality, given its biggest advantage is that it can run on older GPUs/most GPUs people are using now instead of requiring specific GPU hardware, but it's maybe pointless to speculate until real testing comes out for the feature.

    It took quite a long time after launch for DLSS to make it to the "good" state it's in now -- frankly I wasn't happy with the tech until it reached the 2.0 stage, but 2.1 in something like Metro Exodus Enhanced is very impressive to my eye and I'd love to know more about how it really works. We know a couple things though about DLSS 2.1:

    1) AI driven data analysis / compares low resolution against reference super sampled for the reconstruction
    2) Needs good motion vectors (as per Digital Foundry) to look its best
    3) Runs well on the Tensor cores which are well suited to that sort of reconstruction
    4) iirc I read somewhere before (or heard in a DF video, I don't recall) that one of the first things they did was teach the AI to identify and resolve aliasing in the image so in Nvidia's version their AI model is probably doing a lot of work.
    5) It's doing some form of temporal accumulation and may just require that TAA is already supported in the engine.

    It "sounds like" AMD isn't taking the AI data analysis approach and the generalized approach means they lack the tensor cores, but I'm not really sure yet what to expect and we'll have to wait for official testing seems to me. They could still do what UE4 and UE5 are doing with temporal accumulation and upsampling, but good TAA implementations are already doing that, no? In any case, it'll be nice to have a good upscaler since upscaling has always looked like crap on flat panels imo.
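    A minimal sketch of the temporal accumulation idea from point 5 above: reproject last frame's result along the motion vectors, then blend it with the current sample. Motion here is whole pixels only and there is no validity/rejection logic, so this is illustrative of the general technique, not how DLSS or any engine's TAA actually implements it.

    ```python
    def reproject(history, motion_x, motion_y):
        """Fetch last frame's values shifted by per-frame motion (whole pixels)."""
        h, w = len(history), len(history[0])
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                sx, sy = x - motion_x, y - motion_y
                if 0 <= sx < w and 0 <= sy < h:
                    out[y][x] = history[sy][sx]
                # else: history is off-screen, leave it invalid (0.0)
        return out

    def temporal_accumulate(current, history, alpha=0.1):
        """Exponential blend: small alpha keeps more history (more stable,
        but more prone to ghosting when the reprojection is wrong)."""
        return [
            [alpha * c + (1 - alpha) * h for c, h in zip(crow, hrow)]
            for crow, hrow in zip(current, history)
        ]
    ```

    The ghosting trade-off people complain about falls straight out of `alpha`: weight history heavily and you accumulate detail over many frames, but stale samples linger wherever the motion vectors don't match what actually moved.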
     
    Last edited: Jun 3, 2021
    Venix and Luc like this.
  15. ChristGeli

    ChristGeli Guest

    Messages:
    7
    Likes Received:
    2
    GPU:
    VEGA FE
    1080p for me until 4K at native resolution goes mainstream.
     
    AsiJu likes this.

  16. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    I am impressed with DLSS 2.1, but there's certainly still a lot of room for improvement seems to me. I'm looking forward to seeing how these reconstruction techniques improve down the line.
     
    Luc likes this.
  17. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    That's really interesting to learn about the different uses for Tensor cores, thanks. Can they be used for RT denoising or anything like that? In the early days of the RTX cards, I'd thought I heard that, but I'm not sure if it ever materialized or if I'm remembering wrong. Obviously they're used to accelerate DLSS, but I'm not familiar with their functions in games beyond that.
     
  18. Dribble

    Dribble Master Guru

    Messages:
    369
    Likes Received:
    140
    GPU:
    Geforce 1070
    I get the feeling this mostly exists to look good in FPS numbers in reviews. A review of a game will show charts for Quality DLSS and Ultra Quality FSR, and all of a sudden on those charts Radeons are keeping up with GeForces and it looks great. The fact that the quality is in no way comparable will be hidden in the small print. For actual serious use, chances are the upscalers built into game engines (e.g. UE5) work better. At some point AMD will come out with new cards with AI cores and a proper DLSS competitor, and this version of FSR will be quietly forgotten.
     
    kanenas likes this.
  19. kanenas

    kanenas Master Guru

    Messages:
    512
    Likes Received:
    385
    GPU:
    6900xt,7800xt.
  20. Whiplashwang

    Whiplashwang Ancient Guru

    Messages:
    2,460
    Likes Received:
    397
    GPU:
    RTX 4090 PNY
