AMD FidelityFX Super Resolution (FSR) to launch this year

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 18, 2021.

  1. Kosmoz

    Kosmoz Member Guru

    Messages:
    119
    Likes Received:
    72
    GPU:
    GTX 1080
    It would have been great, but look at it this way:

    1. Not many RX 6000 users out there anyway...
    2. They want to make it universal for all games and platforms, instead of lying the way nvidia did/does: announcing a new tech and promising it will be great and everywhere... both RTX 1.0 and DLSS 1.0 were practically non-existent at launch despite being advertised as launched. Only now, at the 2nd gen, can you say they work decently, and there still aren't enough games for them to be widespread... 2 years later.

    So basically people are mad that AMD did not lie the way nvidia does and claim it's working from launch when it's not... funny.
     
    Devid likes this.
  2. Denial

    Denial Ancient Guru

    Messages:
    13,323
    Likes Received:
    2,823
    GPU:
    EVGA RTX 3080
    All I'm saying is that this was telegraphed for half a decade - especially in 2017, when Nvidia literally showed the foundation of DLSS at SIGGRAPH. Yet now, 4 years later, AMD still hasn't decided if it's even going to use AI in the graphics pipeline.

    Also FWIW, AMD itself said they were already looking into a DLSS alternative in January 2019 utilizing DirectML. So I don't know how you can say they would have started in Q2 2020.
     
    Last edited: Mar 18, 2021
    PrMinisterGR likes this.
  3. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,816
    Likes Received:
    2,241
    GPU:
    HIS R9 290
    That makes even less sense. A half decade ago, AMD was nearing bankruptcy. Why would it be a good idea for them to dump money into what was effectively a rumor at a time when GPU-accelerated AI was still in its infancy? Seems rather unfair to say they should have known better in hindsight.
    I can't emphasize this enough: DLSS wasn't worth using until 2020. To put this another way: if PhysX had been a success, would it be fair to say AMD screwed up by not investing in GPU-accelerated physics? The way I see it, the only reason people are irritated about AMD being behind is because Nvidia [eventually] succeeded in making DLSS a legitimately good feature. If DLSS had never gotten any better than v1.0, I feel fairly confident we would not be having this discussion.

    So all that being said, AMD didn't have a compelling reason to dump resources into this until Q2 2020. AMD took a gamble on whether Nvidia would succeed in making something worthwhile, and they lost. Now they're playing catch-up, and it's not realistic for them to produce a worthwhile competitor in less than 2 years. Keep in mind, if this technology were so promising, I'm sure Sony and MS would have been pushing AMD much harder to get it done. Both of them promised 4K-capable consoles, and this is a way to deliver that.
     
    CrunchyBiscuit likes this.
  4. TimmyP

    TimmyP Master Guru

    Messages:
    609
    Likes Received:
    27
    GPU:
    RTX 3070
    Hi denial!
     

  5. kapu

    kapu Ancient Guru

    Messages:
    4,711
    Likes Received:
    409
    GPU:
    Radeon 6800
    I have a BIG LOL at people who own an RX6800/6900 and complain about the lack of RT/DLSS. I mean really, what did you expect? That AMD would jump forward in time and develop something like this in a year or less? They also need to make it work on consoles.
    Please... RDNA2 was built with raster performance in mind, nothing else. Want DLSS/RT? Go nvidia. End of story.
     
    PrMinisterGR likes this.
  6. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,783
    Likes Received:
    1,135
    GPU:
    2070 Super
    Yes, fine.
    But then why isn't AMD telling us just that, instead of teasing FidelityFX Super Resolution?
    Because after teasing it and promising this and that, it turns out they don't even know how to begin.

    We are AMD
    We have the name: FidelityFX Super Resolution
    And we have the aim: Perf/IQ gain by upsampling.
    But fuk if we know how to do it ingame.
     
  7. UnrealGaming

    UnrealGaming Ancient Guru

    Messages:
    2,960
    Likes Received:
    290
    GPU:
    -
    I have a name for it: it's called a hyperloop.
     
  8. Denial

    Denial Ancient Guru

    Messages:
    13,323
    Likes Received:
    2,823
    GPU:
    EVGA RTX 3080
    The problem with this is that plenty of people were saying it in 2018 when DLSS launched, without the power of hindsight. It's now 2021 and AMD still hasn't committed to AI in the graphics pipeline.

    So going back to @Devid's post - I agree with him. They could have had it ready for RDNA 2.0, and they should have. They didn't because, like I said, they invested elsewhere... but that was a gamble they took. We're both saying we disagree with the gamble. That's it.
    Why is this true? Why does it also need to work for consoles? I keep seeing people say this but I don't get why their solution can't be for PC only.
     
    PrMinisterGR likes this.
  9. UnrealGaming

    UnrealGaming Ancient Guru

    Messages:
    2,960
    Likes Received:
    290
    GPU:
    -
    At the end of the day, if we end up with an upsampling solution that's better than what's currently out there, easy to implement, and works across hardware vendors and different generations of hardware, it's a net benefit.
     
    PrMinisterGR and Denial like this.
  10. Aura89

    Aura89 Ancient Guru

    Messages:
    8,161
    Likes Received:
    1,274
    GPU:
    -
    This comparison between Intel and Nvidia is the most illogical comparison I've ever seen.

    While Intel has, for the past 5+ years, done basically nothing meaningful with its architecture and just released new CPUs on new sockets with minimal performance increases, Nvidia has been releasing new GPUs with adequate (especially compared to Intel) to very good performance increases, as well as providing new technologies such as ray tracing, DLSS, etc. They put innovation into their cards again and again.

    Does that mean I hope AMD doesn't come out with GPUs that far exceed expectations and dethrone Nvidia from the top? Absolutely not, that'd be great; competition is and always will be good.

    But nvidia is not Intel, and there is zero comparison there.
     
    Maddness, PrMinisterGR and Fox2232 like this.

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,816
    Likes Received:
    2,241
    GPU:
    HIS R9 290
    Well yeah... all these companies have people hopping aboard their hype trains. Y'know how much people hyped up Faildozer too, right? I'm sure you're well aware how much hype can be blown out of proportion and lead to underwhelming results (like DLSS 1.0). Just because people say something is going to be good before release doesn't mean it will be.
    Devid acknowledged "it's easy to say this", because hindsight is always 20/20. But to say things like "they had plenty of time to work on this" or "they should have taken this more seriously" is ignorant, and could only be said because Nvidia won.
    I'm sure neither of you would disagree with the gamble had DLSS remained what it was at 1.0. That's the important distinction here.

    I know this isn't your response to me but I'll respond anyway:
    Sony and MS claimed 4K support. That ain't happening on the hardware they've got now if they expect to have raytracing and better overall fidelity than XB1 and PS4. AI supersampling is a solution to this.
     
    Last edited: Mar 19, 2021
  12. Shakey_Jake33

    Shakey_Jake33 Master Guru

    Messages:
    280
    Likes Received:
    30
    GPU:
    GeForce RTX 2060 OC
    There are an awful lot of valid criticisms that could be thrown at Nvidia, but complacency isn't one of them. In fact, the size of the leap from Turing to Ampere is precisely because they weren't complacent about what AMD is working on.

    The onus is on AMD to achieve price and feature parity. They've made huge strides with RDNA 1/2 but they're not there yet.
     
    Maddness and PrMinisterGR like this.
  13. itpro

    itpro Master Guru

    Messages:
    959
    Likes Received:
    515
    GPU:
    Radeon Technologies
    Monarchs, emperors, royals are destined to be dethroned sooner or later. Fate is inevitable. Change is needed throughout history. Championships are shareable. For the sake of consumers, I hope Nvidia loses something; personally, I do not owe anything to JHH, unlike most PC users bowing to him.
     
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,804
    Likes Received:
    3,359
    GPU:
    6900XT+AW@240Hz
    It is the last thing I would be waiting for.
    Yesterday, a few hours before you posted, AMD's new driver added "Anti-lag" and "Radeon Boost (VRS version)" for DX12 titles.

    I think they are shuffling things around a bit. They are clearly moving into the low-level API space now, and that means they have people who can figure out how to do these things and get them implemented.
     
  15. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,464
    Likes Received:
    115
    GPU:
    GTX Titan Sli
    Presumably because that's the only option that almost guarantees their solution will be widely supported. If it's just for PC, the risk is it won't be supported unless AMD pays for it.
     

  16. waltc3

    waltc3 Maha Guru

    Messages:
    1,255
    Likes Received:
    414
    GPU:
    AMD 50th Ann 5700XT
    People should remember that DLSS 1.x was so poor nobody wanted or used it. It took nVidia quite some time to come up with what they have now in DLSS 2.0--over a year, IIRC. I would imagine that what AMD finally unveils will likely be better, because it is newer and based on newer hardware, which was not the case for nVidia with DLSS 1.x (upon which DLSS 2.0 is based).

    If I can ever get my 6800XT at long last (been trying to buy one since November last year!), I'll be able to run every game at 4K, maxed image quality, with a substantial frame rate, so I probably won't use what AMD comes up with either. We shall see. As it is, if I need substantially more frame rate right now with my 5700XT at 4K, I can just drop down to 1440P, which completely solves that problem in the 1-2 games in which I might occasionally do that.

    I actually think AMD is way ahead of nVidia in FSAA, unlike Mr. Bonk...;) Also, I'm running everything at 4K right now, so AMD VSR is another one of those options I don't need/want presently--although it was nice when I was running at 1080P/1200P, I will say.
     
  17. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,849
    Likes Received:
    243
    GPU:
    EVGA GTX 1080Ti SC
    Nvidia's tensor cores are generic matrix-multiplication units, which is what makes them suitable for machine learning (and specifically neural network) workloads; they're not purpose-specific. The real challenge is the size of the neural network used for inference and making sure predictions run fast enough that the whole process doesn't end up taking longer than just rendering at the target resolution.

    AMD would face a similar challenge. They won't necessarily have a better algorithm just because their equivalent of the tensor core is newer; it comes down to how those cores perform, not what they do in particular.
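    To make that budget concrete, here's a minimal back-of-the-envelope sketch in Python (the millisecond figures are invented purely for illustration, not measured on any GPU):

        # Hypothetical per-frame costs in milliseconds -- illustrative only.
        render_native_ms = 14.0      # render directly at the target resolution
        render_lowres_ms = 8.0       # render at the lower internal resolution
        upscale_inference_ms = 2.5   # run the upscaling network on that frame

        total_upscaled_ms = render_lowres_ms + upscale_inference_ms

        # Upscaling only pays off if low-res render + inference beats native rendering.
        if total_upscaled_ms < render_native_ms:
            saved = render_native_ms - total_upscaled_ms
            print(f"Worth it: {saved:.1f} ms saved per frame")
        else:
            print("Not worth it: inference eats the gain; just render natively")

    If the inference step is too slow for the cores running it, the "saved" time goes negative and the whole exercise is pointless, which is exactly the constraint both vendors have to hit.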
     
  18. terror_adagio

    terror_adagio Member

    Messages:
    42
    Likes Received:
    13
    GPU:
    7950 GTX 512MB
    If they are targeting the whole year, that means they are pretty early with this and are probably having issues.
     
  19. Venix

    Venix Ancient Guru

    Messages:
    1,626
    Likes Received:
    632
    GPU:
    Palit 1060 6gb
    It will be interesting when they release it. People compare it to DLSS and expect it to look like DLSS 2; if the first iteration comes out and the image quality is closer to DLSS 1.x, then AMD is in for a big roasting. We will see when and if it comes out.
     
  20. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,531
    Likes Received:
    880
    GPU:
    6800XT Nitro+ SE
    The "toggle" switch won't happen, this is meant to be incorporated within the game it self. It won't be a "switch on for 2x performance".

    It's been developed closely with Microsoft and partly with Sony, but rumours point to this being a Microsoft and AMD collaboration. I expect all first-party Microsoft games to use this. Maybe something like offloading the AI processing to Azure cloud servers?? That would mean it requires an internet connection to function, and then you have latency penalties too.

    Freesync wasn't really AMD's technology; it was branding for their support of VESA's "Adaptive Sync".

    This is pretty much spot on. Nvidia took a risk and a gamble with DLSS. They needed another way to use their tensor cores, which were initially made to do AI processing for the automotive industry in self-driving cars and other AI-based workloads in cloud computing.

    Personally I hate DLSS, even the current version. For some reason I am super sensitive to upscaling of any kind, and the image looks way less detailed to me than native resolution, with a softer, less sharp look. Don't get me wrong, it has improved a lot, but it's still not there yet.

    People seem to forget both AMD's and Nvidia's past technology failures and bow down to anything they shovel out the door, even though 9 times out of 10 it all disappears within 5 or so years. What interests me more is game engine improvements, GPU performance improvements (at native res), API improvements, etc. Things like mesh shaders and variable rate shading are what I think will truly push next-gen games, probably allowing them to run things like RT without tanking the fps at native resolution.
     
