AMD FidelityFX Super Resolution 2.0 - Deathloop preview

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 12, 2022.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,317
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
    fantaskarsef, Maddness, Embra and 2 others like this.
  2. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
I've been watching comparison videos and reading articles (including this one); it seems like this is a huge leap over FSR1 but still falls a little short of DLSS in motion. That being said, I don't know how Nvidia can justify dedicating die space to tensor cores when the quality of DLSS is only 5-10% better than this and basically no other feature uses them. Either Nvidia needs to put more value-add into tensors or they need to go a different route with DLSS. Perhaps we'll get a DLSS 3 with larger upgrades, or some other features that utilize tensors, with next gen, but at this point I'd say AMD has parity here.
     
    Last edited: May 12, 2022
    moo100times, Silva, Legacy-ZA and 4 others like this.
  3. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
Perhaps not as good as DLSS, but its ease of implementation and wide compatibility make up for the loss in quality. Much like with DLSS, if you've got your face right up to the screen, not moving, and are constantly switching between full detail and upscaling, you will notice a difference. But if you actually play the game, such subtle details will go totally unnoticed. In a lot of games, FSR2 or DLSS2 is perfectly usable. In other games, they're unacceptable. It all depends on what you need. The same can be said of all settings that lower game detail, whether it's shadows, reflections, texture detail (only when VRAM is maxed out), anti-aliasing, resolution, etc. For some games, lowering such detail settings makes an insignificant visual difference but a major performance improvement. For some of those settings the loss in detail is unacceptable, while for others it goes unnoticed.

    As far as I'm concerned, FSR and DLSS aren't supposed to be on by default. They're just another way of lowering visual fidelity for more performance, just like all other graphics settings. It's weird to me how much people make a fuss about this as if they're supposed to use it, or as if it's supposed to have a 1:1 level of detail.
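To put rough numbers on "lowering visual fidelity for more performance": FSR 2.0's quality modes are defined by per-dimension render-scale factors. The ratios below are the ones AMD publishes for FSR 2.0; the helper function and its rounding are just an illustrative sketch, not the SDK's actual API.

```python
# FSR 2.0's per-dimension render-scale factors, as published by AMD.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(display_w, display_h, mode):
    """Internal render resolution the upscaler reconstructs from.

    Rounding to the nearest pixel is a simplification of what the
    real SDK does.
    """
    scale = FSR2_MODES[mode]
    return round(display_w / scale), round(display_h / scale)

# At a 4K output, Quality mode renders internally at 1440p:
print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

So each mode is, in effect, just a different position on the same fidelity-for-performance slider as any other graphics setting.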
     
    Last edited: May 12, 2022
  4. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,578
    Likes Received:
    10,607
    GPU:
    RX 6800 XT
The HU review showed that DLSS 2.x has a bit more ghosting than FSR 2.0, but FSR 2.0 has slightly worse reconstruction in motion.
So it's a bit of a trade-off.
     
    BlindBison likes this.

  5. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,535
    Likes Received:
    2,974
    GPU:
    RX 6750XT/ MAC M1
That this is open source, fully software-based, works with practically any GPU, and doesn't require dedicated die space is a pure win. Well done, AMD.
     
    moo100times, Kaarme, Silva and 6 others like this.
  6. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,390
    Likes Received:
    3,064
    GPU:
    PNY RTX4090
This is what I have been saying for years. Nvidia could have easily done this themselves with their insane R&D budget and huge workforce. But no, they create needless custom hardware and then charge the customer for the privilege of using it, trying to create an even larger monopoly on the market. Much like they did with GSYNC and their custom hardware inside each monitor when they could have just supported Adaptive Sync, or buying PhysX from Ageia and locking it to their hardware only. Then community members were able to enable PhysX on AMD/ATi cards with modded drivers, proving again that Nvidia lied when they said it was only possible on their hardware. Now PhysX is used in tons of game engines.


Nvidia make exceptional GPUs, but their business practices are just ludicrous. AMD aren't innocent either; they've done a lot of shady crap too, just to a lesser extent, and their immense support of open-source software benefits the industry as a whole.

    EDIT:

    ***grabs popcorn***.....
     
  7. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090

    May end up seeing this down the line.

    FSR 2.0's quality mode definitely delivers here, much better than 1.0. Good job AMD.
     
  8. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,251
    Likes Received:
    1,758
    GPU:
    7800 XT Hellhound
It's OK for 4K, but with a 1440p screen I still find it too rough vs. DLSS. And that is with DLSS Balanced (+ReShade CAS) vs. FSR 2.0 Quality. FSR 2.0 Balanced already looks very bad at a 1440p target resolution.
    Looks like DL isn't all about marketing BS.
     
  9. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
Idk lol, there's some rewriting of history here. G-Sync was out nearly a year before FreeSync and years before Adaptive Sync. It provided more features than FreeSync even two years after FreeSync released, and it also provided a more standard platform: you knew what you were getting when buying a G-Sync monitor. FreeSync feature support was all over the place back then.

PhysX, AFAIK, was never enabled on AMD/ATi cards. You may be thinking of when Nvidia locked PhysX on its own cards whenever an AMD card was in the system - that was bypassed, but I don't think anyone ever got PhysX to actually run on the AMD card itself. I could be wrong. Either way, PhysX has been completely rewritten something like four times, so it now running on the CPU in various games is kind of meaningless.

Tensors are a product of HPC pushed down into consumer GPUs. Most of that R&D was paid for by Nvidia's HPC business. Either way, I don't think there's a reality where Nvidia releases a non-tensor card that's cheaper. We would have just gotten the same cards, with a DLSS that doesn't use tensors, for the same price. Also, while tensors aren't used for much now (DLSS, denoising in RT, microphone cleanup, some picture enhancements, etc.), there's definitely a massive advantage to having them for potential future applications. Ignoring obvious improvements to current features (who's to say DLSS 3.0 won't massively increase performance?), DirectML is starting to take shape and it can leverage those cores. I think having the hardware creates a potential that AMD doesn't have. Whether or not we'll see something that utilizes that potential is a different story... but Nvidia is king of value-add, so I'm sure it's coming.

Also, Nvidia does have a similar solution: NIS, which is also open source, but it's not a temporal solution - it may become one in response to this, though. Dunno.
     
  10. TimmyP

    TimmyP Guest

    Messages:
    1,398
    Likes Received:
    250
    GPU:
    RTX 3070
Has nobody noticed yet that FSR2 Quality (all FSR2 modes, actually) either kills particle effects, or they've been disabled? FSR2 Quality shows no sparks, faeries, or added flames at all at 4K. It's impressive for static geometry, I'll give it that.
     

  11. Krizby

    Krizby Ancient Guru

    Messages:
    3,047
    Likes Received:
    1,705
    GPU:
    Asus RTX 4090 TUF
DLSS and FSR 2.0 allow for playable 4K gaming on mainstream GPUs; I can assure you that 4K DLSS/FSR 2.0 looks way better than 1440p native :)
Sure, if you plan to stick with 1080p forever, then yeah, DLSS/FSR 2.0 aren't for you.
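The arithmetic behind that claim: at a 4K output, Quality-mode upscaling shades roughly the same number of pixels per frame as native 1440p (assuming the usual 1.5x per-axis quality ratio), while the temporal pass accumulates samples toward the full 4K output. A rough sketch, ignoring the upscaler's own fixed per-frame cost:

```python
# Shaded pixels per frame: native 1440p vs. the internal render of
# 4K "Quality" upscaling (1.5x scale per axis => 1.5^2 fewer pixels).
native_1440p = 2560 * 1440                     # 3,686,400 pixels
internal_4k_quality = (3840 * 2160) / 1.5**2   # same pixel count

print(int(internal_4k_quality) == native_1440p)  # True
print(3840 * 2160)                               # 8294400 pixels in the output
```

So the shading cost matches 1440p native while the reconstructed image targets a display with more than twice the pixels, which is why 4K Quality can come out ahead.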
     
    BlindBison, Undying and Aztec2Step like this.
  12. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Why does this PhysX BS still exist?

ATi chose not to work with Nvidia on PhysX, which is why Nvidia locked them out. ATi chose to partner with Havok for their physics implementation, which bore no fruit. It's a fact that ATi themselves were against PhysX and hardware-accelerated physics in the first place, whereas Nvidia saw the potential and made it mainstream. Mainstream enough that even consoles used PhysX. Your version of PhysX history is just wrong.

What we really see with Nvidia is that they set the standard and ATi/AMD are always playing catch-up. DX12 is a testament to what happens when you let AMD push their agenda. You get BS like async compute, which AMD were more than willing to peddle as the next big thing, when in reality it was a last-minute change to the DX12 spec that Nvidia weren't prepared for.

The fact that we now have ray tracing, even in a hybrid form, is down to Nvidia. Now AMD has RT. DLSS was also Nvidia; now AMD has FSR. If it wasn't for Nvidia, those features wouldn't even exist.
     
  13. =GGC=Phantomblu

    =GGC=Phantomblu Member Guru

    Messages:
    188
    Likes Received:
    65
    GPU:
    Radeon RX 6900XT



Do you ever wonder why AMD never took Nvidia's path? Do you really think AMD is that stupid? I'd remind you that AMD works hand in hand with Microsoft on the consoles, while Nvidia has only worked on one, if I'm not mistaken, and AMD is the only one that fully supports open source, with FreeSync, FSR and so on. Explain to me why their so-called gaming range of cards doesn't have the deep learning hardware Nvidia uses for DLSS... they wanted a more friendly approach.
     
  14. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    It's nothing to do with intelligence, it's called lack of resources, otherwise known as lack of money.
     
  15. =GGC=Phantomblu

    =GGC=Phantomblu Member Guru

    Messages:
    188
    Likes Received:
    65
    GPU:
    Radeon RX 6900XT

You believe that too? If you go to Khronos you'll see that the developers work on implementing the standards together, so the knowledge is more or less the same... If AMD's OpenCL support is stuck at 2.0 while Nvidia's drivers are at 3.0, it's not because AMD is chasing them... More likely they believe certain areas still have room left to explore...
     

  16. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
Because it didn't have the money to build a multi-billion-dollar AI software ecosystem. Pretty much all the technology for RT/DLSS is just a consumer variant of HPC/professional technology. If you go and look at any of the white papers for RT, DLSS (a convolutional autoencoder), etc., it's pretty much all developed while looking for solutions to datacenter problems, or in RT's case, self-driving cars.

    I don't recall stating AMD is stupid? In fact I praise them in my first post in this thread.

    I'm not sure what this has to do with anything. Microsoft isn't really known for being a company that's innovative. Most of these features are developed by AMD/Nvidia and then those companies work with Microsoft on implementing broader support through APIs.

    Okay?

AMD does have acceleration for tensor operations on its consumer cards... it just doesn't have discrete hardware for it, and for all we know that changes next gen - CDNA 1 and now 2 have Matrix Cores. It's just not known if they will hit consumer variants. They've also been working with Microsoft on developing AI upscalers. Their efforts just haven't borne any real products.
     
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    You kinda missed my point...
DLSS and FSR are just one of many methods of sacrificing some detail in order to attain higher refresh rates or higher resolutions (or both), which is fine. The same goes for any other graphical setting you decrease. Some settings have more of a performance impact than others; some settings have a greater visual impact than others. It all depends on the game and personal preferences. In most cases, I would argue DLSS and FSR yield a much greater performance gain than fidelity loss.
     
  18. Undying

    Undying Ancient Guru

    Messages:
    25,206
    Likes Received:
    12,611
    GPU:
    XFX RX6800XT 16GB
Less ghosting with FSR 2.0:

[IMG]
     
  19. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
You could look at it differently and say that DLSS is a method of getting more detail at playable framerates; running those heavily ray-traced games is not a light task.
Just saying, the increase is in some cases higher than the decrease that DLSS/FSR brings along.
     
  20. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
What version of DLSS is this game using now? I know when it shipped most people complained about the ghosting, and manually updating fixed it, but did they ever officially update the DLL?
     
