New Upcoming ATI/AMD GPU's Thread: Leaks, Hopes & Aftermarket GPU's

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

  1. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,007
    Likes Received:
    230
    GPU:
    6800 XT
    I just ordered myself an X570 board because I am going to sell Valhalla to a friend and get the cashback. And then I am selling my old X370 Asus Crosshair VI; I might still get a decent 50€ for it.

    Might end up paying a total of 40€ for the new motherboard. As I do have a 3900X, I feel it's a decent investment, and I want another NVMe drive. Hopefully they release SAM for Zen 2.
     
    Embra likes this.
  2. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,587
    Likes Received:
    1,022
    GPU:
    2070 Super
    Recently, Rick Bergman hinted at a new RDNA 2 GPU feature dubbed FSR, or FidelityFX Super Resolution, which happens to be AMD’s response to Nvidia’s DLSS technique, though it will work in an entirely different way.

    Many have been wondering whether AMD will also use a similar AI approach for this super-sampling technique, but the answer to this is a “no”.



    Pity... because I think it could be mathematically proven that the best way (IQ/perf-wise) to do supersampling HAS to be AI-like.
    That is just a hunch which I don't know how to put precisely; even defining IQ is anything but straightforward, IQ being a subjective and multidimensional quantity being one of the obvious obstacles.
     
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,503
    Likes Received:
    3,194
    GPU:
    5700XT+AW@240Hz
    On the other hand, RDNA2 may prefer a more traditional approach to image quality. AA is memory-bandwidth intensive, and IF happens to have really high bandwidth.
    Sampler feedback and similar features can be used to actually understand the frame instead of analyzing it in a post-process phase.
    Basically, if the GPU knows exactly what matters in the frame composition, it may not need AI to put together the last fake frame with the current low-resolution frame.

    Variable rate shading is not there just for the sake of having an extra name on the feature list either. I do not think either camp should be using upscaling unless fps goes to hell.
    And when upscaling is used, it should be enabled dynamically, starting from the resolution at which the GPU barely exceeds the fps target.
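    The dynamic behaviour described above can be sketched as a simple feedback loop. This is a hypothetical illustration only; the function and parameter names are made up and do not correspond to any driver or engine API:

```python
# Minimal sketch of a dynamic-resolution controller: lower the render
# scale only when the GPU misses its frame budget, and raise it back
# when there is clear headroom. All names here are illustrative.

def update_render_scale(scale, frame_ms, target_ms,
                        step=0.05, lo=0.5, hi=1.0):
    """Return the render scale (fraction of native resolution) for the next frame."""
    if frame_ms > target_ms:          # missed budget: render fewer pixels
        scale -= step
    elif frame_ms < 0.9 * target_ms:  # comfortable headroom: add pixels back
        scale += step
    return max(lo, min(hi, scale))

# A 60 fps budget (16.7 ms) with a measured 20 ms frame lowers the scale.
print(update_render_scale(1.0, frame_ms=20.0, target_ms=16.7))  # 0.95
```

    In this scheme the GPU settles just above the fps target at the highest resolution it can afford, which is the "enabled dynamically, from the resolution at which the GPU barely exceeds the fps target" idea.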
     
  4. OnnA

    OnnA Ancient Guru

    Messages:
    12,554
    Likes Received:
    3,137
    GPU:
    Vega XTX LiQuiD
    ^^ Console approach?
     

  5. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,587
    Likes Received:
    1,022
    GPU:
    2070 Super
    No GPU will ever like traditional AA, mostly because it's dead. MSAA died, just like its father, full-screen AA, died many years ago.
    Meanwhile, post-process AA and temporal methods were born. Needless to say, all this happened for a good reason: if the old methods were good, they would still be around.

    Why shouldn't they be using upscaling if it improves the original image? I mean, if you don't like it, just turn it off; easy. Dynamically adjusting the rendering, that I like.
    My only wish for DLSS atm is that we don't have to DSR to 4K in order to have DLSS upscale from 2560x1440.
     
    fantaskarsef likes this.
  6. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,503
    Likes Received:
    3,194
    GPU:
    5700XT+AW@240Hz
    "IF" is the only important part of that discussion: image quality at a certain performance point. And for now, most of the claims are "DLSS = native", which is false.
    Same with "TAA = native". Still false.

    And as I wrote before: if DLSS were used to take the native resolution, upscale it by a certain percentage while doing all that cleaning and contrast enhancement, and then downsample back to native, we could have a very good image-quality improvement. And that could be dynamic as well.
    When the game performs above the fps target, spare computational power is used to upscale; or the game is rendered at higher than native resolution before it gets upscaled via DLSS and downsampled. The methods dynamic scaling can use are many, and it does not have to be only for a performance gain.
     
    OnnA likes this.
  7. Undying

    Undying Ancient Guru

    Messages:
    14,873
    Likes Received:
    3,974
    GPU:
    Aorus RX580 XTR 8GB
    I just wish Super Resolution would be available for all games, rather than waiting for developers to implement it in a few games like DLSS.
     
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,503
    Likes Received:
    3,194
    GPU:
    5700XT+AW@240Hz
    It will be available in games where AMD sees fit, the same way as their current dynamic-resolution crippling technique:
    if they see that a game has no issues with it, you can have reduced resolution while you move the mouse.
     
  9. Undying

    Undying Ancient Guru

    Messages:
    14,873
    Likes Received:
    3,974
    GPU:
    Aorus RX580 XTR 8GB
    That again means a very limited number of games, sadly. Some upscaling technique should be available in every game that supports raytracing.
     
  10. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,587
    Likes Received:
    1,022
    GPU:
    2070 Super
    Are you serious? NO ONE, literally no one even half-sane, is arguing that DLSS does NOT improve the IQ of its starting-resolution image. That's not even up for grabs.

    That's literally what DLSS is doing.
     
    AuerX and pharma like this.

  11. pharma

    pharma Ancient Guru

    Messages:
    1,660
    Likes Received:
    475
    GPU:
    Asus Strix GTX 1080
    AuerX, fantaskarsef and Noisiv like this.
  12. pharma

    pharma Ancient Guru

    Messages:
    1,660
    Likes Received:
    475
    GPU:
    Asus Strix GTX 1080
    That is what current work is focusing on: having the DLSS feature available in any game using TAA, without developer intervention. I don't know how this will turn out.
     
  13. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,503
    Likes Received:
    3,194
    GPU:
    5700XT+AW@240Hz
    I think that list is much longer than the DLSS list.
    @Noisiv : You quoted the following.
    I do not care that the resulting 4K image on a native 4K screen looks better than the source 1080p image which DLSS uses for upscaling on the same 4K screen.
    Because it is a native 4K screen, I expect image quality after DLSS to be equal to or better than native.
    And that can be achieved by taking native, running DLSS over it, and downsampling back to 4K.

    The same applies to native 1080p/1440p being DLSSed to whatever higher resolution and downsampled again to native. I prefer DLSS, with its small performance impact, to improve IQ above what we already have, instead of degrading IQ for a performance gain.
    The entire dynamic use is about not doing unnecessary degradation when the framerate is sufficient, and about getting extra image quality where there is spare computational power.

    And no, your statement "That's literally what DLSS is doing." is false. DLSS does not take the native resolution as its source image and improve from there. It takes a lower-than-native resolution and outputs native resolution.
    As far as I am aware, DLSS 2.0 needs "motion vector"-like data to combine the current low-res render with the previous DLSS frame. That could be obtained by the video encoder in the GPU. And considering that the encoder would be fed those low-res renders, it would not be overloaded if it only needs to look for motion vectors.

    But that shows the weakness DLSS 2.0 came with, and that's high motion, be it objects or the entire view. You have no source image to combine with in parts of the screen that were covered by a moving object or were out of view before the player started to turn.

    It kind of bets on the same thing AMD bets on with their motion-based resolution-degrading technique "Radeon Boost": the possibility that the player will not notice lower IQ in motion, and that IQ will be back to normal by the time camera movement stops.
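    The disocclusion weakness described above can be shown with a toy one-dimensional reprojection. This is purely illustrative Python, not how DLSS actually works internally; all names are made up:

```python
# Toy 1D illustration of temporal reprojection and disocclusion.
# Each pixel of the previous frame is shifted by its motion vector;
# positions that nothing maps onto have no history and must fall
# back to the current (low-res) frame alone.

def reproject(prev_frame, motion):
    """Shift prev_frame by per-pixel motion; None marks a disoccluded pixel."""
    out = [None] * len(prev_frame)
    for x, value in enumerate(prev_frame):
        nx = x + motion[x]
        if 0 <= nx < len(out):
            out[nx] = value
    return out

def resolve(current, history):
    """Blend history with the current frame; fall back where history is missing."""
    return [c if h is None else 0.75 * h + 0.25 * c
            for c, h in zip(current, history)]

prev = [1.0, 1.0, 1.0, 1.0]   # last frame (bright wall)
motion = [1, 1, 1, 1]         # camera pans: everything moves one pixel right
cur = [0.5, 0.5, 0.5, 0.5]    # current low-res render

history = reproject(prev, motion)
print(history)                # [None, 1.0, 1.0, 1.0] -- pixel 0 newly revealed
print(resolve(cur, history))  # [0.5, 0.875, 0.875, 0.875]
```

    Pixel 0 has no history to blend with, which is exactly the hole that appears at screen edges and behind moving objects during fast motion.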
     
    Last edited: Nov 24, 2020
    PrEzi and Undying like this.
  14. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,587
    Likes Received:
    1,022
    GPU:
    2070 Super
    Define native resolution, please.

    EDIT: Don't. We've been through this, but you're not listening. If your monitor is, say, 2560x1440, you can DSR to 4K, and Quality DLSS will use 2560x1440 as its starting point.
     
    Last edited: Nov 24, 2020
  15. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,503
    Likes Received:
    3,194
    GPU:
    5700XT+AW@240Hz
    And where is the setting which guarantees that 4K uses 2560x1440 as its starting point? The resolution divider is hidden behind the name of the quality mode in the game's settings, and you do not know the actual numbers.
    And where is the setting that chooses the final downsampling method?
    Those are hopes and overrides outside of the intended use, not part of the settings connected to DLSS.
    From day one, people have asked for an option to run DLSS in a quality mode which would guarantee native resolution as the starting point. That's currently not nVidia's intended use, which should be clear from the lack of this option even though it is just a resolution multiplier of 1.

    And we have not had the "define native resolution" discussion before. That would be a pretty stupid discussion to start.
     
    PrEzi likes this.

  16. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,587
    Likes Received:
    1,022
    GPU:
    2070 Super
    Yes, the internal resolution is hidden.
    Unless it isn't, like in Control.
    Or in Wolfenstein, where you dig around the cvars and find out that it's the same as in Control, i.e. 1440p->4K for Quality, 1080p->4K for Performance.
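    Those two data points are consistent with the per-axis scale factors NVIDIA documents for the DLSS 2.0 modes (Quality ≈ 2/3, Performance = 1/2). A quick arithmetic check (an illustrative script, not an NVIDIA API):

```python
# Per-axis render-scale factors behind the DLSS 2.0 mode names.
# The internal resolution is just the output resolution multiplied
# by the mode's factor on each axis (illustrative arithmetic only).

SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2}

def internal_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

    Which matches the 1440p->4K and 1080p->4K numbers dug out of Control and Wolfenstein.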

    DSR: you control its sharpness in the NV Control Panel.

    I don't know what that means.
    If it means that you have to use DSR to force a higher internal resolution for DLSS, then yes.
     
    AuerX and pharma like this.
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,503
    Likes Received:
    3,194
    GPU:
    5700XT+AW@240Hz
    And that apparently means hoping that other games behave the same, and that other resolutions use the same dividers.
    There should always be numbers instead of names.

    And what is your preferred downsampling sharpness? Hanning? Lanczos? Or some other method?
    Wait, does nVidia give you numbers instead of methods? Anyway, your preferred method is a 13-tap Gaussian.
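    For reference, the filters being name-dropped here are just different weighting kernels. A minimal sketch of a Lanczos window next to a Gaussian (illustrative only; real scalers normalise the taps so they sum to 1):

```python
import math

# Two of the resampling kernels mentioned above, as plain weight
# functions of the distance x (in pixels) from the sample centre.

def lanczos(x, a=2):
    """Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def gaussian(x, sigma=1.0):
    """Unnormalised Gaussian weight, the basis of an N-tap Gaussian filter."""
    return math.exp(-(x * x) / (2 * sigma * sigma))

# Lanczos goes slightly negative between its lobes (that is what
# preserves edge sharpness); a Gaussian never does (that is why it blurs).
print(lanczos(1.5) < 0 < gaussian(1.5))  # True
```

    The negative lobes are the practical difference behind the "sharpness" argument: Lanczos-family filters keep edges crisp at the cost of slight ringing, while a Gaussian trades detail for smoothness.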

    And you got the last thing exactly right: nVidia's intended use is different. It aims at other things than what people seeking image quality want.
    They should listen more often. Their engineers are top, but the team asking for features is rather absent-minded.
    And that's the only reason why there are still DLSS discussions, and why some nVidia users admit there is still a lot of room for improvement while others refuse to use it.
     
  18. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,587
    Likes Received:
    1,022
    GPU:
    2070 Super
    So your complaint is that the internal res is not clearly indicated? Same here. But let's be honest: you'd still find plenty of reasons to complain, because Nvidia.

    What's this got to do with anything? It's not like I have madVR to choose between several methods.

    There have been a few good articles on the net, but I haven't seen a good critique of DLSS 2.0 here, tbh. It's either hysterics or "omg it's magic", and it's far more subtle than that.
    No, I don't think that NV listening to users would do us any good, unless they filter out the noise.
    If they listened to those sharpness freaks, we'd still be on good ole MSAA, tanking perf, artifacting, and complaining that it's killing Radeons.

    If devs listened to those "OMG blurrrr" hysterics, we'd never have gotten a gaming and anti-aliasing masterpiece like this:
    https://www.gamasutra.com/view/news/268722/Limbo_dev_opensources_its_Unity_5_antialiasing_tech.php
     
    Last edited: Nov 24, 2020
    pharma likes this.
  19. pharma

    pharma Ancient Guru

    Messages:
    1,660
    Likes Received:
    475
    GPU:
    Asus Strix GTX 1080
    I hope they do not go this route in Cyberpunk. In Godfall they use a very minimalistic DXR 1.1 approach to RT, using hardware acceleration for screen-space shadows.
     
    Last edited: Nov 24, 2020
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,503
    Likes Received:
    3,194
    GPU:
    5700XT+AW@240Hz
    Re "because nV": false. AMD gets flak from me for their failures and deficiencies. But they are not pushed around over and over again (especially not in nVidia's section) as superior and "the way to go", therefore there is no reason to revisit the same discussion over and over again.

    Re "It's not like I have madVR to choose from several methods": there are better methods. And in the past I critiqued AMD's downsampling method too, along with its deficiencies and foolish implementation. If you include it as part of the solution, then it is not excluded from critique.

    As for the quality: I have no reason to repeat the roughly 20 posts I made about it in the last 7 days, some of which went into the details of comparison images, describing where it does a good job, where it improves, where it does a bad job, and where it fails. There is no reason to clutter this thread with 20 more posts just because you feel like it. (We have multiple "generic" gaming/videocard sections for that.)

    But I'll say this: the TAA you linked will clearly kill any and all fine detail, likely even more than generic TAA. Luckily, the game uses the right tool for the right job, because that game has no fine-detail geometry or textures. And it is a rather irrational example to support your "If they listened to those sharpness freaks".
    It is analogous to claiming that there are good pixelated games and therefore higher resolutions are not needed.
     

Share This Page