Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 24, 2023.
I'm guessing the only time someone would realistically enable it with 4 inserted frames is when the game can't be played on the hardware at all unless it's enabled, like say on a laptop.
I agree with all those paragraphs. To the last one I only want to add: I believe there are functions to "check what to render later"; I vaguely remember hearing about that at some point. Yes, it probably needs something like that to work decently.
Well, there would be input lag, but it wouldn't be stuttery: that would be the main reason to insert more frames between the "real" ones (the real ones would be unchanged, but the stutter would be removed).
As for making the artifacts more obvious... I'd beg to differ: if the "real" frames are maintained at the same rate, the artifacts would still appear in the same amount of on-screen time... but smoother.
That's taking into account that they manage to keep the real frames up and that latency isn't badly affected (though 20 fps of "real" frames would by definition carry a high amount of latency).
There is something that AMD can use to convince devs to use their proprietary tech, consoles!
Assuming consoles can use FSR3...
If the game can only register new player inputs once every 4 frames, it's going to feel stuttery to play.
Like I said, at ratios of 1:2 or 1:4, fake frames will be shown on screen for more time than real frames. So we could be doubling, or quadrupling, the time that frames with artifacts stay on screen.
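To put numbers on that claim, here's a minimal sketch of the arithmetic, assuming evenly paced frames and one real frame per group (the function name and the exact real:generated groupings are illustrative assumptions, since AMD hasn't specified how FSR3 groups frames):

```python
# Illustrative arithmetic only: what fraction of total display time is
# occupied by generated ("fake") frames, assuming even frame pacing.
def generated_share(real_frames: int, generated_frames: int) -> float:
    """Fraction of on-screen time taken up by generated frames."""
    total = real_frames + generated_frames
    return generated_frames / total

print(generated_share(1, 1))  # 1 real + 1 generated -> 0.5
print(generated_share(1, 3))  # 1 real + 3 generated -> 0.75
print(generated_share(1, 4))  # 1 real + 4 generated -> 0.8
```

So whether "1:4" means 3 or 4 generated frames per real one, generated frames would occupy roughly 75-80% of screen time.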
Seriously, you draw these far-fetched conclusions from literally two lines of code, neither of which says what you're suggesting.
That's not the correct use of the word "stutter". Input lag and stutter are two completely different things.
The decoupling of "real" vs "AI" would be smoother, so it could in fact be more complicated to discern artifacts on 1:4 than on 1:2 even though technically the "real" frame is shown less time. We won't know until FSR3 is released and analyzed. Heck, even a potential FSR4 could have "recoupling" of frames at some point. Who knows!
My bet is that if so many fake frames are used, it will feel like stutter, because too much time passes between player inputs.
How do you figure that it would be "more complicated to discern artifacts on 1:4 than on 1:2"?
Even at a 1:1 ratio it's possible to see some artifacts. But with so much frame time given over to fake frames, artifacts will be much more visible.
That doesn't make sense IMHO.
Because in the first frame, if there is any kind of artifact, it will be much more subtle, so the "transition" from "real" to "artifact" will be smoother, and thus maybe more eye-pleasing.
Again... we'll see when it comes out.
We have 1 frame with input. Then 3 frames with no input registered.
This is going to feel like bad frame pacing, with lots of micro stutter.
The first frame, being closest to the real frame, will be the one with the fewest artifacts. The second will have more, as there is less information to work with. And the third will be even worse.
And all of these frames combined will be on screen for a longer time than the real frames.
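The input-cadence point above can be sketched with some rough numbers, under the assumption (not confirmed by AMD) that player input only registers on the "real" frames and that frames are evenly paced:

```python
# Rough sketch, assuming input is only sampled on "real" frames and
# frame pacing is even; all numbers here are hypothetical.
def input_interval_ms(real_fps: float) -> float:
    """Milliseconds between frames that can register player input."""
    return 1000.0 / real_fps

# 20 real fps with 3 generated frames in between would show 80 fps on
# screen, but new inputs would still only land every 50 ms.
print(input_interval_ms(20))  # 50.0
print(input_interval_ms(60))  # ~16.7
```

The on-screen frame rate would look high, but the game would still respond at the real-frame cadence, which is why some here expect it to feel like stutter.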
When Scott Herkleman goes 5 minutes without lying.
FSR improvements are great for the general consumer on PC because they work on all cards, but I guess AMD frame generation being locked to their cards was bound to happen?
Yeah, 70-80% of frame screen time displayed without your input is going to feel weird.
I honestly don’t think AMD would be naive enough to use 4 “fake frames” after 1 traditionally rendered frame.
Without clarification, or really anything from AMD, I think it’s more likely to assume they’re generating 4 frames and combining/resolving them to insert as a singular fake frame for potentially better accuracy and less artifacting.
You're glad AMD has taken inspiration from Nvidia's exclusivity practices.
It's paying off for Nvidia, isn't it? AMD needs a selling point, and honestly I don't want Ampere users to have FSR3.
Unless they don't intend to open-source the code as they did with FSR 1 and 2, but that's probably baseless.
Honestly he loves Nvidia, he's all over the 4000 series reviews.
We'll see to what extent it will pay off for AMD; RDNA 2/3 don't have dedicated frame-generation hardware like Nvidia's OFA. But I do agree that they should never even think about including Nvidia cards in tech they develop. Just make an effort to pre-plan this and not just release a worse version of whatever Nvidia brings later.