It has minimal to no impact at all. There's nothing special to blame on VSR/DSR here. Input lag and worse frame times from lower FPS or introduced stutter are unrelated to network latency.
Have not noticed any. But VSR is not exactly virtual. AMD took the wrong approach and did not decouple the monitor's output resolution from the one reported to the OS, and therefore to the game. The monitor does the downscaling, which is not as good as if it were done by the GPU itself, so image quality is worse than it could be. Secondly, since it's the display that does it, it receives images at the higher resolution, which requires more bandwidth between the GPU and monitor and reduces the achievable refresh rate. That's not an acceptable solution. It's a tiny win paired with a medium and a high loss. Sacrificing both wings to move a step forward.
Nope, VSR is done on the GPU side. DSR uses shaders, while VSR uses the GPU's display controllers to do the downsampling. Neither of those solutions uses the panel's integrated scaler. Regarding quality, DSR has to use a Gaussian blur, which eliminates shimmering and moiré effects but also softens the output.
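The shimmering-vs-softness trade-off can be shown with a toy 1-D example (purely illustrative; the actual filter parameters NVIDIA uses for DSR aren't public). Decimating a fine alternating pattern without filtering collapses it to a false solid value, while blurring first averages it into a stable grey:

```python
import math

def gaussian_kernel(sigma=1.0):
    """Normalized 1-D Gaussian weights out to 3 sigma."""
    radius = int(3 * sigma)
    weights = [math.exp(-(i * i) / (2 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights], radius

def downsample_naive(signal, factor):
    # Drop samples with no filtering: fine detail aliases (shimmers in motion).
    return signal[::factor]

def downsample_gaussian(signal, factor, sigma=1.0):
    # Blur first, then decimate: kills the shimmer at the cost of sharpness.
    kernel, radius = gaussian_kernel(sigma)
    blurred = []
    for i in range(len(signal)):
        acc = 0.0
        for offset, w in zip(range(-radius, radius + 1), kernel):
            j = min(max(i + offset, 0), len(signal) - 1)  # clamp at edges
            acc += w * signal[j]
        blurred.append(acc)
    return blurred[::factor]

# A fine alternating pattern, like a distant one-pixel-wide fence.
fine = [1.0, 0.0] * 50
print(downsample_naive(fine, 2)[:5])     # [1.0, 1.0, 1.0, 1.0, 1.0] -> false solid line
print(downsample_gaussian(fine, 2)[:5])  # interior values settle near 0.5 -> smooth grey
```

With naive decimation the result flips between all-bright and all-dark depending on pixel phase, which is exactly the flicker you see in motion; the pre-blurred version stays stable.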
Thanks guys. I ask because when I return to my original resolution I find that I aim better (twitch shots with a sniper rifle) in FPS games, and I cannot figure out why. So my first thought was: perhaps latency? Or that my mouse cursor isn't scaling at the proper pointer speed due to the resolution. Not sure, but I had to ask what was going on nonetheless. I was scaling the desktop higher than my original screen size, if that's any help, but in game I would go back down to my original screen size. Except for Forza 7; I could play that game at 2160.
I used VSR on an ancient game, The Longest Journey, and it turned a crap-looking game into something acceptable. FPS was not much of an issue, especially since it's a point-and-click adventure. Not quite certain I would use it for anything remotely contemporary though, especially an online FPS.
As long as the FPS stays the same, the latency will be the same. I have measured VSR and GPU scaling with a high-speed camera in Crysis 3 at 60 FPS (59.990 to be precise). I took 10 shots for each scenario, and all of them came out with the same average: 42 ms. The mouse movement feels different because when the resolution changes, the visible mouse speed also changes. Plus, neither VSR nor DSR uses the monitor to downscale; it's all done on the GPU. The monitor only does aspect-ratio scaling if GPU scaling is not selected.
Good to know, thanks; that confirms what I've been experiencing. I'm not sure if the Pointer Speed setting in Win10 actually makes any difference when you change it from the middle to something lower with VSR in use.
You mean trying different VSR resolutions? Then yes. Mouse speed is tied to pixel density (pixels per area). When you tell Windows to work at 4K instead of 1080p, the pixel density becomes 4 times higher; if you tell it to work at 1440p instead, it becomes 1.77 times denser. The same goes for games, too. However, there is one exception: if Windows display scaling is active and set higher than 100%, Windows reports a different pixel density to convince every app to scale properly, and your mouse speed can change according to that, too. For example, 1080p with 150% scaling is reported as 1280x720 to apps, and to the mouse.
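The arithmetic behind those ratios can be sketched quickly (the numbers are the ones from the post; the function names are just for illustration, not any Windows API):

```python
# Pixel-density and reported-resolution arithmetic from the post above.

def density_ratio(base, target):
    """How many times more pixels the target resolution has than the base."""
    return (target[0] * target[1]) / (base[0] * base[1])

def reported_resolution(physical, scale):
    """Resolution Windows reports to apps under display scaling (1.5 = 150%)."""
    return (round(physical[0] / scale), round(physical[1] / scale))

print(density_ratio((1920, 1080), (3840, 2160)))  # 4.0 (4K vs 1080p)
print(density_ratio((1920, 1080), (2560, 1440)))  # ~1.78 (1440p vs 1080p)
print(reported_resolution((1920, 1080), 1.5))     # (1280, 720)
```

So the "1.77x" figure is really 16/9 (about 1.78): the same ratio in pixel count that 1440p has over 1080p.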