False. My methods are perfectly suited for real comparison. I have explained them multiple times, and they are not hard to understand at all. For example, a physically smaller screen can get away with lower fps, because in motion an object travels a smaller physical distance on screen per frame, even though it moves the same number of pixels when both screens in the comparison have the same resolution. That's like Android devices being historically locked to 30fps gaming: not many people would notice on such small screens. Only in recent years have some games gotten a 60fps mode, and only recently are we getting Android devices with 90Hz and now 120Hz screens. For a cellphone, 90/120Hz is overkill unless the person wants to use the device in a VR headset.

When I trigger motion which results in an object moving 2cm per frame on my screen at 240fps, that translates to 8cm per frame on a 60Hz screen, no matter how many fps the game can produce, as long as the screen size is equal.

And it is the same for image quality comparison between screens of the same size. When you have a 1440p screen and run whatever details you can while the game holds a stable 60fps, and then take a 4K screen of the same size with the same HW and use settings which get you the same 60fps... you have effectively removed the temporal part of the comparison and can perform a pure IQ comparison.

- - - -

What you described is only true in the case of unlimited power, and that's the whole problem: not even a 3090 has that. I made a joke about the power draw of the Suprim card, but that does not translate to gaming; the power simply is not there. To blow it out of proportion to demonstrate: choose DX12 + DXR with maximum IQ at 1080p, or maximum DX9c at 8K, or DX7 (no shaders) at 16K. Sure, maybe in 20 years when we have 32K screens, everything will be about tiny polygons and things will return to the times of simple TnL. But I doubt it.
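The per-frame displacement arithmetic from the first part can be sketched in a few lines (the 2cm/240fps figure is just the example number from above; function name and structure are my own):

```python
# An object moving at a fixed on-screen speed covers more physical distance
# per refresh on a lower-Hz screen of the same physical size.

def cm_per_frame(speed_cm_per_s: float, refresh_hz: float) -> float:
    """Physical distance the object travels between two consecutive refreshes."""
    return speed_cm_per_s / refresh_hz

# 2 cm per frame at 240 Hz implies an on-screen speed of 480 cm/s.
speed = 2 * 240

print(cm_per_frame(speed, 240))  # 2.0 cm per frame at 240 Hz
print(cm_per_frame(speed, 60))   # 8.0 cm per frame at 60 Hz
```

The same on-screen motion thus jumps four times as far between refreshes on the 60Hz screen, which is why frame rate matters more as physical screen size grows.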