No, that is far too simplified an assumption, because graphics quality evolves year by year, demanding ever more powerful GPUs. Framerate targets stay in the 60–144 FPS range, so for the most part that range is all that matters for a processor. Game developers pick a target FPS, say 60 or 144, and adapt image quality settings to the mainstream hardware available. With faster graphics cards comes better graphics: compare the first and the latest Tomb Raider and notice what happens. Do you still game at 2006-level graphics quality? CPU limitation is therefore less of a factor than GPU limitation; it's also the reason so many people still run an older CPU and play their games just fine. You're focusing on the 720p results a little too much, whereas 1440p paints a more realistic picture, because in the end any PC is GPU-limited in games 99% of the time.