No, you pretty much didn't get anything right. I wouldn't have bought the 1080 for 4K, because everyone knew it wasn't a good 4K card. When your best case is maybe 60 fps, you're not a 4K card. I don't know why you felt the need to reiterate that, but let's move on.

I bought a card that was on sale at its correct price, not the price blockchain mining pushed it to. Later there was a Black Friday sale on 4K monitors, so I got two (I know, overkill, but gaming isn't my main profession and they're useful for that). The problem is that, because of the way Windows handles resolution changes, I'm now forced either to play at 4K or to rearrange everything on the screen every time I close a game. So, to be clear: 4K gaming was never my aim; 4K desktop was.

Nvidia came out with a solution that fit their architecture better than the resolution scaling we already had. They were in a comfortable lead, and they spent a generation working on tensor cores and RT cores because that's the future they envisioned for the platform. AMD kept grasping at straws, and even now AIBs can't sell the cards for the prices they promised because they were already at a knife's edge on margins. The less we talk about the tech AMD implemented in their latest cards the better; it's embarrassing enough as it is, and I'm not here to bash AMD. I'm not particularly happy about the situation either.

Nvidia was always a closed-off sort of platform, and why shouldn't they be? They were the ones spending money on creating the implementation, so obviously they wouldn't just give it away. AMD came around to it later, but by that point open-sourcing it was the only way to get devs to use their implementation.

For me it's a simple equation. I want something that will last me the next 5 years, since I have other plans in the meantime:
- I don't need the latest graphics.
- I don't care as much about the "fake pixels" that you're so obsessed with.
- I want something that has the best chance of offering me a good experience not only now but later on.
- I'm definitely not interested in 8K or HDR or whatever idiotic thing they come up with in the future. Not unless Windows fixes its crap, or Linux actually becomes a serious contender as a desktop environment when it comes to media.

If I'm going to take a gamble on the long run between Nvidia and AMD, there's no question who I'd pick. Because one thing's for certain: the chance of Windows getting its poop in a group is a bit unrealistic.