Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.
Not even close, and resolution is the defining thing here.
lol, TechPowerUp. They're the ones who should be banned from here, not Gamers Nexus, for being Jensen's shills.
What does the Guru3D review say?
The difference is a whopping 1.2%.
You're obsessed with VRAM. Actual horsepower is way more important.
So why do you keep going on about it?
Because nobody buys a 3080 to game at sub-60 fps.
Both are important and should be balanced. Even Thanos said it.
If you have a 3080 and you're getting sub-60 fps, you're doing it wrong.
4K users get sub-60 fps. I watch @XenthorX's streams all the time, and he gets sub-60 with a 3090. It is what it is.
Agreed, but adding extra VRAM doesn't compensate for a lack of horsepower. I mean, hell, a 5700 XT with 4 GB of VRAM would easily outperform an RX 580 with 8 GB.
Well, that entirely depends on the settings.
I think you care about this far more than me.
Love how you use strawman arguments all the time. I don't drop below my 58 fps cap, so all your ramblings are just BS strawman arguments; nothing more, nothing less.
And when people buy a high-Hz display, it's to make use of it; otherwise they might as well have bought a 60 Hz display in the first place and saved themselves a lot of money. So yeah, you probably wasted your money on that 240 Hz display with that 5700 GPU...
No it doesn't.
Yes it is.
According to the Steam hardware survey, 3440x1440 is by far the least used resolution; better luck next time.
I'm the one who's clueless, yet you can't even get the specs right when given the monitor's model number...
DLSS 2.0 + RT makes the 6800 XT a no-go card... unless you're poor, of course...
Actually, now you've done it. Lovely projection.
But let's be honest: if you're maxing out new games, you're running under 60 fps at 4K. Saying otherwise is lying to everyone. And in a year, you'll be running new games at what, on average? 50? 45? 40?
Then there's me with my poor 240 Hz 1080p screen. If I drop under 160 fps, I still have a fluid framerate. Yeah, there's no 240 fps in new games. But that's the thing: I can afford to lose 70% of my fps over time and all games will still be playable.
1440p can lose 50% of its fps and games will still be playable.
But when you start out averaging around 60 fps, you're making sacrifices from day one. Either that, or you have a subpar experience and are lying about it.
I'm not saying 240 Hz is optimal; it isn't. Maybe when we get 480 Hz screens, I'll stop seeing a difference. And I'm not saying that 120/144/165 Hz is bad. Quite the contrary: for most people, 1440p at those framerates is optimal. There is hardware that can play games there without sacrifices, and it will be able to do so for quite some time.
It's 4K that never really got its framerate off the ground unless sacrifices were made.
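To put rough numbers on that headroom argument, here's a minimal sketch; the starting framerates (240 fps at 1080p, 120 fps at 1440p, 60 fps at 4K) are my own illustrative assumptions based on the figures above, not benchmark data:

```python
# Back-of-the-envelope headroom check: how far can each setup's
# starting framerate fall before it drops below a 60 fps floor?
# Starting fps values are illustrative assumptions, not benchmarks.

PLAYABLE_FPS = 60

scenarios = {
    "1080p": 240,
    "1440p": 120,
    "4K":    60,
}

for name, start_fps in scenarios.items():
    # Fraction of fps you can lose before hitting the playable floor.
    headroom = 1 - PLAYABLE_FPS / start_fps
    print(f"{name}: can lose {headroom:.0%} of fps "
          f"({start_fps} -> {PLAYABLE_FPS}) and stay playable")
```

Run it and you get 75%, 50%, and 0% respectively, which is the whole point: a 4K/60 setup has zero headroom from day one.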
Everyone knows it; every tech site shows the same data. Yet buyer's bias is buyer's bias. Mistakes happen, and people have to live with them.
Acknowledging it and learning from it doesn't hurt. Or you can live in denial.
No correction, but I intend to game at 4K60 with an 80/90-series card. I've already played a lot of games at 4K with a 2070 as it is.
Looking forward to doing it better, and with some titles I never really attempted due to hardware limitations.
Again with the strawman arguments. Look through the tests Hilbert did on the 3090: how many of the games averaged at or below 60 fps?
The vast majority of games run considerably faster than 60 fps at 4K maxed out on a 3090. The only exceptions are Valhalla and Flight Simulator 2020, which run at or slightly below 60 fps. But that obviously doesn't fit your narrative that NO GPU CAN RUN 4K!!!!!!111...
TechPowerUp, nice joke. Why don't you take your blinders off and look at some decent, unbiased reviews of these cards, like Hardware Unboxed and Gamers Nexus, and you'll see what reality actually looks like. It's you who may need the tissues then...
So buying a $600+ GPU makes someone poor? What about someone who can't afford to eat; what are they in your eyes? You must really have your head up your a**.
Also, time and time again over the last three years, people have been proven wrong by AMD, yet they still diss them. They said it was impossible to catch up to Nvidia; well, look, they just did. Now they say DLSS + RTX is FTW, and in 3-6 months this discussion will be moot, because AMD will catch up there too.
Radeon already has better raw rasterization performance, and when they add their DLSS alternative to the mix, even RT will be faster than, or at least on par with, Nvidia's offering. (Not to mention much better OC headroom and the performance that comes with it.)
How's this going to look one or two years from now? That some people complained because they didn't have the patience to wait a few months? What's a few months compared to years?
Soooo, we can argue, but we've got to be civil about it. It's been good so far, with a few... slips, but overall let's keep it like this.