Oh, I use a 240Hz monitor, so my card is basically always going at full blast in games, except for Japanese ones, which are frame locked out of spite because they hate their fans. Well, I recommend NOT ever using a 120Hz+ monitor, because once you do, there is no going back. Ever. 60 fps is often sickening for me to look at now, as in it can cause literal nausea, depending on the art style.
I've used 144Hz and 300Hz screens, along with doing 90-120Hz in VR regularly; going back to 60Hz wasn't an issue for me, but I have to use Vsync now in most cases just to make things look smooth.
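For anyone curious why the jump feels so drastic, the frame-time math is simple. A quick back-of-the-envelope sketch:

```python
# Frame time in milliseconds at a given refresh rate: 1000 ms / Hz.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 120, 144, 240, 300):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# 60 Hz holds each frame on screen ~16.67 ms; 240 Hz only ~4.17 ms,
# so motion is sampled four times as often.
```

That's also why Vsync at 60Hz feels like a bigger compromise once you're used to the shorter frame times.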
Yup, the Guru3D review confirms the 300W figure. AMD was smart to disguise their power consumption with their software reading; I haven't seen one AMD owner who's actually aware of how much power his GPU uses.
Would be funny if RDNA3 turned out to be another Vega, but I hope Nvidia is pressured to price Ada competitively, like they did with the 1080 Ti.
Nah, but for now it looks like RTX 4000 might be another Fermi though, lol. The RTX 4090 Ti will most likely be a 550W GPU, since the RTX 3090 Ti consumes 100W more than the non-Ti 3090. Seriously, they need to do something about this crazy power consumption. What next, 700W for RTX 5000? 1000W for RTX 6000? About 25-30% better performance every new gen is a poor excuse. Oh yeah, I don't believe those "twice the performance of the RTX 3090" claims. I heard this before, but in reality the RTX 3080 was only 20-25% faster than the RTX 2080 Super; it was 40-50% faster when using DLSS and RT, but not overall. I just hope this time the new-gen GPUs will really be at least 50% faster. Don't care which brand, as long as they stay under 400W.
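To put rough numbers on the "twice the performance" skepticism, here's how per-generation uplifts compound (the 25-30% figures are the ballpark uplifts mentioned above, not measured data):

```python
# Compound a hypothetical per-generation uplift to see how many gens
# of 25-30% it takes to match a claimed one-gen 2x jump.
def compound(uplift: float, gens: int) -> float:
    """Total speedup after `gens` generations at `uplift` each."""
    return (1.0 + uplift) ** gens

print(compound(0.25, 1))  # 1.25x after one gen at +25%
print(compound(0.30, 1))  # 1.3x after one gen at +30%
print(compound(0.25, 3))  # ~1.95x: three gens of +25% barely reach 2x
print(compound(0.30, 3))  # ~2.20x: three gens of +30% clear it
```

In other words, a true 2x in a single generation would be worth roughly three "normal" generations at once, which is why the claim deserves skepticism.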
Going by avg FPS from the TPU review, the 3080 is 60% faster than the 2080 at 4K; 20-25% faster is when you're CPU bottlenecked at 1080p, maybe. My 3090 is around 50% faster in rasterization than my old 2080 Ti when both are running at 260W at 4K. I've mostly just run my 3090 at 250-300W since I got it at launch; very few titles actually require all the grunt of a 3090.
You jelly? Yeah, I've been upgrading GPUs frequently for the past 20 years and still have enough money for a couple of lifetimes of GPUs. Still have my 2080 Ti, actually; too lazy to sell it.
Yeah, just rumors at this point, but that would be insane if true. I haven't seen a 2x jump in perf in one gen since forever.
I'll just go with anything that's at least 30% faster than the 3090 but won't eat more than 350-390W.
Let's not forget that not all cards can do that. I personally had to reduce the core clock to match the reduced voltage.
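A rough sketch of why the clock often has to come down with the voltage: dynamic power in CMOS scales roughly with f × V², and each chip only holds a given clock down to some minimum stable voltage. The clock and voltage figures below are made up for illustration:

```python
# Approximate dynamic power scaling: P ~ f * V^2 (capacitance folded
# into the baseline). Baseline is a hypothetical stock operating point.
def relative_power(freq_mhz: float, volts: float,
                   base_freq: float = 1900.0, base_volts: float = 1.05) -> float:
    """Power relative to a (hypothetical) stock 1900 MHz @ 1.05 V point."""
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

# Undervolt alone, if the silicon holds the clock: big savings.
print(f"1900 MHz @ 0.90 V -> {relative_power(1900, 0.90):.2f}x stock power")
# Weaker silicon: drop the clock too, to stay stable at the lower voltage.
print(f"1700 MHz @ 0.90 V -> {relative_power(1700, 0.90):.2f}x stock power")
```

Since voltage enters squared and frequency only linearly, a card that keeps its clocks at a lower voltage saves the most; one that also has to shed clocks still comes out well ahead on power.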
Lol, fun reading. What most forget: the mass market (you know, the one that actually makes the money for them) couldn't give a rat's ass about details like this. They can or can't afford a certain thing (no matter if it's funds-, cooling-, or power-related) and won't care much beyond that, the same way no one buying a (fuel-efficient) car is (financially) worried about a Lambo getting "5 mpg" (ignoring that a 918 Spyder is greener than a Prius in mpg and emissions, at almost 6x the output). Anyone in a forum like this should know the card won't run at full throttle all the time, so no, it won't use full TDP 24/7. Nor are people here new to things like V/G/FreeSync etc., so again, most here won't run at max TDP unless they want to.