Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 5, 2022.
Thought I'd check.... Tried 2 games on my 9900k
BF5 - 160W
Deathloop - 140W
So to me, it looks like most games use well over 100W.
That's some powerful cognitive dissonance there. That's like getting a 700HP V8 and dismissing how much gas it guzzles because "it's not that bad when you're cruising on the highway". Then what's the point of buying it if you're not intending to tap into its full potential?
If the only workloads you put the 13900K under have a reasonable power draw, you might as well just get a lower-end CPU, since you ought to see no noteworthy difference. If you insist there would be a difference then you better go save up for that $500 water loop.
Try the other way around.
You get the V8, and you only have the gas-consumption issue when you're doing 200 mph on the highway. The rest of the time it's fine.
Dude mine is nowhere close to what you say.
Just picked an average to quote here; more can be found on that website:
...what? Unless you're on the autobahn, you're not going to go 200 on the highway. If you were to go 200 without being a selfish reckless idiot, then you're actually taking advantage of the car's potential as intended, where perhaps (but maybe not actually) its fuel consumption isn't so much of a problem. Since the chances of that happening for the vast majority of people is pretty slim, my point still stands:
What's the point of paying extra for potential you will most likely never use? Seeing as gaming is your example, you'd be far better off just dumping that money into a better GPU, a better/quieter cooling system, or more storage. With a 13900K, you're either burning money on excess performance (remember: you also need to spend extra on cooling, PSU, and the motherboard) or you're burning yourself by sharing the same room as it.
A 13700F ought to be a bargain in comparison and have a negligible impact on gaming performance.
It's sad to see people having issues with something I've never had any trouble with. It always makes me think that I'm just lucky. Though I've had more bad luck with Intel than with AMD.
"Just work"? I don't believe there is a single PC component maker whose products always just work.
5 gaming rigs in the house as my whole family games including my wife. Mix of AMD/INTEL/Nvidia, all work majority of the time with different configs and components. My rig is all AMD 5900x/6800XT and works great. After years of Intel/Nvidia, I will probably stay AMD for the next upgrade as well. Don't care about RTX or DLSS as I mostly play CoD/BF and get over 200 fps in native resolutions.
Over 200fps in BF 2042 in 1080p?
Show me and learn me
Maybe I need an AMD GPU with my 5950X?
There is no point, except for cinebench fans.
That was also my point, and I'm trying hard to justify my next upgrade.
If you look at average game performance for the 12th-gen series, you'll see a very small difference among all the CPUs, especially at higher resolutions.
My initial response was for those who keep quoting the max wattage under heavy load/stress, which in gaming or typical scenarios is not a problem.
By that logic I'd never buy a 12600K because it reaches 200W+ in Prime95, while the chip sits comfortably under 100W in most tasks and is also easy to cool.
I saw somebody comment on Putin the other day; you could easily delete PUTIN and insert INTEL:
"I don't believe what they say, I only believe what they do."
There is a point - while I take CB results with a grain of salt in terms of how good a CPU is, it does put the CPU under a realistic all-core stress load. Some all-core workloads will run cooler, others may run hotter. CB isn't running unrealistic calculations, so it's a good metric for how hot your CPU is likely going to get under 100% load. If this was Prime95 or stress-ng, I'd say "this is a stupid comparison" since that's not a realistic stress load and is purpose built to make the CPU run way hotter than normal.
Like you said, the game performance across the 12K series is a small difference. You're right that it wouldn't get that hot in gaming, but why get it if that's all you're using it for? It doesn't make sense to buy a CPU knowing >33% of its potential will go to waste. You said "Do you need 2x radiators and custom $500 cooling plus else to game?" but you will need all of that if you intend to do anything more demanding than gaming.
So, if you're trying to justify your next upgrade and gaming is your top priority, I'd say the 13900K is not worthwhile.
People will quote whatever number fits their story. Those who cite 300W stress-test power consumption numbers for Intel will only use gaming numbers for AMD. You made a good point about most CPUs using sub-100W for gaming these days. The only exceptions would indeed be the 11900K and 12900K, but they're bad value compared to the 11700 and 12700 to begin with.
Ahahaha, I'm not a Cinebench gamer, but I do have apps that use x threads per process, so pushing the CPU to 50-100% happens more often than for "just a gamer", and I found Cinebench is relevant for my use. That said... recent ASUS motherboards' auto-overclocking feature asks you to do Cinebench runs, and AMD's ClockTuner for Ryzen tool needs them too to find your best settings, so that annoying Cinebench trend has become almost inescapable if you tweak your gaming PC.
And my overkill build is mostly used to run my over-modded Skyrim, which is relatable to no one except a very small niche.
have a nice week peeps don't get angry over pc hardware we will all get something for our needs in 21 days
wild alpine ibex say hi
You can keep your BF2042.
BF1, 5 years ago ran 180 FPS with a Titan xp.
BF5 hits 200 with a 12900k and 3090
Bf 5 running 400+ fps with 12900k with 7000c30 ddr5 and 3090
Why did you quote me? Do you love me that much? <3
I quoted you because you keep moving the goalposts to fit your narrative, and to provoke people and call them noobs.
BF2042 is the only game in the series that runs like crap, and it's even simpler in detail than the previous games.
Why would you waste time and effort with this game?
EA pretty much needs to shut down the older games, to get more players to join 2042.
Gaming is my top priority and I'd say it is worth it if you want the best performance. I'll likely throw in a 13900K when it's out
Good post, and of course you didn't receive a single like and no one quoted you. Inconvenient truth. AMD fanboys always try to sweep such data under the rug. Instead they fixate on rumors and extreme LN2 overclocking, 350W/450W/500W Prime95 synthetic insanity, and keep repeating it like a broken record.
I double checked. Previously I had only checked the max power use, but this time I ran BF5 for 5 minutes. The average is ~120W with spikes over 140W.
My 280mm AIO with 4 14cm fans has to work hard to keep the CPU cool.
Trackmania Turbo only uses 50W.
I don't have FC6 but FC5 averages 85W.
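For anyone wondering where these in-game wattage averages come from: monitoring tools derive them from the CPU's cumulative energy counter. Here's a minimal Python sketch of the averaging, assuming Intel's RAPL interface on Linux (the sysfs path `/sys/class/powercap/intel-rapl:0/energy_uj` and the wraparound range below are illustrative, not tied to anything in this thread):

```python
# Minimal sketch: average CPU package power from Intel RAPL energy readings.
# On Linux the cumulative counter lives at
# /sys/class/powercap/intel-rapl:0/energy_uj (microjoules) and wraps at the
# value in max_energy_range_uj -- adjust paths/values for your system.

def avg_watts(energy_uj_samples, interval_s, max_range_uj):
    """Average power in watts over equally spaced energy-counter samples."""
    total_uj = 0
    for prev, cur in zip(energy_uj_samples, energy_uj_samples[1:]):
        delta = cur - prev
        if delta < 0:            # counter wrapped around its max range
            delta += max_range_uj
        total_uj += delta
    elapsed_s = interval_s * (len(energy_uj_samples) - 1)
    return total_uj / 1e6 / elapsed_s   # microjoules -> joules -> watts

# Two samples taken 1 s apart, 120 J consumed -> 120 W average.
print(avg_watts([0, 120_000_000], 1.0, 262_143_328_850))  # -> 120.0
```

Windows tools like HWiNFO report "CPU Package Power" from the same counters, which is why a 5-minute average can sit around 120W while individual spikes go well past 140W.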