This review doesn't show the full picture; find a review with a 4090, that's more like it. The 4K benchmarks are all plus or minus identical across different CPUs here, and that's due to a GPU bottleneck. With a 4090, the 13900K shines even brighter, especially at 6 GHz with the 4090 on a water loop. TVB?
Testing at lower resolutions does the same thing. I'm not saying the 13900K isn't the fastest CPU out; it is. But I think for most people the TCO of the AMD platform just makes it more valuable - unless you can afford to upgrade motherboard/CPU every gen. In which case, yeah, always buy the fastest - which in this case is Intel. Then just swap to the X3D when that launches, because it will probably beat this. Then rinse & repeat.
It's an ADL refresh and it's plenty fast, plus it's the first CPU that can do 6 GHz all-core with a few BIOS changes - but it doesn't need to. It matches the already plenty-fast 12900K at 90 W, so you can do a mini build with this CPU and a tiny heatsink and get the same performance as a 12900K with an AIO. What more do you need? Zen 4 is not a refresh of Zen 3, and it fails to win over a refresh of ADL that just got a node shrink plus a tad more cache.
Reading 1080p 500 fps vs 4K 160 fps is a different feeling, IMO.

I honestly don't care about keeping the same mobo for longer than two gens; IMHO it's an overrated feature. Look at AMD: X370 and X470 had PCIe 2.0 from the chipset, so you couldn't even install more than one Gen 3.0 M.2 drive, and they lacked USB 10 Gbps and 20 Gbps, 2.5 Gb Ethernet, and other features. When X570 came out it was a must-have upgrade: 10 Gbps USB, 2.5 G Ethernet, everything Gen 4.0, and so on - the old mobos looked sad. (I had an MSI Creation X570 with a 5950X; that mobo had more than 10 USB ports on the back, I think all USB 10 Gbps plus two USB 20 Gbps ports, plus 10 Gb Ethernet and so on.) Let's not forget the BIOS issues AMD had, where some CPUs were supported and others weren't.

The same will happen with X670: new must-have stuff will come out, like maybe full PCIe 5.0 from both chipset and CPU, USB 4.0, maybe we finally move to 5 Gb Ethernet, maybe more PCIe lanes for M.2 and add-in cards, and so on. With Intel, I'm sure the next chipset will have USB 4.0 and finally PCIe 5.0 on the chipset, so next year it will be worth upgrading my Z690 (right now I don't need Z790 - nothing new).
I'm amazed at the positive spin some people put on this CPU - it's a travesty! With such power consumption, who would even consider it for gaming over AMD when it doesn't even win all the time? For other programs it doesn't always win either, nor is it that much faster, though for some individual programs I can see the appeal. The next screenshot from Hardware Unboxed's video review paints the whole picture:
When it comes to CPU sales, does anybody have any idea what % are sold to gamers and what % to every other user? If gamers account for a lesser percentage, that would explain AMD's strategy of releasing the X3D variant later. It's common on a specialist forum for everybody to think that an industry in general revolves around that specialist thing. I wouldn't be surprised if sales to gamers account for a small portion of all CPUs sold globally.
Buy AMD if that makes you happy. Buy Intel if that makes you happy. I don't care what you are buying... You actually failed... The 13900K is a monster in gaming and beats the 7950X by far in fps per watt... Look at der8auer's newest YouTube video... I have both a 7950X and a 13900K for testing, and the 7950X is now unplugged. I find no reason to use it when the 13900K is around. Too bad some people put color and name before knowledge...
Never said it's not fast. But I don't see it as anything special; it's good. I just wouldn't necessarily buy it as a DDR5 platform over Zen 4, even if it's a bit faster, at the expense of even higher power consumption and heat than Zen 4. As a platform I don't see the point if you're upgrading the whole system. Anyway, it's still not anything amazing. It's good, sure.
A 13900K and a 4090 paired together are close to 800 W under load - and that's just for the GPU and CPU by themselves.
Interesting - from the reviews I've read so far, the 13900K is behind in perf per watt. Then again, the 13600K and 5800X3D are easily better in perf per watt, and even the 7700X. I did watch the der8auer video and it didn't change much. For sure, in some games like BF2042 and Far Cry, AMD fell badly behind, but otherwise the 7950X at 90 W was at times faster. All in all, I feel it comes down to whatever one wants to get. If you don't go fiddling much, then AMD has the upper hand in perf per watt for sure, and the 5800X3D is a black sheep, though it does lose to both in lows.
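To make the "perf per watt" comparisons above concrete: it's just average fps divided by CPU package power. A minimal sketch, where all fps and wattage figures are hypothetical placeholders for illustration, not numbers from any review:

```python
# Perf per watt = average fps / CPU package power (W).
# All numbers below are made-up placeholders, not review data.
def perf_per_watt(avg_fps: float, package_watts: float) -> float:
    """Frames per second delivered per watt of CPU package power."""
    return avg_fps / package_watts

cpus = {
    "CPU A (fast, high power)": (220.0, 250.0),  # (avg fps, package W)
    "CPU B (slower, efficient)": (200.0, 90.0),
}

for name, (fps, watts) in cpus.items():
    print(f"{name}: {perf_per_watt(fps, watts):.2f} fps/W")
```

Note how the slightly slower but far lower-power chip wins this metric easily - which is why a capped-wattage result can flip the efficiency ranking even when the absolute fps ranking stays the same.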
They are never going to run at full power at the same time, unless you deliberately run CPU rendering in the background while gaming. If you look at 13600K gaming performance, it is very close to the 13900K - maybe only a core-speed difference - indicating that half of the 13900K is sleeping when gaming anyway.
Running a 13900K "max OC" on water and a 4090 with no power limit, I'm using 760 W from the wall. My 5950X and 3090 with no power limit draw 850 W from the wall in Battlefield 1.
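Worth remembering when comparing wall readings like these: the wall figure includes PSU conversion loss, so components draw less DC power than the meter shows. A rough sketch of the relationship, where the 90% efficiency and the component wattages are assumptions for illustration, not measured values:

```python
# AC wall draw = total DC component draw / PSU efficiency.
# The efficiency and component figures here are illustrative assumptions.
def wall_draw(component_watts: float, psu_efficiency: float = 0.90) -> float:
    """Estimate AC power at the wall from total DC component power."""
    return component_watts / psu_efficiency

# e.g. CPU 250 W + GPU 450 W + rest of system 50 W (all assumed)
dc_total = 250 + 450 + 50
print(f"Estimated wall draw: {wall_draw(dc_total):.0f} W")
```

So two systems with the same component totals can still show different wall numbers if their PSUs sit at different points on their efficiency curves.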
Everything else I've read today shows that the 13900K's max power draw is at least 100 W higher than the 7950X's max draw! My God... well, the really great thing about the 13900K is that we should never again hear anything detrimental about the 7950X's 230 W power draw (TDP & SP)! I don't see why anyone should be surprised, given the inferior CPU manufacturing process Intel has to use at the moment. It didn't surprise me. I can't agree with HH that this is "somewhat similar" to what we saw with the 7950X - 100 W+ higher is not "similar" at all... I just don't see it. Rather, it should put things into perspective for most people, I think.
It's dumb but funny. I feel like the Dunia engine and Frostbite do not like AMD CPUs. My wild guess is latency, or just something hardcoded / how it's threaded.
Far Cry 6 also runs fairly badly on the 13900K; the lows are much worse than the 7700X's lows in Gamers Nexus' test. So it does not run worse on AMD than on Intel - it runs worse on more-than-8-core CPUs in general, no matter the brand.
That is nice. I am guessing it's the 3090 vs 4090 that makes such a huge difference, since the 5950X should surely be using less. Of course there are other things, like the rest of the PC - HDDs and such - but that shouldn't make a huge difference.
The question is: is that much power really necessary for gaming? I think it's a true waste of money; you could spend the extra cash on the GPU instead.