Intel Core i9-12900KS just 9% faster in Cinebench R23 Multi-Core compared to 5950X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 18, 2022.

Thread Status:
Not open for further replies.
  1. user1

    user1 Ancient Guru

    Messages:
    2,805
    Likes Received:
    1,324
    GPU:
    Mi25/IGP
    Intel's new power-gating tech + thread-director tech is definitely impressive (more so for AVX-512, ironically). However, I feel it's not quite as impressive when you consider that Alder Lake launched a year later, with a brand-new socket, brand-new memory tech, and a monolithic die, all of which saves them a considerable amount of power from the start. In this case, IMO it's more impressive that AMD's product is able to compete as well as it does despite being 'old'. Normally when you see a new product, it's "new product is better than old" in a more dramatic fashion.

    As for "playing Cinebench": everyone always wants to see what products/things/people do when they are pushed to their limits rather than when they are lightly loaded, even if the light load is the more relevant workload. A quirk of human psychology, I think.
     
    Agonist likes this.
  2. Agonist

    Agonist Ancient Guru

    Messages:
    4,291
    Likes Received:
    1,321
    GPU:
    XFX 7900xtx Black
    They were not slow in games for their actual price point. They did exactly what they were designed for. And people expected first-gen Ryzen to defeat Intel out of the gate, and those people were stupid.

    And your breakdown and understanding of what Aura89 said is baffling. You are stuck in left field while the rest of us are in center field.
     
  3. nizzen

    nizzen Ancient Guru

    Messages:
    2,422
    Likes Received:
    1,159
    GPU:
    3x3090/3060ti/2080t
    50% slower in minimum fps compared to the 8700K in CPU-bound games. That is BAD and actually very slow for its price point. PS: I had a 1700X/1800X, 8700K, and 9900K at the same time. Gaming performance was sad compared to the 8700K with fast DDR4. But if you compared the 1800X with fast memory against the 8700K with slow memory, the difference was smaller. Too bad for the Ryzen 1000 series that 3200 MHz was pretty much the max; the 8700K was running 4700 MHz memory :D
     
  4. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    I'm saying that if you're not using the full CPU for your tasks, you're not getting the entire picture.

    Find me a game that'll use Intel's or AMD's maximum number of cores and threads, then get back to me.

    Until then, claiming that a CPU is not inefficient because "no one uses the CPU at its max" is a bogus claim.

    And last I checked, this is a hardware forum, not a pure gaming forum, so why you keep saying "play Cinebench" is beyond me. The main purpose of the 12900K, KS, 5900X, 5950X, and countless other many-core CPUs is not gaming. That doesn't mean you can't buy one for gaming, but your assertion that people "play Cinebench" is pure nonsense made to disregard the performance and efficiency benefits of CPUs that provide more than what gaming needs.

    If you're not that guy, aka someone who uses their PC for more than just gaming, i.e. professional work, stop pretending to be that guy.

    lol... the random percentages and numbers you keep bringing up are amazing.

    Please show your work.

    Seems to me you just want to make claim after claim after claim without anything to actually back you up, hoping that everyone just reads what you write and doesn't question it.
     
    Last edited: Mar 18, 2022

  5. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,049
    Likes Received:
    4,432
    GPU:
    Asrock 7700XT
    Ugh... literally any piece of hardware can be the bottleneck if you cripple something else. Get real here. Most modern CPUs can yield framerates that even 240 Hz displays can't keep up with, and in most cases you don't really gain anything by going higher than 144 Hz. You can see a difference, but very few people would actually play better as a result of that difference.
    So ultimately, no, modern CPUs are not practical bottlenecks in gaming. An i3-7100 can play CS:GO at over 250 FPS at 1080p ultra settings using a 3070, and performance steadily declines as you worsen the GPU, implying even that mediocre CPU is still not maxed out. Remember, that's a 5-year-old locked CPU with specs that were mediocre even for its time. Games have not evolved that much in CPU requirements. As stated before, your 12900K uses less power because it isn't working that hard, so you've kinda proven my point about how it's unnecessary for gaming purposes. So, if it's overkill for gaming and too power-hungry for productivity, then what makes it a good choice?

    A 12600K is all anyone really needs to play just about any modern game.
     
    Ivrogne and Venix like this.
  6. Falkentyne

    Falkentyne Master Guru

    Messages:
    544
    Likes Received:
    79
    GPU:
    Sapphire HD 7970 Ghz Ed.
    A 12900KS seems to be averaging 5.4 GHz in R23 with a 360 AIO. Only golden 12900Ks will do that, and a 12900KS is a binned 12900K WITHOUT the possibility of AVX-512. It's not a "new" chip.
    But so far the bins look like +100 to +200 MHz better, or roughly 100 mV lower voltage at the same clocks. So no, you can't just overclock a 12900K and get a KS; not all Ks are even good samples.
    There were weak 9900Ks that wouldn't even do 5 GHz on all cores, so the same argument applied there too.

    If you're rich, go ahead and buy a 12900KS no matter what you have. Money isn't so critical to rich people; enjoy messing with it for the fun and e-peen.
    If you're a normal person and don't have a Z690 yet, it makes decent sense to pay the premium for a 12900KS (versus a regular 12900K, especially since good samples are getting binned now) plus a motherboard and upgrade all at once. You'll get a good chip that clocks well, or one that can run at lower frequencies at absurdly low vcore (5 GHz all-core at 1.0 V may be possible in R23).
    If you're a normal person and already have a 12900K, wait for a 13900K: you'll get the same good KS bins on the P-cores plus double the E-cores and possibly a better IMC.
     
  7. Venix

    Venix Ancient Guru

    Messages:
    3,487
    Likes Received:
    1,982
    GPU:
    Rtx 4070 super
    @nizzen I get that you get a lot of hardware to play with, tweak, and push to the limits, which is awesome!
    But the way you present things rubs people the wrong way. You call Ryzen 1 bad at gaming; yes, Intel's 8th and 9th gen could squeeze out more fps, but that does not mean Ryzen 1 cannot game...

    What you describe about "every game being CPU-bound" translates to Rainbow Six running at 600 instead of 200 fps. Sure, that does mean lower latency, but we are talking values of 2-3 milliseconds, and unless you are a 1337 godlike MLG player you cannot take advantage of that either. Ultimately, your mocking of people who play Cinebench all day is not wrong; the resulting number has no actual use other than some vague comparison. On the other hand, playing at 400 fps instead of 200 is just as useless a result beyond the satisfaction that your tweaking gave you some extra fps. The practicality of measuring fps is LITERALLY the same as Cinebench!
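    To put rough numbers on that latency point: frame time is just 1000 / fps. Here is a quick Python sketch of the arithmetic, using the example framerates above (nothing here is measured):

        # Frame time in milliseconds at a given framerate.
        def frametime_ms(fps: float) -> float:
            return 1000.0 / fps

        print(frametime_ms(200))                      # 5.0 ms per frame
        print(frametime_ms(600))                      # ~1.67 ms per frame
        print(frametime_ms(200) - frametime_ms(600))  # ~3.33 ms saved per frame

    So even tripling the framerate from 200 to 600 fps shaves only about 3 ms off each frame, which is the point being made here.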
     
    Ivrogne likes this.
  8. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,585
    Likes Received:
    1,912
    GPU:
    7800 XT Hellhound
    You'd probably drop down to 35 fps with a 2600X in Watch Dogs Legion in some places. And 250 fps in CS:GO? More like 50 fps on the Danger Zone Sirocco map. The worst cases count for much more than useless fair-weather benchmarks. Zen 1/+ was just bad for gaming vs. Skylake, and even Zen 2 still required well-optimized RAM/IF for some of the more thread-heavy games. Really can't say nizzen would be wrong...
     
    Airbud likes this.
  9. TLD LARS

    TLD LARS Master Guru

    Messages:
    791
    Likes Received:
    372
    GPU:
    AMD 6900XT
    No.
    At 1440p:
    Apex Legends, Vsync off: 200-300 fps, 6900 XT at 100% load, Ryzen 1700 @ 3700 MHz at 40-50% load.
    Cyberpunk 2077: 70-110 fps, 6900 XT at 100% load, Ryzen 1700 @ 3700 MHz at 40-50% load.

    The more modern the game, the better it spreads CPU load across more cores, so it is the opposite: old games need faster CPU boost clocks because they do not spread the work out as well as new games do.
    Look at DOOM Eternal, maybe the best engine ever made (until maybe Unreal 5 takes over). With DOOM, just use anything from a 5600 or 12400 and up, and it would be difficult to spot any difference in a blind test.
    I tend to just let older games hit 1440p 165 fps (monitor max) and enjoy the 700 rpm GPU fan silence instead.

    I am sure the lower CPU overhead on an AMD GPU helps me a lot, though.
     
    Ivrogne likes this.
  10. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    You guys' memory of Ryzen 1st gen is weirdly misremembered, like... intentionally misremembered.

    But hey, prove me wrong. Or just stop making crazy assertions, one or the other.

    P.S. I have a Ryzen 1600 on a B350 motherboard with dual-channel DDR4-2933 RAM and an RTX 3070, running on a 1440p 144 Hz monitor, and I can tell you right here and now, factually, that you are not going to be able to "prove me wrong", because nothing you've said holds water.

    And before the "Well, that's 1440p!": um... that's not how that works. If the numbers are already massively over what you are claiming at 1440p, it would only get worse for you at lower resolutions. And if you want to argue the other way, 4K, well, you're digging yourself an even deeper hole, since the CPU barely matters at that resolution.
     
    Last edited: Mar 19, 2022
    The Reeferman likes this.

  11. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,585
    Likes Received:
    1,912
    GPU:
    7800 XT Hellhound
    Or you are just too lazy to click the first YT video that comes up when searching for "watch dogs legion ryzen 2600". It runs like crap:

    [embedded YouTube video]

    With RT on, it gets even worse.

    The 8700K with a high RAM clock kills it, no matter how much you don't want it to be true.
     
    nizzen and Airbud like this.
  12. NCC1701D

    NCC1701D Master Guru

    Messages:
    269
    Likes Received:
    172
    GPU:
    RTX 4090 FE
    I wouldn't say I love it. It was my first time building with an AMD CPU, and it was decent, but nothing super special. I'm holding out to see how the 5800X3D turns out and whether they'll actually produce enough of them to be obtainable. The new Intel series is pretty legit, though. I'd consider upgrading to it, but I think I'm going to wring some more life out of this X570 board, probably with a 5900X or 5800X(3D). Happy that Intel got over that 14nm hump, though. Lots of good options on both teams right now.
     
  13. Airbud

    Airbud Ancient Guru

    Messages:
    2,607
    Likes Received:
    4,133
    GPU:
    XFX RX 5600XT
    So True...:p

    [attached image: Bench1.png]
     
  14. Krizby

    Krizby Ancient Guru

    Messages:
    3,150
    Likes Received:
    1,840
    GPU:
    Asus RTX 4090 TUF
    Yup, I built a PC for my friend back in 2017 with a Ryzen 1700 + 32GB RAM + 1080 Ti (quad-rank only runs 2666 MT/s max), and it was slower than my 8700K + 32GB RAM + 1080 Ti (quad-rank at 3600 MT/s, easy) by 30-40% in PUBG, even at 1440p.
     
    aufkrawall2 likes this.
  15. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Did you really just cite a YouTube review pairing a 2060 with a 2600 and then try to claim it gets worse with RT?

    Sorry bud, as I said, you're fighting a losing battle. I already know what a 1600 with an RTX 3070 can do in Watch Dogs Legion at 1440p, and it doesn't come close to backing your assertion that this is a 1st-gen Ryzen problem.

    Show me a review that isn't really reviewing the RTX 2060 while claiming it's the 2600 being reviewed. Because yes, an RTX 2060, especially with ray tracing, is not that great of a card for Watch Dogs Legion.

    Otherwise, let's just pair an Intel 12900KS with a GT 710 in Watch Dogs Legion and claim the 12900KS is the worst CPU for the game ever, if that's the way we're going to go about it.
     
    Last edited: Mar 19, 2022
    Venix likes this.

  16. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,585
    Likes Received:
    1,912
    GPU:
    7800 XT Hellhound
    We really don't need to continue talking when you can't make sense of 80% GPU utilization.
     
    Airbud likes this.
  17. Airbud

    Airbud Ancient Guru

    Messages:
    2,607
    Likes Received:
    4,133
    GPU:
    XFX RX 5600XT
    Bottleneck somewhere...?

    In the context of a PC, a bottleneck refers to a component that limits the potential of other hardware due to differences in the maximum capabilities of the two components. A bottleneck isn't necessarily caused by the quality or age of components, but rather their performance.
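    A toy way to picture that definition in code (a minimal sketch; the throughput numbers are invented for illustration, not measurements):

        # Toy bottleneck model: each component can deliver frames at some maximum
        # rate, and the framerate you actually see is capped by the slowest link.
        def delivered_fps(cpu_max_fps: float, gpu_max_fps: float) -> float:
            return min(cpu_max_fps, gpu_max_fps)

        # Hypothetical pairing: a CPU good for 300 fps with a GPU good for 90 fps
        # is GPU-bottlenecked, no matter how fast the CPU is.
        print(delivered_fps(cpu_max_fps=300.0, gpu_max_fps=90.0))  # 90.0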

    :D
     
  18. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    The difference between what you are saying/relying on and what I am saying/experiencing is that you're relying on some YouTube video, whereas I have the actual hardware I'm talking about. So, here we go:

    Bear in mind: I do not have a 2600, I have a 1600, and I am not overclocking the 1600 in any way. I do not know if his 2600 is running 3200 MHz memory, though I suspect it is; mine is not.

    I ran every test with exactly the same settings as he did, but I only tested the no-DLSS/no-ray-tracing and DLSS/ray-tracing configurations.

    No DLSS/ray tracing, from the YouTube video with the 2600 OC'd to 4.0 GHz and a 2060:

    [screenshot]

    No DLSS/ray tracing, from my PC with the 1600 (no OC), an RTX 3070, and 2933 MHz RAM:

    [screenshot]

    DLSS/ray tracing enabled at exactly the same settings as the video:

    DLSS/ray tracing, from the YouTube video with the 2600 OC'd to 4.0 GHz and a 2060:

    [screenshot]

    DLSS/ray tracing, from my PC with the 1600 (no OC), an RTX 3070, and 2933 MHz RAM:

    [screenshot]

    Now, especially there, there's not a huge difference in max or average, but a big difference in minimum, and that's from a lower-end processor that isn't OC'd. And look at that: the GPU isn't loaded to 100%.

    Now, just so we are clear, I'm not saying Ryzen 1000/2000 was the best for gaming; I never said that. Nor did I say the 8700K was worse off. I very clearly stated that it's not as bad as you guys seem to want to remember it being.

    I see post after post about the 8700K having performance issues as well, with pictures showing 100% CPU utilization at 1440p too. Watch Dogs Legion is not really a great example.

    Heck, in each of the pictures I provided from my computer except the last, the GPU usage wasn't 100%, and yet I tested every single DLSS setting and saw ZERO difference in FPS, ray tracing on or not, so I'm pretty certain the CPU is bottlenecking it. And even so, it's still getting better performance than the YouTube video you provided, and better performance than your original statement claimed.
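    That reasoning can be turned into a rough rule of thumb (a sketch only; the 90% and 5% thresholds here are arbitrary assumptions, not from any real tool):

        # Heuristic: if GPU utilization stays well below 100% and fps barely moves
        # as the render load changes (e.g. stepping through DLSS quality levels),
        # the CPU is the likely limiter.
        def looks_cpu_bound(gpu_util_pct: float, fps_by_setting: list[float]) -> bool:
            spread = (max(fps_by_setting) - min(fps_by_setting)) / max(fps_by_setting)
            return gpu_util_pct < 90.0 and spread < 0.05

        # Matches the observation above: GPU well under full utilization, fps flat
        # across every DLSS setting tried (example numbers, not measurements).
        print(looks_cpu_bound(80.0, [72.0, 71.5, 72.3]))  # True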

    And then you threw in CS:GO and 50 FPS on Danger Zone Sirocco. These are on the RTX 3070 and 1600, same as before:

    [screenshots]

    Max CS:GO settings, 1440p. It never once got anywhere close to 50 fps. Sometimes, rarely, I saw it hit somewhere in the 90s. I walked all around; heck, I was even running the server.

    Again, and again, making this super clear: at no point am I saying the 1000/2000 series were amazing gaming CPUs, or that they were better than their competitors. My 5900X with my 3080 and 3200 MHz RAM gets nearly double this performance in CS:GO. How much of that comes from the increase in RAM speed, the RTX 3070 vs. 3080, or the 1600 vs. 5900X, who knows, lol.

    All I'm saying is that it's not as bad as you and nizzen seem to remember it being, especially compared to the competition of its time.

    Just for fun, I tested the RTX 3080 and 5900X in Watch Dogs Legion as well, and the results are... interesting. This is compared to the very first two pictures, with no DLSS/ray tracing and all settings the same as the YouTuber's:

    [screenshot]

    For one, the 203 FPS max is bogus; who knows what happened with that spike. But also, 99% GPU load even though the CPU is much faster.

    There's only a 10 FPS increase in minimum, but a good increase in average. How much of that is the CPU vs. the GPU, etc.?

    The more I look around and see others' results, the more I realize that Watch Dogs Legion is probably not a good example to ever give for performance differences between CPUs and GPUs; it's not reliable. Heck, even the VRAM usage is different on all of them despite identical settings: 5.41 GB on the YouTube 2600/2060, 4.56 GB on the 1600/3070, and 5.72 GB on the 5900X/3080... ????
     
    Last edited: Mar 19, 2022
    Venix, Airbud, The Reeferman and 2 others like this.
  19. Ivrogne

    Ivrogne Master Guru

    Messages:
    228
    Likes Received:
    217
    GPU:
    a GPU
    There are dozens of good games available, so why are you guys focusing so much on Watch Dogs Legion? I don't understand. It has been brought up many times, and not just in this thread.
     
  20. EspHack

    EspHack Ancient Guru

    Messages:
    2,802
    Likes Received:
    190
    GPU:
    ATI/HD5770/1GB
    I suspect the average KS buyer only needs a half-percent uplift to justify his decision.
     