Intel Core i9-11900K processor review

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 30, 2021.

  1. vestibule

    vestibule Master Guru

    Messages:
    445
    Likes Received:
    121
    GPU:
    GTX1070 Zotac mini
    They may well be cheap, but...?
     
  2. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    It is actually kind of an achievement: pushing so much current through the chip while still being able to honor the standard warranty.

    Back in the day with Bulldozer, I would have been inclined to think that their 5GHz meltdown version was just that: something bound to degrade.
    These Intel chips manage to survive an ordeal which can be summarized as "works as designed".

    While Intel's fab engineers have not hit their power-efficiency targets in a while, nor did they manage to keep clocks up on shrunken processes at first, they did build a pretty resilient manufacturing process.
    Since you are already running normalized 3.5GHz tests for IPC, would it be possible to add a 3.5GHz test which loads all cores and records both "score" and "power draw"?
    That could give a standardized power-efficiency figure for those who seek it, and it may show the differences between each generation/manufacturer.

    (I would personally prefer such testing at 4GHz, which is closer to standard operating clocks, but 3.5GHz is OK too, since it is extra work.)
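    To illustrate the kind of calculation I mean, a minimal sketch in Python; every score and wattage below is a made-up placeholder, not a measurement:

        # Hypothetical fixed-clock efficiency comparison. Each entry is
        # (all-core benchmark score at 3.5GHz, package power draw in watts);
        # all values are placeholders for illustration only.
        results = {
            "Gen A": (10000, 125.0),
            "Gen B": (11500, 160.0),
        }

        for chip, (score, watts) in results.items():
            # Points per watt at the same clock isolates architecture and
            # process efficiency from factory boost behavior.
            print(f"{chip}: {score / watts:.1f} points/W")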
     
    Last edited: Mar 31, 2021
  3. SesioNLive

    SesioNLive Member

    Messages:
    17
    Likes Received:
    8
    GPU:
    MSI GTX 1080
    While I'm nobody who gets much attention here, I do agree that @hilbert is obviously right on both points: it's his site with his articles, and how people put words together can get annoying.

    BUT I do have to agree with some users that the Guru3D badge "system" (if you can call it that) would indeed benefit from an overhaul, with some additional badges etc.

    I've also noticed on some product reviews here that the badge doesn't seem fitting and is just confusing.
     
  4. kapu

    kapu Ancient Guru

    Messages:
    4,748
    Likes Received:
    426
    GPU:
    Radeon 6800
    You seem very moved, which means there is something to the subject :) It's not just me. Channel that anger at a product that deserves a good bashing.

     

  5. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    1,823
    Likes Received:
    889
    GPU:
    107001070
    what HH said is true, talking to you is like having an argument with a middle schooler.

    you gotta understand this is not a YT comment section, and we like it that way.
     
    Last edited: Mar 31, 2021
  6. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    Rating systems are complex. The one I would really appreciate would be one that gives some value to people like me, but nothing to others.
    As such, I do not mind the badges being a representation of views; with any view, some agree and some don't. The informational value of a "badge" is supposed to be generalized and low in contrast to the sheer volume of information provided in the entire testing article.
     
  7. tty8k

    tty8k Master Guru

    Messages:
    437
    Likes Received:
    114
    GPU:
    Ati 5850
    We all have opinions based on personal experience, more or less biased.
    But you just turn into a red monkey with all the comments against Intel/Nvidia products.

    A "crap" product is something that's either broken or doesn't deliver what's on the spec sheet.
    This processor does deliver the performance; it's just the retail/shop price that makes it a poor choice vs the rest.
     
    cucaulay malkin likes this.
  8. bnauk

    bnauk Member

    Messages:
    10
    Likes Received:
    6
    GPU:
    RTX 3080
    I love Hilbert's reviews; his site is one of the first I always go to for them. "Recommended" can be read in different ways: certainly recommended if you're an Intel fan who has deep pockets and can handle the heat - but I feel the message going out to people really has to be "It's fine, but if you can buy a 5900X, do that". I imagine reading the review would give you that sense though.
     
  9. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    41,121
    Likes Received:
    9,375
    GPU:
    AMD | NVIDIA
    Hmm, I'd have to think about that. Normalizing frequency says something about the actual IPC, the performance of the architecture, but little about energy efficiency. You can measure it; you'll get the nominal wattage needed for 3500 MHz on all cores. However, in the end, IPC x frequency (at whatever voltage that clock requires) is your performance and your relevant energy consumption, and that is what really matters.
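    As a rough back-of-the-envelope illustration (all numbers here are hypothetical):

        # Performance scales with IPC x frequency, while dynamic power follows
        # the classic C x V^2 x f estimate, so the voltage needed to hold a
        # clock matters as much as the clock itself. Hypothetical values only.
        ipc = 1.15         # relative instructions per clock
        freq_ghz = 3.5     # normalized all-core clock
        voltage = 1.10     # core voltage required at that clock
        cap = 4.0          # lumped switching-capacitance constant (arbitrary)

        performance = ipc * freq_ghz
        dynamic_power = cap * voltage**2 * freq_ghz

        print(f"relative performance: {performance:.2f}")
        print(f"relative power: {dynamic_power:.2f}")
        print(f"perf/W: {performance / dynamic_power:.3f}")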
     
    Fox2232 likes this.
  10. tty8k

    tty8k Master Guru

    Messages:
    437
    Likes Received:
    114
    GPU:
    Ati 5850
    The max wattage in an extreme Prime95 test is not realistic for real-world usage.
    It was the same with the 10700K: websites bashing it for 270W consumption and spending tons on custom water cooling.

    Guess what?

    I have a 10700K in my house overclocked to 5GHz.
    In gaming it averages 60-80W with 130W peaks.
    In Blender/Cinebench AVX, 190W.

    Peak temp in Blender is 80C with a Noctua D15.

    Edit: I know this is an optimized scenario with CPU tuning, but still.
    Also, motherboard manufacturers should take a second look at their auto values for Intel; many of them push power limits that are far too high for a stock CPU.
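    If you want to check what your own chip actually pulls while gaming or rendering, here is a minimal sketch using the Linux RAPL energy counter (the intel-rapl:0 path is the usual package-0 location, but check your system; reading it typically requires root):

        import time

        # Intel RAPL exposes a cumulative package-energy counter in microjoules.
        RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

        def energy_uj():
            with open(RAPL) as f:
                return int(f.read())

        e0, t0 = energy_uj(), time.time()
        time.sleep(10)  # sampling window; run the game/Blender job meanwhile
        e1, t1 = energy_uj(), time.time()

        # Ignores counter wraparound, which is fine for a short window.
        print(f"average package power: {(e1 - e0) / 1e6 / (t1 - t0):.1f} W")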
     

  11. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,890
    Likes Received:
    1,208
    GPU:
    2070 Super
    Maybe he wandered into the News section thinking it's the AMD subforum.

    And if you're going to post anything good about the competition in the AMD subforum, I believe you have to have a court order, yes?
     
  12. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    Then it is good that some have one, because nVidia news in the "AMD specific news/rumor" thread was not bashed in the past; I have seen quite a few discussions there.
    It is about the tone people post with. Even I have recommended nVidia GPUs to some members in the AMD section, simply because the red side had no great product for their use case.

    But some people go there and make troll posts at first glimpse.
     
    cucaulay malkin likes this.
  13. bnauk

    bnauk Member

    Messages:
    10
    Likes Received:
    6
    GPU:
    RTX 3080
    Doesn't really change my viewpoint. The 5900X is better, more efficient, more future-proof, and the same price (well, it was for me anyway).
     
  14. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,890
    Likes Received:
    1,208
    GPU:
    2070 Super
    Not going to argue any further because it's completely off-topic. But I won't let this fly either:

    /done here
     
  15. tty8k

    tty8k Master Guru

    Messages:
    437
    Likes Received:
    114
    GPU:
    Ati 5850
    And I agree with you, but that doesn't make this product crap.
    In most cases it all comes down to the price/performance ratio.

    Is the 10700K a better chip than the 11700K or the 5800X?
    No.
    Does that mean it's crap?
    No.
    Why do people buy it over a 5800X?
    Because it's only 300 bucks.

    Change the shop price of the 11700K to 350 and we'll have a different view.
     

  16. David3k

    David3k Member Guru

    Messages:
    107
    Likes Received:
    26
    GPU:
    Graphics Processing Unit
    I wrote that back in November 2019, when we all still assumed that Rocket Lake was coming out in 2020, and I guess we all know now where the perf-per-watt landed.

    Based on what I knew and lessons from history, it was almost a given to me that Rocket Lake was going to consume insane amounts of power. What I didn't know was that it was going to come with little to no performance advantage over the 10900K. I mean, I never thought they wouldn't at least come out on top in terms of pure performance, considering the efficiency they sacrificed to get here. Rocket Lake was supposed to have an IPC increase to match Zen 2 or Zen 3, but with a clock speed that would make them pale in comparison.

    How, then, did this happen?

    At any rate, here's hoping that Alder Lake will bring us back to performance parity with AMD, and/or that future ARM desktop CPUs will have hardware x86/x64-to-AArch64 translation layers to accelerate x86/x64 code translation/execution on ARM CPUs, so we can start migrating to greener pastures.
     
  17. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    1,823
    Likes Received:
    889
    GPU:
    107001070
    exactly.
    this is the problem.
     
  18. Ricepudding

    Ricepudding Master Guru

    Messages:
    785
    Likes Received:
    226
    GPU:
    RTX 3090FE
    Honestly, with DDR5 and PCIe 5.0 right around the corner - rumoured to arrive with the next generation - are people actually looking at these CPUs? This just feels like a horrible generation to jump on.

    If you have to have a PC right now this second, I'd get something cheaper; that way you can sell it and recoup some money toward the new parts, which will likely arrive at the end of the year or the start of next.
     
    BlindBison likes this.
  19. BlindBison

    BlindBison Master Guru

    Messages:
    678
    Likes Received:
    130
    GPU:
    RTX 2080 Super
    Is it just me, or is the 11900K/this lineup even worse than Bulldozer? Even Bulldozer didn't go "backwards" in performance, to my recollection, and it had a niche in heavily multithreaded workloads, no?

    As it stands, I have no idea why you would get the 11900K over the 10900K -- in the tests I've seen, the 10900K typically performs the same as or better than the 11900K.

    The only chip in the lineup that seems even remotely reasonable is the 11600K, since it more often beat the 10600K (if only by a very, very small margin), but even that performed worse than its predecessor in some cases.

    Or am I missing something? If Intel undercuts AMD's 5000 line by a decent amount/if the price is right, I can see people still opting to grab one, but even then I'd expect 10th gen to be even less expensive for the same performance. We knew 11th gen would be a stopgap/not a huge leap forward and all that, but I didn't expect it to be this bad.
     
  20. BlindBison

    BlindBison Master Guru

    Messages:
    678
    Likes Received:
    130
    GPU:
    RTX 2080 Super
    But they didn't take back gaming -- Ryzen 5000 still holds the high end and slightly outperforms 11th gen in most cases, as per the Gamers Nexus/HU tests for example (the 5900X was most typically at the top of the stack, sometimes the 10900K).

    If they compete on price then you'll get into the same ballpark while winning "some" games, but if you're going by price, 10th gen will probably be the better value, since it delivers almost exactly the same if not better performance in most games.

    This gen doesn't even seem like a stopgap to me; it seems like a regression as often as a slight improvement, and it wasn't worth the engineering time. I do hear you on next gen though: once Intel gets actual next-gen 10nm desktop parts out, we'll get some meaningful gains ... I hope. I agree with you on the lower-end part like the 11600K -- that one does seem like the best of a bad situation.
     
