Review: Core i5 10600K and Core i9 10900K processors

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 20, 2020.

  1. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    I really don't want to hurt anyone's feelings, but who on Earth would pay money for this product?

    I also absolutely can't understand the "recommended" badges.

    That +70W power draw with 2 fewer cores is a shame. It's about 12% slower on average in productivity and only 5% faster than the 3900X in 1440p games (with a 2080 Ti). And that's not to mention the price, even before extra cooling: most people will replace the 3900X's stock cooler, but at least it's included and can cope with the CPU. The 10900K doesn't come with any, so you have to buy one, and given the temperatures it should be a water-cooling solution. On top of that, the 10900K is already about $100-130 more expensive than the better 3900X. It's a huge NO.
     
  2. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    Well, I can only talk about my personal experience, and so far I haven't had those kinds of issues, but I also haven't tried the games you mentioned. Also, I hope you are aware of the difference between the 1060 and the 1080 Ti, because my card seems to offer twice the performance of your 1060, so maybe the bottleneck is not the 7600K...

    Anyway, let me know how it goes with the 3300X and if the difference really is that big.
     
  3. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    A few pictures I made for an "article" I'm writing.
    I share them here because I read a lot of "omg, the 10900K + Z490 are so hot and use so many watts". In reality: yes and no... there's worse out there ;)
    EDIT: my main beef with this release is the price of the Z490 motherboards. X570/TRX40 were more expensive because brands had to do the work for the new chipset, VRMs, cooling, PCIe 4.0, etc., while Z490 is almost copy/pasted yet more expensive than X570/TRX40. WTH!

    Scores and temperatures are for comparison purposes; don't take them as review numbers. The watts are on the high side because this is not a review computer with 3 apps: it's a fully loaded PC with almost all ports and USBs in use, a custom loop, 10 Gbit Ethernet, etc. Oh yeah, and full RGB with 400+ LEDs overall, which draw quite a bit of current.
    [two screenshots attached]
    Single 1080 Ti; SLI would go above 1000 watts for sure.
     
    Last edited: May 21, 2020
  4. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
    Do you care about power draw in the scenarios you actually run, or power draw in theory?

    Try to look at power draw and heat in games or programs that you actually use. No one cares about power draw in Prime95 AVX2.

    My 3900X says 105W TDP, but it uses 148W at stock settings.
    Overclocked to the "max" on water, it's drawing 350W+ :D
     

  5. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Better is subjective in this case.... You can't compare based on power phases, because the supported CPUs are completely different and have different power requirements. Rear IO is user dependent. I only need 5 USB ports on the rear IO, so having 10 doesn't add any additional value for me. However, you may have a use for 10 USB ports on the rear IO, so they would add value for you. There's no evidence that the Z490 boards are of higher quality than their AMD counterparts. Having a 10G, 5G or 2.5G NIC doesn't add value for me either, while it may for you. Having a "Killer" NIC actually decreases value for me, whereas it may have no effect or increase value for you.

    Which is better all boils down to what the user wants/needs and their particular usage. For you, Z490 may be "better" but that's not the case for all users.
     
  6. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    I'm really happy with this review, but bored!
    There's no reason for me to upgrade my overclocked 6700K, gaming at UHD60 and 1440p120.
    Looks like it will be at least 6 years old before they prise it out of my clammy digits.

    @Hilbert
    I noticed a possible error:
    https://www.guru3d.com/articles-pages/intel-core-i5-10600k-processor-review,2.html
    Under TDP and PL States, you listed a 10500K and no 10600K; shouldn't this be the 10600K?
    No mention of a 10500K is made anywhere else.
     
    -Tj- likes this.
  7. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    I don't understand the value of your post.
    You posted power use without using a mains meter, including the load of 400+ RGB LEDs and other unconfirmed loads, using AMD kit, in an Intel CPU review thread.
    I'm not sure it would be much use even in an AMD thread.
     
    chispy and sykozis like this.
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Yeah, me neither with this OC'ed 4770K, and I still manage 41ns memory latency vs. 47ns on these.

    I'll just wait for DDR5; it won't be long now :D
     
    Mufflore likes this.
  9. Margalus

    Margalus Master Guru

    Messages:
    388
    Likes Received:
    83
    GPU:
    MSI Ventus 3060 Ti
    The 10900K, rated at 125W, hits 235W; the 3900X, rated at 105W, hits 220W. Which one has the bigger difference in power?
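    To make that concrete, here's a quick back-of-the-envelope sketch in Python, using just the figures quoted above (not independent measurements):

        # Peak package power vs. rated TDP, using the numbers quoted above
        chips = {
            "Core i9 10900K": {"tdp": 125, "peak": 235},
            "Ryzen 9 3900X":  {"tdp": 105, "peak": 220},
        }

        for name, p in chips.items():
            over = p["peak"] - p["tdp"]
            print(f"{name}: {p['peak']}W peak vs {p['tdp']}W TDP -> +{over}W over TDP")

        # Core i9 10900K: 235W peak vs 125W TDP -> +110W over TDP
        # Ryzen 9 3900X:  220W peak vs 105W TDP -> +115W over TDP

    By these figures the 3900X actually overshoots its rating by slightly more in absolute watts, which is the point of the question.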
     
  10. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
    [image attachment]
     

  11. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    A 10-core CPU drawing more power than a 24-core 2nd-gen Threadripper is a big yikes.
     
  12. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Well, neither company states TDP as power draw... The Core i9 10900K is listed as having a 125W TDP, but Intel doesn't actually give a power consumption spec for any of their processors, nor does AMD.

    Go here: https://ark.intel.com/content/www/u...0900k-processor-20m-cache-up-to-5-30-ghz.html
    Click on the little blue question mark next to TDP if you need proof...

    TDP is used for designing thermal solutions.
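    For what it's worth, the way the rating actually governs boost behavior can be sketched roughly like this. It's a simplified model, not Intel's actual firmware logic, and the PL2/tau values are the commonly reported 10900K defaults, assumed here for illustration:

        # Simplified sketch of Intel's PL1/PL2/tau power limiting.
        # PL1=125W, PL2=250W, tau=56s are commonly reported 10900K defaults
        # (assumed here); real firmware uses a more involved budget calculation.
        PL1, PL2, TAU = 125.0, 250.0, 56.0

        def allowed_power(avg_power: float) -> float:
            """Package may boost to PL2 while the running average power
            stays at or below PL1; otherwise it is clamped back to PL1."""
            return PL2 if avg_power <= PL1 else PL1

        # Toy simulation of a sustained all-core load, 1-second steps
        avg = 0.0
        for t in range(121):
            draw = allowed_power(avg)
            avg += (draw - avg) / TAU  # exponentially weighted moving average
            if t in (0, 30, 60, 120):
                print(f"t={t:3d}s  draw={draw:.0f}W  avg={avg:.1f}W")

    On a board that enforces the limits, the chip boosts hard for roughly the first 40 seconds and then settles back to its 125W rating, which is why short benchmarks and sustained loads can report very different "power draw" for the same CPU.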
     
  13. Margalus

    Margalus Master Guru

    Messages:
    388
    Likes Received:
    83
    GPU:
    MSI Ventus 3060 Ti
    I guess that means a 12-core 3rd-generation Ryzen drawing more power than a 24-core 2nd-gen Threadripper is a big yikes also?

    The only problem is, I don't know how much power that Threadripper actually draws. I haven't been able to find any results for the CPU power draw of that particular CPU. But if the 10900K draws more, then the 3900X will too, since it's only 15W behind.
     
  14. Margalus

    Margalus Master Guru

    Messages:
    388
    Likes Received:
    83
    GPU:
    MSI Ventus 3060 Ti
    I'm not arguing about what TDP means or what it's used for. Simply that AMD is just as far off as Intel on TDP vs. max power usage. Everybody is making a big deal of the 10900K drawing 235W max when the 3900X gets almost the same number at max too.
     
  15. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    I'm looking at the multithreaded load power graph: the 3900X (227W) draws less power than the 2970WX (250W), while the 10900K draws 295W. That is not a 15W difference; that is a 68W difference (3900X vs 10900K). A big yikes.
     
    Last edited: May 22, 2020

  16. Margalus

    Margalus Master Guru

    Messages:
    388
    Likes Received:
    83
    GPU:
    MSI Ventus 3060 Ti
    But that is total system power, not CPU power. As they said in the article, every system will be different. I am talking specifically about CPU power alone, nothing else.
     
  17. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    I don't think the VRMs are drawing an extra 68 watts; the variation will be <10-15 watts between boards in most cases, and the AMD 3000 chips are at a disadvantage since they are being tested on power-hungry X570 boards.

    Plus, realistically, I couldn't care less if the CPU draws X amount of watts. Wattage at the wall is what actually matters, since it's what you actually pay for (air conditioning included).
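    As a rough illustration of why the wall figure is the one that shows up on the bill (the 68W delta is the figure quoted above; the hours per day and the $0.15/kWh rate are assumptions for illustration, not numbers from the thread):

        # Rough electricity-cost sketch for a 68W wall-power difference.
        # hours_per_day and rate_per_kwh are assumed values, not measurements.
        delta_watts = 68
        hours_per_day = 4       # assumed hours of heavy load per day
        rate_per_kwh = 0.15     # assumed electricity price in $/kWh

        kwh_per_year = delta_watts / 1000 * hours_per_day * 365
        cost_per_year = kwh_per_year * rate_per_kwh
        print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
        # ~99 kWh/year -> about $14.89/year under these assumptions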
     
    HandR likes this.
  18. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    People are incorrectly using TDP to determine what the power draw should be. TDP was never intended to determine power consumption; it's strictly guidance for cooler design. Neither processor is working outside of its intended power envelope unless the system is specifically configured to do so. If the system is running a stock configuration, the processor will stay within its intended power envelope. Intel does not publicly state what that envelope is, nor does AMD.

    Quite frankly, I don't see why people are getting so worked up over the power consumption of a desktop part. Unless you're running the system at full load 24/7, the power consumption is negligible. If power cost is such a major issue, building a high-end system should be the last thing you plan to do.
     
    Fender178 likes this.
  19. alanm

    alanm Ancient Guru

    Messages:
    12,269
    Likes Received:
    4,472
    GPU:
    RTX 4080
    Higher power draw usually means you need higher grade, pricier cooling solutions to keep temps down.
     
  20. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    You got that right. Even tech YouTubers say that most users interpret TDP wrong. I totally agree: if you are building a high-end system and are worried about power cost, then you should think about it some more. Yeah, I don't get it either why users worry about the high power consumption of a desktop CPU. It makes no sense if you are building a super high-end build.
     
