Intel Core i7-10700K Spotted in 3DMark Listing at 5.30 GHz Turbo Boost

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 12, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,531
    Likes Received:
    18,841
    GPU:
    AMD | NVIDIA
    Deleted member 213629 likes this.
  2. Xserces

    Xserces New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    MSI gtx 970
    TDP 0 (Meltdown ERROR... ERROR... Kaboom) :D
     
  3. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
    ^ Yeah... until we know what cooling it had, I wouldn't trust that benchmark not to have been run with LN2 or a chiller... it's not like we haven't seen that before.
     
  4. EspHack

    EspHack Ancient Guru

    Messages:
    2,799
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB
    maybe they realized they might need it
     

  5. Glottiz

    Glottiz Ancient Guru

    Messages:
    1,949
    Likes Received:
    1,171
    GPU:
    TUF 3080 OC
    When AMD releases a product with a higher TDP than the competition: "It's not really an issue, it costs just a few extra Euros per year to run it, the difference is negligible. I still love AMD and support them!"

    When Intel releases a product with a higher TDP than the competition: "OMG Intel is the devil incarnate! They killed my whole family!"
     
    kakiharaFRS likes this.
  6. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    Why? Intel chips have boosted to 5 GHz right out of the box for a while now. 5.3 GHz is not a miraculous increase. On LN2 they've hit well over 7 GHz.
     
  7. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    AMD generates a crap ton of heat, and unlike Intel (same as in the laptop segment) the idle power is super high, meaning your beloved AMDs are way hotter than an overclocked Intel.
    I know because I own a 9900K @ 5.1 GHz and a TR 3960X @ 4.15 GHz. Yes, that's lower than stock, but otherwise the idle temps are way too high, enough to heat a living room when it's 0°C outside.
     
    Last edited: Feb 12, 2020
  8. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Fairly big difference there.

    When AMD produces a high-TDP part, it typically draws that wattage, plus or minus 10 watts.

    When Intel produces a high-TDP part, its actual wattage at full load is around double that.

    I don't think people would be as upset with Intel over how much power its parts use if its TDP calculation were a bit more realistic about actual wattage, especially for its top-of-the-line consumer parts.


    Did you really just compare an 8-core part to a 24-core part and complain that the 24-core part runs warmer than the 8-core part?

    ..............

    Do you work for Intel or something? Serious question. I can't fathom how someone would compare incomparable parts to each other unless they work for one side or the other, trying to save face and spread misinformation.

    That, or you really weren't prepared for such a large multi-core CPU and have completely unrealistic expectations.
     
    Last edited: Feb 12, 2020
    theoneofgod and carnivore like this.
  9. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
    Good try, but people griped about both, especially Vega, so no, AMD is not immune to criticism. Intel's power numbers are rather over the top these days for 8-core or higher desktop chips. The PL2 on this chip is going to be something like 225-250 watts.

    Most people talk about TDPs, but frankly, for the average person the TDP number is meaningless unless you are an engineer designing a specific cooling solution. The PL1 and PL2 states tell you how much power is going in and are a better indicator of how much heat you will have to dissipate. These high boosts are really going to be near 250 watts, which means that to hold these boost clocks for any amount of time you need darn good cooling. For comparison, the PL2 on a Ryzen 3800X with PBO enabled is about 100 watts.
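    If you want a feel for how PL1, PL2 and Tau interact, here is a rough back-of-the-envelope sketch in Python. The values below are assumptions for illustration, not confirmed specs for this chip, and motherboards frequently override them anyway.

    Code:
    # Rough sketch of Intel's PL1/PL2/Tau boost budget. The numbers are
    # assumptions for illustration, not confirmed i7-10700K specs.
    PL1 = 125.0  # watts, long-term limit (the advertised "TDP")
    PL2 = 250.0  # watts, short-term boost limit
    TAU = 56.0   # seconds, time constant of the power-averaging window
    DT = 1.0     # simulation step in seconds

    def boost_seconds(load_power=PL2):
        """Estimate how long the chip can hold PL2 before the running
        average of package power reaches PL1 and clocks have to drop."""
        avg = 0.5 * PL1  # assume the chip starts with some budget left
        t = 0.0
        while avg < PL1:
            # exponentially weighted moving average of package power
            avg += (min(load_power, PL2) - avg) * (DT / TAU)
            t += DT
        return t

    print(f"Boost at PL2 lasts roughly {boost_seconds():.0f} s before dropping to PL1")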
     
    Last edited: Feb 12, 2020
  10. Kool64

    Kool64 Ancient Guru

    Messages:
    1,662
    Likes Received:
    788
    GPU:
    Gigabyte 4070
    This will probably be the gaming sweet spot. If they can slot it in at around $400, that would be a pretty good deal.
     

  11. kendoka15

    kendoka15 Member Guru

    Messages:
    136
    Likes Received:
    17
    GPU:
    EVGA RTX 3080 FTW3
    Imagine taking Intel TDP figures seriously, lmao
     
  12. Ridiric

    Ridiric Guest

    Messages:
    199
    Likes Received:
    113
    GPU:
    RTX 2080Ti @2100Mhz

    Do any of you actually look at Hilbert's power consumption charts?

    Both Intel's *AND* AMD's listed TDPs are completely off the mark compared to actual power draw, and are actually pretty similar for their 8-core parts.

    For example, from Hilbert's own power tests:
    AMD 3700X: 70W~ system idle - 100W~ single thread - 165W~ multithread (AMD lists as 65W TDP)
    Intel 9900K: 59W system idle - 89W single thread - 198W multithread (intel lists as a 95W TDP)
    AMD 3800X: 64W system idle - 104W single thread - 194W multithread (AMD lists as a 105W TDP)
    Intel 9900KS: 58W system idle - 88W single thread - 241W multithread (intel lists as 127W TDP)

    Mind you, these are total system power figures, so the motherboard and other parts add extra power load; even with that, it's still well over the advertised TDP.

    Neither of them uses actual realistic numbers for their TDP.
    AMD certainly ISN'T within 10 watts of its advertised TDP, and neither is Intel.

    Not saying that this new Intel processor isn't going to use crazy amounts of power, especially considering that the 9900KS already hits 241W at max load. However, Intel still seems to have the advantage on idle power and single-thread power, and when not fully stressed it's pretty comparable to AMD parts with similar core counts that are also not fully stressed, at least looking around online at different usage loads.
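    If you want CPU-only ballpark numbers out of those total-system figures, the crude way is to subtract the idle baseline from the multithread result. That ignores PSU efficiency and VRM losses, so treat it as a rough estimate only:

    Code:
    # Crude CPU-only estimate from the total-system numbers above: load minus
    # idle. Ignores PSU efficiency and VRM losses, so ballpark figures only.
    systems = {
        # name: (idle W, multithread W, advertised TDP W)
        "AMD 3700X":    (70, 165,  65),
        "Intel 9900K":  (59, 198,  95),
        "AMD 3800X":    (64, 194, 105),
        "Intel 9900KS": (58, 241, 127),
    }

    for name, (idle, load, tdp) in systems.items():
        print(f"{name}: ~{load - idle} W above idle vs {tdp} W advertised TDP")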
     
    Last edited: Feb 13, 2020
    Glottiz likes this.
  13. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    TDP is not Power Draw.
     
    fantaskarsef likes this.
  14. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Hilbert's tests are total system power draw, last I checked, unless he has changed his methodology.

    Example from the 9900KS review:

    "We show energy consumption based on a the entire PC (motherboard / processor / graphics card / memory / SSD). This number depends and will vary per motherboard (added ICs / controllers / wifi / Bluetooth) and PSU (efficiency). Keep in mind that we measure the ENTIRE PC, not just the processor's power consumption. Your average PC can differ from our numbers if you add optical drives, HDDs, soundcards etc."

    So all of the examples you gave of total system power usage are irrelevant to CPU power consumption compared to TDP.

    Whereas when you find reviewers that try to measure just the CPU package power draw, you get stuff like this:

    [image: CPU package power draw chart]

    I didn't say either of them uses realistic numbers for TDP. I said AMD is typically (yes, I did say typically; look up at my original post) within 10 watts of its TDP rating, and that is true, factual information, whereas Intel, in its consumer high end, has for the last few years typically been far beyond it.
     
    Last edited: Feb 13, 2020
  15. Hyderz

    Hyderz Member Guru

    Messages:
    171
    Likes Received:
    43
    GPU:
    RTX 3090
    0 nm, that's rather impressive :)
     

  16. Ridiric

    Ridiric Guest

    Messages:
    199
    Likes Received:
    113
    GPU:
    RTX 2080Ti @2100Mhz
    The simple fact that the 3800X and 9900K have pretty much identical power results in Hilbert's tests shows that what you are saying ISN'T true. It doesn't matter that it's total system power (which I mentioned in my post); they are both hitting the same power draw, the only major differences between the two test benches are the CPU and motherboard used, and I can guarantee the motherboard isn't pulling 90+ watts.

    Besides all that, you don't use a CPU without anything else, so total system power draw is still relevant.

    The fact of the matter is, as it stands right now, a 3800X and a 9900K both understate their power draw by quite a bit more than 10W, and they are functionally the same as far as power draw goes when used in an actual system. And AMD isn't typically accurate to within 10W on its desktop parts, at least not on the current generation of chips.

    Even on that chart you linked, you could say Intel is typically within 10W of its listed TDP and often UNDER it; the only outliers on that list that significantly go over the listed TDP are the 8700K and the 9900K, and the list doesn't show any of AMD's current generation, which is what I was looking at in my post.

    The whole point is that currently BOTH AMD and Intel fail to state TDP accurately, regardless of how well they may have done in the past.

    And none of that changes the fact that a system with a 3800X and one with a 9900K will functionally draw the same amount of power from the wall.

    Also, as a side note, the reason I find total power draw more useful than tests that try to isolate actual CPU power draw is that, depending on how they do it, those tests can be quite inaccurate.
    Take their 8700K result: my total system power draw on my 8700K, when I did a test boot with a Hyper 212 Evo and two 120mm fans at 100% just to make sure things were working as expected before I installed it in my actual system, was around 150W under a stress test.

    So yeah, it was boosting as high as it could go on stock settings, with a SATA 1TB SSD, two 120mm fans at 100%, 16GB of memory and the motherboard, all at 150W, and they are showing 145W which is supposed to be just the CPU by itself?
    Either I got super lucky and my CPU is way more power efficient than usual, or their test got suspect results (it could go either way; I haven't tested more than the one 8700K).
    And yes, I benchmark all the systems I build like that first to make sure everything is working 100% before I spend time putting it in a case and setting everything else up, only to find out there is an issue.
     
    Last edited: Feb 13, 2020
  17. Ridiric

    Ridiric Guest

    Messages:
    199
    Likes Received:
    113
    GPU:
    RTX 2080Ti @2100Mhz
    Yes, I know, it's some magical number. Gamers Nexus have done quite a few videos and articles breaking it down; I can't remember which vendor it was, but one of them doesn't even use actual power draw in the calculation that determines TDP.
    But as long as people keep using it as some kind of guide to expected actual power draw, I thought I might as well post Hilbert's ACTUAL power draw figures.

    Hell, I don't know why either brand doesn't just use actual power draw for their TDP number; since wattage in corresponds almost 1:1 to waste heat out, it would certainly make things easier for everyone.

    If they wanted to make it even more useful, you could have Peak TDP, Average TDP and Idle TDP, all taken from actual power draw numbers, maybe give the average as the default TDP, and make the other numbers available on the spec sheet.
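    Something like this, derived from an actual power log (the sample numbers here are invented purely to show the idea):

    Code:
    # Sketch of the "Peak / Average / Idle TDP" idea: derive all three from a
    # log of measured power samples. The sample data is made up for illustration.
    samples_w = [18, 20, 19, 140, 155, 150, 95, 60, 21, 19]  # watts over time

    idle_w = min(samples_w)
    peak_w = max(samples_w)
    average_w = sum(samples_w) / len(samples_w)

    print(f"Idle: {idle_w} W, Average: {average_w:.0f} W, Peak: {peak_w} W")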
     
  18. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    It's not a magical number either; it's a rating based on heat generation and dissipation requirements.
     
  19. Ridiric

    Ridiric Guest

    Messages:
    199
    Likes Received:
    113
    GPU:
    RTX 2080Ti @2100Mhz
    Watch Gamers Nexus' videos on it; they are basically magical numbers. As I said, wattage in is basically 1:1 with waste heat out: if you put 90W into something like a CPU, you get 90W of heat out. The only time that changes is if the energy put in is transformed into something else, like movement or light (and depending on how the light is generated, that can still end up mostly as heat).
     
    Last edited: Feb 13, 2020
  20. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    If you're going to argue against factual information, be my guest, but I'm not going to continue this pointless discussion about facts not being facts.
     

Share This Page