Core i9-9900K and i7-9700K turbo clocks revealed - 4.7 GHz with all eight cores

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 11, 2018.

  1. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    I don't think that's going to happen. People buy Ryzen because of the amazing value it offers, something that Intel can't match, and there's very little from Intel to tempt them back. The only ones who might second-guess their Ryzen purchase are the ones who bought an octa-core because of gaming. Other than that I see no reason for Ryzen owners to scratch their heads...
     
    D3M1G0D likes this.
  2. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    Bulldozer/Vishera was an 8-core architecture yet was slower than Intel's quad cores...

    Same with Intel's 6-core 8700K and now the 9900K / gonna be the same / AMD will be slower... Watch and see.
     
  3. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Oh, you mean synthetic benchmarks like 720p gaming results? (you know, the results that nobody will ever find useful and that in no way predict future performance? :p)

    But honestly, I'd be tempted to buy a 9900K if I still only had a quad-core CPU. An 8-core CPU was what I most wanted last year, and if the 9900K had been available around the time of Summit Ridge's release (and for a reasonable price), then I might still be in Intel's camp. The Intel crowd can go nuts over them (and they do indeed look to be great CPUs), but it's too late for me - Ryzen already fulfilled my 8-core need, and then some. Hell, I couldn't even buy this if I wanted to, as I'm still trying to figure out a way to buy a 2990WX ;).
     
    airbud7 likes this.
  4. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    The fact that they felt the need to rename the i5 to i7 and the i7 to i9 leaves me feeling that these CPUs will come with some serious price hikes. Betting on upper $300s for the 9700 and mid $400s for the 9900.
     
    xIcarus likes this.

  5. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    Bulldozer's "cores" were not actually full cores. They were modules, each pairing two integer cores with a shared FPU, and a slow interconnect between the modules. Ryzen uses true, complete cores, and Zen 2 will likely see IPC match the current Intel Core architecture, which Intel has been milking for a decade while doing basically nothing but slowly creeping up clock speeds and making incremental iGPU improvements.
     
  6. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    Nooo!.... The 1440p gaming results with a 1080 Ti, where all processors are equal :p
     
  7. scoutingwraith

    scoutingwraith Guest

    Messages:
    9,444
    Likes Received:
    9
    GPU:
    Tuf 3070Ti / P1000
    Seriously makes me wonder if I should get one of the Xeons that my job is donating and build around it. I'm stuck on whether it's worth upgrading right now...
     
  8. Irenicus

    Irenicus Master Guru

    Messages:
    619
    Likes Received:
    116
    GPU:
    1070Ti OC
    Sure! Intel will just go out of business. LMAO, some people are clueless. You're one of them!

    In what world? 8600K here, with air cooling, OC'd to 4.8 GHz. It never goes over 75°C, and that's rare. Usually 60°C while gaming.
     
  9. shamus21

    shamus21 Member Guru

    Messages:
    144
    Likes Received:
    25
    GPU:
    0
    Well, I hope they can achieve those numbers. Intel needs to get its act together.
     
  10. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    I don't doubt Intel can manage to hit high clock speeds; the question should be how much cooling will be needed.
     

  11. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,304
    GPU:
    Mi25/IGP
    I'd wager it will be similar to the 7820X for power consumption (when not TDP-throttled), and thermals would be similar to a delidded 7820X, assuming they are actually going to use a soldered IHS.
    There are no magic beans. This is an 8700K with 2 more cores and slightly higher clocks. My guess is really close to 200 W power consumption under max load with no TDP limit.
     
    Embra likes this.
  12. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
  13. moeppel

    moeppel Guest

    Messages:
    153
    Likes Received:
    23
    GPU:
    1080 Ti
    I wonder if it's appropriate to call people clueless based on interpreted assumptions that were never spelled out, directly or indirectly. Hyperbole also seldom helps.

    If it's not too much of a stretch for you, could it be that Intel's Core architecture has reached its end and the 9900K is its last hurrah?

    I don't know what kind of upbringing, if any, you've enjoyed, but where I come from people treat each other with some basic respect. If this is the level of respect you have towards others and strangers, be that in real life or on the Internet, I strongly suggest you work on yourself before calling others names.

    Your efforts will be better spent there anyway, rather than on purposely and hyperbolically misinterpreting things to feed some personal complex. Being a toxic internet warrior, especially on a hardware forum of all things, shouldn't be how someone spends their life. Ultimately that decision is yours to make, however.
     
    Last edited: Aug 13, 2018
  14. Warrax

    Warrax Member Guru

    Messages:
    142
    Likes Received:
    30
    GPU:
    GTX 1070Ti
    Why are you surprised? IPC hasn't increased much since then, and the 8000 and 9000 series are still based on Skylake (6000 series) but with more cores and higher MHz potential; you don't need to upgrade unless you need the extra cores.

    Eh... if you consider the slightly inferior IPC on Ryzen's side and the ~500 MHz difference between the two processors, there's no way the 2700X will beat the 9900K.

    What you should be looking at is 3700X vs 9900K, if the rumors about Zen 2 (higher frequency) are true.
     
  15. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,270
    Likes Received:
    600
    GPU:
    Geforce RTX 3090 FE
    Depends on what type of player you are.

    I still play at 1080p and aim for a constant 120 fps. Add in some more CPU-bound games and per-core performance starts to really matter. Of all the titles from the past few years, DOOM 2016 is probably the only one I run maxed out (that engine does output some crazy framerates for the detail shown on screen though :x).

    And then there are still those games that don't take advantage of multi-core processing that well.

    Honestly, I applaud AMD for offering some competition, but when I buy a new CPU, I look at the performance that fits my budget, and my budget is usually the "more reasonable high end", i.e. I won't buy a €1000 CPU, but I will spend €500 on one and OC it. More often than not, Intel is still best in that category.
     

  16. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    That is a very narrow-minded view.

    We're talking about numbers where the GPU does NOT matter. At 1080p/1440p the GPU can easily hit a cap with a poorly optimized engine, so you end up benchmarking the GPU and not the CPU.
    Game engines aren't going to change drastically within 5 years, especially those that are not in active development. So having results where the CPU is actually the cap is helpful, especially if the user updates GPUs more frequently than CPUs.

    Say Nvidia releases a monster GPU that eliminates the GPU cap; suddenly everybody becomes CPU-bound.
    Or the user simply prefers to play on low settings in competitive or fast-paced games with a high-refresh panel. Again, we eliminate the GPU cap and the CPU becomes the issue.

    So speak for yourself regarding 720p results, not for everyone.
     
  17. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Do you not remember the real-time ray-tracing demos that were shown a few months ago? They require vast amounts of computing power and will probably take at least a couple of generations before consumer GPUs are capable of them (the demos that were shown used multiple Volta GPUs and rendered only specific scenes). As more powerful GPUs are released, game developers will find ways to make use of them, and there is every reason to believe that game engines of the future will be a lot more demanding.

    There were also a couple of articles on HardOCP a few months ago where modern and older games were tested with max IQ settings, and even the 1080 Ti was dropping below 60 FPS at 2.5K on some settings (and getting obliterated at 4K). It goes to show that there's a lot more IQ to be unlocked if only we had more GPU power.

    There was also a review just the other day of a 4K/144 Hz monitor. Even if the 1180 were twice as powerful as the 1080 Ti, it still wouldn't be enough to drive that kind of monitor at full FPS at max resolution and details in most games, and there is every reason to believe that monitor technology will continue to improve. My guess is that five years from now we will have 8K monitors which will bring even a GTX 1580 Ti to its knees, and 4K will be mainstream or even low-resolution (like 1080p is now).

    The idea that gamers of the future will play with a CPU cap is unrealistic, and assumes that graphics engines and monitor standards will stay more or less the same while GPU power increases by an order of magnitude. One can just as easily imagine the opposite scenario, where Nvidia, due to lack of competition, releases mediocre updates year after year and enthusiasts are stuck with 4K for the next decade. The actual reality is probably somewhere in the middle, where advances in GPU power will be accompanied by advances in game engines and monitor standards. This is why I see CPU-capped gaming benchmarks as synthetic.
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    When the CPU only has to push a single texel to the screen, it will show the difference between CPUs. One will do 230 thousand frames per second and the other 190 thousand.

    Here people treat those 720p results as gospel because they think that one day they (and everyone else) are going to buy that 480 Hz screen and a $3000 Titan card. And they think that at that point they will still have this or that CPU... dreams.
    In reality they should think about their current situation, and since even Intel is giving more cores, they should actually expect game engines to take advantage of more cores.

    With similar IPC (some operations are faster on AMD's side, some on Intel's) and Intel's 10% clock advantage on most of their CPUs (yes, most of the CPUs they sell are locked), one should think about performance per dollar and maybe even performance per watt.
    On performance per dollar, it is unlikely that there will even be competition at 8C/16T; Intel will not drop those chips to $350.

    If anyone here wants to throw cost reality in the garbage, then they should bring into the argument those masses who paid $1000 for a server/workstation 8C/16T from Intel, which would likely do better thanks to more PCIe lanes and memory channels. No, there are maybe 2 users like that on this forum, but outside, in the general population...

    Sound arguments are those which apply to a large percentage of people and to real-world scenarios. Because until now, all those unrealistic scenarios did not become real,
    and arguments based on them were "just in case" at best.
    = = = =

    Now to the sound arguments. Everyone who knows a thing or two expects that AMD will deliver reasonable IPC improvements with Zen 2 over Zen 1, as it is a new architecture. And a quite nice clock uplift is expected as well.
    Everyone saw that Intel was forced (call it whatever you want) to give us 6C/12T and then 8C/16T.
    => With that, AMD's Zen 2 solution will be dancing around Intel's i9-9x00(K) solution in terms of performance.

    Who here thinks Intel will finally deliver that long-awaited IPC bump which has not happened until now? Who here thinks that at this point in time it is wise to buy that expensive i9-9900K, and that its per-core performance will not be overshadowed within 12~18 months by the same ratio as between AMD and Intel now?

    I do not think Intel will go and beat AMD in the core-count race; their design is not cheap enough and their operations are not lean enough to cut those margins. They have to counter by improving IPC. And while the 8C/16T from AMD has shown that it is a sound solution and the consumer market loves this core count, there is no big need for more.
    This brings me to a new IPC race between AMD and Intel. (And maybe even a need to upgrade more often for those poorly coded games... j/k => But some will feel the pressure to upgrade much more often.)
    = = = =
    I personally wonder whether I am going to upgrade to Zen 2 or not, and what it will cost me. But on Intel's side I am quite sure the price of an upgrade will be quite a bit higher.
     
  19. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    Why do you keep talking about 4K? Is 4K mainstream now? Will it be mainstream in 5 years? I don't know, but 1080p is still by far the most popular resolution among gamers right now.
    I also don't pay attention to whatever demos Nvidia shows; I play actual games, like 99+% of gamers do, instead of tech demos.
    You also keep skipping user preferences, and gamers who game at 4K are less than 1%. I also speak for the majority, who are not running games on ULTRA settings.

    Welcome back to boring reality.

    edit: I don't see how this conversation concludes. If you don't find CPU-capped benchmarks useful and prefer seeing GPU-capped results in a CPU benchmark, that's up to you.
    I want to see the CPU's full potential, which matters now and will matter for 5 years. It will matter especially once a user decides to upgrade the GPU and keeps running at 1080p/1440p without sliding all the video settings to ULTRA just because those exist.

    Game developers are not casting magic spells to optimize engines. Utilizing newer CPUs takes time, and technology doesn't change that fast (unfortunately).
     
    Last edited: Aug 14, 2018
    basco likes this.
  20. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    I was thinking the exact same thing. Reminds me of everyone going ape $#it over AMD growing from 0.2% market share in the data center arena to 1.2%, with all the chest pounding and grandstanding while claiming Intel is doomed.

    See... if people would actually spend more than 5 seconds reading instead of repeating a headline as fact, they wouldn't look stupid making claims like "Ryzen beats Coffee Lake in market share for July". Those misleading claims came directly from AMD's Division of Lying to Consumers (aka marketing), based on a German retailer which primarily sells AMD processors ending up selling more Ryzen chips than Intel chips.

    Another fun fact: the Steam Hardware Survey (which AMD fans proclaimed to be much more accurate since the changes that fixed oversampling from internet cafes) is showing that AMD actually lost market share in June and again in July.

    That's because he's comparing AMD chips using custom cooling to Intel chips using a stock cooler that the K series doesn't even come with. Nobody living in reality is going to use an Intel stock cooler on an 8600K, 8700K, etc.

    While I'm holding off until the new Intel chips are released, if I had to buy one today my choices would be an i7-8700K for $319, an i5-8600K for $229, or a 2700X for $289. The Intel chips both have an additional $30 off if bundled with a motherboard. So yeah, the choice would be crystal clear.
     
