Core i9-10900K can boost to 5.3 GHz, more specifications of 10th Gen Core Comet Lake-S leak

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 28, 2019.

  1. Angantyr

    Angantyr Master Guru

    Messages:
    775
    Likes Received:
    232
    GPU:
    MSI 2070 Super X
    Keeping this CPU properly cooled is probably not going to be easy...

    Wonder if Intel will still be plagued by an onslaught of discovered vulnerabilities once this hits the market.
     
  2. squalles

    squalles Master Guru

    Messages:
    833
    Likes Received:
    36
    GPU:
    Galax GTX 1080 EXOC OC
    No. On productivity, Ryzen looks the same or sometimes even a little better, except for virtual machines; virtualization on Intel is still way better.
     
  3. Denial

    Denial Ancient Guru

    Messages:
    13,147
    Likes Received:
    2,636
    GPU:
    EVGA RTX 3080
    Made a difference for the 5500XT given the recent fiasco regarding that.
     
    EspHack likes this.
  4. EspHack

    EspHack Ancient Guru

    Messages:
    2,602
    Likes Received:
    90
    GPU:
    ATI/HD5770/1GB
    WiFi 6, exactly what I was waiting for to upgrade my desktop /s

    What else? Why would anyone go with this instead of Ryzen? An extra 10 fps in older games?
     

  5. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    2,181
    Likes Received:
    636
    GPU:
    RTX 2080 Ti FE
    Intel's done boys...
     
    XenthorX likes this.
  6. ladcrooks

    ladcrooks Master Guru

    Messages:
    369
    Likes Received:
    66
    GPU:
    DECIDING
    And the next squeeze out of this chip, with no fat left?

    Boy oh boy, 5.3 GHz on 2 cores. Let's see what the heat will be like, and will the warranty on this chip be 1 year?
     
  7. fry178

    fry178 Ancient Guru

    Messages:
    1,618
    Likes Received:
    239
    GPU:
    2080S WaterForceWB
    @squalles
    And? That doesn't mean it's the biggest share of the market.
    Of all gamers, more than 80% are on 1080p/60 Hz (maybe even 720p in certain countries);
    only a small part goes up to 100/120 Hz, even fewer go beyond 120, and fewer still beyond 144 Hz.
    Not even talking about the fact that this only really affects shooter games (maybe some simulators), an even smaller group of people (out of all gamers).

    So yeah, the 9900 is a great CPU for more than double what I paid for my 3600,
    which can run Siege at 1440p/75 Hz with maxed settings incl. TAA x4, and with Fast Sync I get a steady 120+.
     
  8. squalles

    squalles Master Guru

    Messages:
    833
    Likes Received:
    36
    GPU:
    Galax GTX 1080 EXOC OC
    Wrong. Whoever buys such an expensive, top-end processor obviously wants more framerate. If you're talking about full HD at 60 Hz, then a Ryzen 3600X or i5 9400 is sufficient; there's no reason to compare expensive octa-core processors for that. Obviously the Ryzen 3700X and i9 9900K target a specific audience.
     
    Last edited: Dec 28, 2019
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,480
    Likes Received:
    2,023
    GPU:
    HIS R9 290
    Actually, I'm not sure about Nvidia, but there's already evidence that even the 5500 needs more bandwidth than x8 lanes can supply (x8 is all the GPU provides). It's not a particularly impressive GPU by any standard, and yet there is a measurable performance difference when you compare it on PCIe 3.0 versus 4.0 (a 4.0 x8 slot has roughly the same bandwidth as a 3.0 x16 slot, and the 5500 is a PCIe 4.0 GPU).
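    As an aside, the bandwidth equivalence mentioned above is easy to verify with back-of-the-envelope math. A minimal Python sketch, assuming ideal 128b/130b line encoding and ignoring all other protocol overhead:

    ```python
    # Rough per-direction PCIe link bandwidth in GB/s.
    # Gen 3 runs at 8 GT/s per lane, Gen 4 at 16 GT/s; both use 128b/130b encoding.
    def pcie_bandwidth_gbps(gen, lanes):
        gt_per_s = {3: 8.0, 4: 16.0}[gen]      # transfer rate per lane (GT/s)
        per_lane = gt_per_s * 128 / 130 / 8    # GB/s per lane after encoding overhead
        return per_lane * lanes

    print(round(pcie_bandwidth_gbps(3, 16), 2))  # ~15.75 GB/s for a 3.0 x16 slot
    print(round(pcie_bandwidth_gbps(4, 8), 2))   # ~15.75 GB/s for a 4.0 x8 slot
    ```

    Since Gen 4 exactly doubles the per-lane rate, a 4.0 x8 link matches a 3.0 x16 link, which is why an x8-only card like the 5500 XT benefits from a 4.0 slot.
    
    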

    Seeing as Nvidia is making substantially more powerful GPUs, I think it's safe to assume that yes, they will in fact be using PCIe 4.0. This is especially true for multi-GPU setups, which they are still working with on servers and workstations.
     
  10. bobblunderton

    bobblunderton Master Guru

    Messages:
    354
    Likes Received:
    167
    GPU:
    EVGA 2070 Super 8gb
    So I state my case here. I do a lot of game-content production here, that's my primary purpose it seems. As of this summer I needed to replace my dated, tired old 4790k with something with more cores.
    The i9 9900K was $500, and decent motherboards were generally to the tune of $30~50 more than equal-featured AMD X470/X570 boards.
    Plus you NEED a water cooler or it's going to be throttling constantly, so up goes the price!
    R7 3700X by comparison was 330$ + 50$ discount on the motherboard in addition to boards being generally cheaper.
    I can use the stock cooler which is relatively quiet and also doesn't block my RAM slots / won't LEAK and destroy the system / sound like an aquarium is in here (I'm used to fan noise).
    Considering I need reliability, this precludes me from water cooling, and therefore from getting the most out of a 9700K/9900K processor.
    I saved somewhere to the tune of 200~250$ while having a quieter system. I don't care about the last 1~8% of FPS I could get by spending another 250$ on the processor / cooling combo, I care that my AMD system was a great deal and very powerful at the same time, having all the features I need. I can even upgrade the processor down the road - something I could NOT do on the last intel system I had. I also have PCI-E 4.0 for the next-gen devices (should I buy one). I didn't want a platform that was a year or two old at that point - I wanted something that just came out with the newest this and that, to keep my machine relevant the longest.
    I don't do competitive online multiplayer high-FPS shooter games, I take my real gun to a real range and practice with it. These types of games lose their appeal when you have satellite internet (out in the boonies), and you start to get close to or hit 40 years old. Never really was crazy about the modern G.I.Joe games* these days, though, for the record (sorry, I didn't deliberately attempt to offend; but I could see it may happen with someone).
    Everyone's use-case differs, but I'd certainly hope the Intel beats the AMD chips (listed above) in *something*, because it costs somewhere around $200 more. I will say the difference between this R7 3700X (3000 MHz bargain-bin Micron CL15 RAM) and my old 4790K (with 2400 MHz CL11 RAM) in content production and general desktop usage is night and day. Switching tasks is effortless. Zip/unzip operations finish many times faster, and working with models in Maya or Blender, and graphics/rendering in Substance Designer, is amazingly better by a long shot and well worth the price of entry. I absolutely love the system; there are zero issues with it, no disco-tech (a bit off-topic), and there's not much noise either (and no fish-tank noises). My 4790K, delidded, could barely keep its turbo speeds with liquid metal above and below the IHS and a $100 air cooler on it. Overclocking MY HIDE! Sheesh! The AMD will overclock itself when and where needed and back down if it ever hits 80C (rarely). I can't be bothered to test and re-test the stability of an overclock when I could spend the time earning money doing content-production work.
    Now if intel could sell me a system that let me upgrade as easily, with such a new platform like X570 was this past summer (and still is), without all the heat issues, and still under-cut the competition, I'd be willing to consider it just as anyone else would do.

    *note, for the record, I still enjoy me some old MS-DOS DOOM / Duke 3D / ROTT and even Quake II RTX these days, so shooter games aren't totally out. The old ones were fun though, back when I was a teenager in the 90's. I DO game on this, it is always up to the task and runs anything I throw at it with my RTX 2070 Super in here and bargain-bin 32gb of RAM without hesitation.

    Wait a minute, you need a $100+ water-cooler setup for the 8086K/8700K/9700K/9900K to get the most out of it? Why didn't you just buy a better CPU with more cores on a better platform, or move to X299 at that rate, or spend less on the CPU and more on the GPU? That all being said, to avoid sounding snide: those of you who bought the 9900K or 9700K or similar, DO enjoy your systems, you'll get years out of them yet regardless. But always remember someone has to MAKE the games you play, and those folks (if they are on a budget, especially indie studios/developers) might just select a Ryzen instead, just as the servers serving this message, or someone's online multiplayer game server, might soon be running an EPYC processor.
     

  11. RavenMaster

    RavenMaster Maha Guru

    Messages:
    1,196
    Likes Received:
    138
    GPU:
    1x RTX 3080 FE
    Intel need to stop wasting people’s time with 14nm tech. It’s no wonder AMD are obliterating them right now.
     
    angelgraves13 likes this.
  12. squalles

    squalles Master Guru

    Messages:
    833
    Likes Received:
    36
    GPU:
    Galax GTX 1080 EXOC OC
    It's a little relative. Like my friend with a Ryzen 3700X and RTX 2080 getting the same framerate as another with an i9 9900KF and RTX 2070 (non-Super): 200 dollars of investment thrown away.

    But yes, for your purposes it looks like a good investment.
     
  13. NewTRUMP Order

    NewTRUMP Order Master Guru

    Messages:
    495
    Likes Received:
    125
    GPU:
    STRIX GTX 1080
    At least compare two CPUs that are priced the same: i9-9900K at $490 vs AMD 3900X at $499. Then see with your own eyes who's stronger.
     
  14. Solfaur

    Solfaur Ancient Guru

    Messages:
    7,411
    Likes Received:
    886
    GPU:
    MSI GTX 1080Ti Ga.X
    Yep, there will probably be a tiny bit of room left, and it will likely not be worth the effort and the power/heat that come with it. It started with the GPUs, and now it's made its way into CPUs as well. On one hand it's good for the super noobs, as they will get what they pay for out of the box; but for "mid-range" enthusiasts who won't go full H2O/LN2, it's a definite loss. Gone are the days of an i7 920 OC'ed to 4.2 GHz and the like. :oops:
     
    Corbus likes this.
  15. oxidized

    oxidized Master Guru

    Messages:
    233
    Likes Received:
    34
    GPU:
    GTX 1060 6G

    If the 5500XT uses more bandwidth than a much more powerful card, take it up with the card itself, not with the bus.

    https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/

    Within margin of error most of the time, and that's with the most powerful card on the market.
     

  16. Tiny_Clanger

    Tiny_Clanger Master Guru

    Messages:
    333
    Likes Received:
    345
    GPU:
    igpu
    I've no shits to give.
     
  17. Tiny_Clanger

    Tiny_Clanger Master Guru

    Messages:
    333
    Likes Received:
    345
    GPU:
    igpu
    Put money on it.
     
    angelgraves13 likes this.
  18. Denial

    Denial Ancient Guru

    Messages:
    13,147
    Likes Received:
    2,636
    GPU:
    EVGA RTX 3080
    The 5500XT shows more of a difference because the 4GB edition is VRAM starved. It's a niche scenario and arguably that experience should be avoided but it occurs and the bus can help alleviate it if it's available and there is effectively no downside to having it available.

    Even in the example you posted, though, games like Hellblade show a 14% increase in performance, Wolfenstein 2 shows a 12% increase, and Wildlands shows a 6% increase. The rest are within margin of error, but those examples aren't. So now we have numerous examples of specific titles saturating that bus. Theoretically, a next-generation card with more performance would increase that difference; so if someone is buying a new CPU/motherboard now with the intention of keeping it through multiple generations of GPU (which, judging by the Sandy Bridge people on Guru3D, a number of people do), I'd definitely recommend PCIe 4.0 for that person.
     
    Last edited: Dec 28, 2019
  19. toyo

    toyo Master Guru

    Messages:
    365
    Likes Received:
    190
    GPU:
    Gigabyte 1070Ti 8G
    I can only imagine how many watts these will draw. An average 8700K would run 200W in Blender at 5 GHz. Some 9900K users can't even stress-test their OCs properly because of the CPU or VRMs throttling, even with Intel's optimizations and STIM. Now add 4 more threads and try that 5.3 GHz on all 10 cores.
    I feel they should just start making a special interface for CPUs as well, just like for GPUs, so we can have direct-to-die factory cooling capable of handling 300-400W, and for the VRMs too.
     
  20. oxidized

    oxidized Master Guru

    Messages:
    233
    Likes Received:
    34
    GPU:
    GTX 1060 6G
    Until more than a minority of titles shows any difference, there's honestly no argument to be made for this. And you can't know how next-gen cards will react to that; you can hypothesize, but nothing's for certain. Besides, next gen will only see one card faster than the 2080 Ti, and that's probably the 3080 Ti (or whatever it'll be called); the rest will be about the same. Anyway, I'm not saying we won't need PCIe 4.0 or 5.0 eventually, it's just not now, nor will it be in 2020, because cards won't magically be double the performance we have now, or even 1.5x.
     
