Intel 10th Gen Core "Comet Lake" Processor Lineup Revealed Incl. Prices and Specs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 10, 2019.

  1. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    @Hilbert Hagedoorn chief, KitGuru once fell for a 4Chan slide. I am talking about this one in particular:



    [images: the fake 4chan slide]


    Computerbase, too, can be duped by someone pretending to be a leaker, or by a trusted source who themselves bought into fake information.
     
  2. Strange Times

    Strange Times Master Guru

    Messages:
    372
    Likes Received:
    110
    GPU:
    RX 6600 XT
    great lineup but... 14nm
     
  3. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    This raises many questions if it's true (very unlikely, IMHO).

    1) How the hell would they be able to offer an i3 with 8 threads at a 65W TDP if their current 9th gen i3 already hits 65W without HT at similar clock speeds? Same question for the i5s.
    2) What black magic would allow them to have this "i9-10800F" with 20 threads @ 65W? That's (probably) physically impossible, all things considered (chip design, density, etc.).
    3) This "i7-10700K" looks an awful lot like the 9900K/F, except it would allegedly have 100 MHz more single-core boost while costing $100 less.

    Is this fairyland? Where are my unicorns?

    It would be especially funny if this were true, since the "14+++nm" name would definitively cement all the memes about Intel. Very unlikely though, oh well.
     
  4. isidore

    isidore Guest

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    The naming is terrible, but the 14+++ is hilarious.
     

  5. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    And that's one socket too many. I didn't say so at the time, but I thought what Intel did with Z370 was almost criminal - needlessly blocking off older CPUs and needlessly creating confusion. It was so bad that I almost suspected they were deliberately trolling their loyal users, seeing what they could get away with and how many of their users they could fleece (like the ludicrous pricing of the 6950X). In fact, this is one of the things that make this leak suspicious - why would Intel change the socket when they can just block off older CPUs like before?
     
  6. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    I was very confused, as it seemed like you were arguing points no one was bringing up. Then I realized you're arguing with someone I have on ignore.

    Ignore sure makes this place more pleasant, weeding out all the trolls, but it certainly makes threads strange to read, as I literally don't even have an indication you quoted someone lol
     
  7. oxidized

    oxidized Master Guru

    Messages:
    234
    Likes Received:
    35
    GPU:
    GTX 1060 6G
    Rofl, one socket too many - just as many as AMD, but nobody gives AMD the same treatment for some reason
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    1) Not a problem, Intel's TDP is not to be taken seriously (it's rated at base clock - rough numbers below), plus some small manufacturing process improvements.
    2) Base clock of 2.7GHz only.
    3) Yes.

    You'll have to draw those unicorns for us. But I prefer fairies, well developed :)
    Justification: blocking upgrades for no good reason will result in them taking flak, as they did before.
    I think some mythical reason for a new socket will persuade more people that it was justified than a new chipset with 2 extra USB ports would :D
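    A back-of-envelope sketch of why that could work, assuming TDP is rated at base clock only and picking an illustrative uncore budget (the split is a guess, not anything from the slide):

    # Rough per-core power budget for a 10C/20T part rated 65W at 2.7GHz base.
    # Assumptions: TDP (PL1) applies at base clock; boost runs above it under PL2.
    # The 15W uncore figure is purely illustrative.
    TDP_W = 65
    CORES = 10
    UNCORE_W = 15  # assumed ring bus / memory controller / I/O budget

    per_core_w = (TDP_W - UNCORE_W) / CORES
    print(f"~{per_core_w:.1f} W per core at base clock")  # ~5.0 W

    # Dynamic power scales roughly with f * V^2, so running at 2.7GHz with a
    # lowered voltage instead of ~4GHz+ boost can plausibly cut per-core power
    # to a third or less; HT itself adds comparatively little extra power.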
     
    Ricardo likes this.
  9. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,513
    Likes Received:
    2,355
    GPU:
    Nvidia 4070 FE
    Reading all the posts, it certainly seems like this was a fake slide. I suppose it means, theoretically, Intel could include PCIe 4.0 as well. Although I seem to recall reading somewhere that Intel was interested in PCIe 5.0 already.
     
  10. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,690
    Likes Received:
    960
    GPU:
    GTX 1070
    I call BS on this slide. I'm going to ignore it until we see something official from Intel. This just doesn't feel right, especially the awkward naming scheme; Intel is way too much into marketing to do that.
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,976
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    lol, I've never used Ignore (on any forum), so it's a bit interesting to learn how that works.
    I don't consider oxidized a troll, I think there's usually just a matter of miscommunication. Doesn't necessarily mean I agree with his opinions, but he's entitled to them and I'm not interested in changing his mind (nor do I really have a right to do so). I try to only argue against misinformation treated like facts.
     
  12. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,690
    Likes Received:
    960
    GPU:
    GTX 1070
    I'm 99% sure PCIe 5.0 will go into servers first, and it will be many years before we see it in desktops. It just costs a good deal more to implement, and PCIe 4.0 already adds a bit of cost to motherboards, which we are seeing with Ryzen 3xxx. The increased signaling rate forces more motherboard layers so the traces can stay nice and short, and the traces need really clean signal integrity. Plus nobody in desktops needs that kind of bandwidth for normal things, and if you are going that exotic you can afford a HEDT or server build. Speaking of which, when Threadripper 3xxx comes out it should have 60 PCIe 4.0 lanes, which is insane bandwidth (rough numbers below).
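    To put numbers on that, a quick sketch using the published per-lane transfer rates with 128b/130b encoding (PCIe 3.0 and newer); the 60-lane figure is the rumored Threadripper spec above, not a confirmed one:

    # Per-direction PCIe bandwidth: raw GT/s scaled by 128b/130b encoding.
    RATES_GTS = {"PCIe 3.0": 8, "PCIe 4.0": 16, "PCIe 5.0": 32}

    for gen, gts in RATES_GTS.items():
        per_lane_gbs = gts * 128 / 130 / 8  # GB/s per lane, one direction
        print(f"{gen}: {per_lane_gbs:.2f} GB/s per lane, x16 slot ~{16 * per_lane_gbs:.0f} GB/s")

    # 60 PCIe 4.0 lanes: 60 * 1.97 ~= 118 GB/s each way - "insane bandwidth" indeed.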
     
  13. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    What I find funny is you talking to someone about missing context because a user is on ignore, when the user you're quoting and talking about is on someone else's ignore list. Like mine.

    I literally could not grasp what this post was about until I clicked the tab at the bottom of the page
     
    fantaskarsef and schmidtbag like this.
  14. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    Except Z270 boards can be made to support 8th and 9th gen CPUs through a simple BIOS mod. So far it's been limited to ASRock motherboards only, but before the release of the Z370 chipset-based motherboards, a rep from ASUS said that they were ready to ship UEFI updates that made 8th gen CPUs work with Z170 and Z270 motherboards; it's Intel who prevented it.
     
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    You both mean the same thing :)
     

  16. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    You are comparing mobile vs desktop; that alone is a no-go.


    I saw the 495 chipset has 5 Gbps and 10 Gbps USB 3 ports... isn't that 3.0 and 3.1?


    Edit: yes.
    https://www.google.com/url?sa=i&sou...aw2_sstAZ4BVeIaEbON_jRIS&ust=1562869592700865
     
  17. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    That's precisely what I mean. Intel artificially blocked these chips when there was no valid reason for it. Forcing Z170 and Z270 owners to unnecessarily throw out their perfectly functional boards for Coffee Lake was a complete d**k move by Intel.
     
  18. RzrTrek

    RzrTrek Guest

    Messages:
    2,548
    Likes Received:
    741
    GPU:
    -
    Another 14nm (+++++) refresh without backward compatibility. I hope it's fake for Intel's sake, or they're in for some flak.
     
    Last edited: Jul 10, 2019
  19. Dazz

    Dazz Maha Guru

    Messages:
    1,010
    Likes Received:
    131
    GPU:
    ASUS STRIX RTX 2080
    Every + is a time they missed the node shrink to 10nm.

    Intel has a chipset business to run as well, so forcing people to upgrade is what they do. Coffee Lake has been shown working on a Z170 with a hacked/leaked BIOS, so it can work, but then where's the profit in that? Motherboard makers love Intel because forcing people to buy a new motherboard keeps them in business too.
    There's no profit to be made if people can use their existing motherboards. AMD doesn't have this kind of mindshare, so making their platforms backward compatible is the only leg up they have besides price. Although you would think corporations would be all over that: just upgrading the processor, without needing to decommission entire units for a small bump in performance and then have them securely wiped and scrapped.
     
  20. loser7

    loser7 Guest

    Messages:
    70
    Likes Received:
    1
    GPU:
    GTX 1070
    This is being reported as fake by legit sources. 14nm+++? Fake.


    Also, look at the fine print in the image and then you will know it's fake. The fine print says "November 2007".
     
    Last edited: Jul 10, 2019
