Alleged Intel Alder Lake CPU photo surfaces online

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 15, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    39,473
    Likes Received:
    8,114
    GPU:
    AMD | NVIDIA
  2. LEEc337

    LEEc337 Active Member

    Messages:
    90
    Likes Received:
    18
    GPU:
    Team Green 960
    An interesting approach. I hope this plays out well for Intel, maybe even with some Xe cores too.
     
  3. Strange Times

    Strange Times Master Guru

    Messages:
    289
    Likes Received:
    85
    GPU:
    RX 580 UV
    The OS needs to learn how to choose the right cores.
     
  4. NightWind

    NightWind Master Guru

    Messages:
    234
    Likes Received:
    192
    GPU:
    Asus ROG 2060 Super

  5. Bo Alenkaer

    Bo Alenkaer Member

    Messages:
    10
    Likes Received:
    3
    GPU:
    GTX1080Ti @2114/6.2
    I seriously doubt that Intel can make PCIe 5 work already; they could not even make PCIe Gen 4 work for 10th Gen CPUs. The extra pin count is needed for DDR5's high transfer rate; they will probably need at least 350-400 pins per DIMM.
     
    Luc likes this.
  6. Luc

    Luc Active Member

    Messages:
    90
    Likes Received:
    55
    GPU:
    RX 480 | Gt 710
    They will need more space for mixing so many different things and different architecture chiplets, including those 8 bigger, hotter cores.

    There will be a tremendous change in everything, and I'm afraid I don't want to be a beta tester for new DDR5, PCIe 5, big.LITTLE, etc.

    I'll try to go for the last DDR4 platform with Zen 3, and then second-generation DDR5 on mobile with a nice drawing screen or whatever, maybe once the prices are right... :(
     
    Last edited: Oct 15, 2020
  7. Silva

    Silva Maha Guru

    Messages:
    1,291
    Likes Received:
    484
    GPU:
    Asus RX560 4G
    I fail to understand your point.

    This is the first time I've heard of this high-efficiency + high-performance symbiosis on PC. We have it on phones and it's great, if it works correctly.
    As stated before, the OS needs to learn when to use which cores and for which applications. The software side is still so far behind the hardware; I'm certain Linux will master this much faster than Windows ever will.
    And we need software to improve to see any benefit from this; a rough sketch of what that can look like from user space is below.

    Great to see Intel trying to push new things; the next couple of years will be interesting for sure!
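    Not from the article, just a minimal sketch of what "using the right cores" can look like from user space today, assuming a Linux kernel that exposes the hybrid topology under /sys/devices/cpu_core/cpus (that path and its layout are an assumption and vary by kernel):
    Code:
    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    int main(void)
    {
        /* Assumed P-core list, e.g. "0-15"; hybrid-aware kernels may expose it here. */
        FILE *f = fopen("/sys/devices/cpu_core/cpus", "r");
        if (!f) { perror("no hybrid topology exposed"); return 1; }

        int lo, hi;
        if (fscanf(f, "%d-%d", &lo, &hi) != 2) { fclose(f); return 1; }
        fclose(f);

        /* Restrict this process to the performance cores only. */
        cpu_set_t set;
        CPU_ZERO(&set);
        for (int cpu = lo; cpu <= hi; ++cpu)
            CPU_SET(cpu, &set);

        if (sched_setaffinity(0, sizeof(set), &set) != 0) { perror("sched_setaffinity"); return 1; }
        printf("pinned to P-cores %d-%d\n", lo, hi);
        return 0;
    }
    Of course, the whole point of the hybrid scheduler work is that applications should not have to do this by hand; this is just what a manual workaround looks like until the OS gets it right.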
     
    Luc likes this.
  8. NightWind

    NightWind Master Guru

    Messages:
    234
    Likes Received:
    192
    GPU:
    Asus ROG 2060 Super
    A metaphor for more heat.
     
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,482
    Likes Received:
    2,024
    GPU:
    HIS R9 290
    I just realized: I know that a large chunk of these pins are for voltage and ground. The thing is, why not just make those one big plate instead of individual pins? I imagine that would make things simpler, cheaper, and more reliable. It would also make it pretty much impossible to install the CPU in the wrong orientation.
     
  10. karma777police

    karma777police Master Guru

    Messages:
    237
    Likes Received:
    81
    GPU:
    1080
    PCIe 4, as I said, turned out to be really f. useless; so useless that the 3080 performs the same under both PCIe 3 and PCIe 4. Intel made a smart decision in almost skipping PCIe 4 altogether and going for PCIe 5 and DDR5, and the irony is that they will jump to it before AMD does.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    13,150
    Likes Received:
    2,647
    GPU:
    EVGA RTX 3080
    The idea that PCI-E 4 is only for GPUs is a really dumb take, even for you.
     
  12. fellix

    fellix Member Guru

    Messages:
    183
    Likes Received:
    23
    GPU:
    KFA² GTX 1080 Ti
    Most of the extra pins are for additional power/ground capacity. Only a small portion is dedicated to the I/O signal overhead.
     
  13. MonstroMart

    MonstroMart Master Guru

    Messages:
    815
    Likes Received:
    300
    GPU:
    GB 5700 XT GOC 8G
    I wish Intel would use logical names for their CPUs, like Core 1st gen, Core 2nd gen, Core 3rd gen and so on, instead of Rocket Lake, Magical Lake, Fairy Lake... I don't know wtf lake it is anymore.
     
  14. Francesco

    Francesco Active Member

    Messages:
    89
    Likes Received:
    33
    GPU:
    Sapphire 4870
    So they are finally preparing to move to a smaller fab process, but the CPU is getting bigger.
    Kind of ironic.
     
  15. icedman

    icedman Maha Guru

    Messages:
    1,053
    Likes Received:
    115
    GPU:
    MSI Duke GTX 1080
    The sad thing about this statement is that Intel skipped it, and their motherboards still seem to carry a premium in the majority of cases, whereas with AMD you have the option to take it or leave it.

    As for this little-big design, I really want to see how it is going to be leveraged. I can see mobile benefiting from it, but I fail to see a real use for it on the vast majority of desktops. It just seems like it will fill some kind of niche.
     

  16. Syranetic

    Syranetic Master Guru

    Messages:
    599
    Likes Received:
    137
    GPU:
    Strix RTX 3080
    I agree, it's annoying. I've stuck with Intel so far, mostly for gaming performance and familiarity, but it's things like this that keep stacking up against them. When single-threaded performance is worse, and multi-threaded too, and the cost is higher, it will become really hard to stomach sticking with them.
     
  17. Trihy

    Trihy Member Guru

    Messages:
    119
    Likes Received:
    12
    GPU:
    Onboard
    A different package size probably means a new heatsink.

    So if you are on 10th gen, you will have to change everything, just a year later.

    Or stay on AMD.
     
  18. p1stov

    p1stov Member

    Messages:
    20
    Likes Received:
    0
    GPU:
    nvidia palit 560 2GB
    PCIe 4 is not just for GPUs, but for any component you can connect to PCIe slots, even the new NVMe generations that hit speeds of 5000 MB/s.
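    For context (my own back-of-the-envelope numbers, not from the article): a rough calculation of usable x4 link bandwidth per generation, counting only the 128b/130b line encoding and ignoring other protocol overhead, shows why a ~5000 MB/s drive already needs Gen 4:
    Code:
    #include <stdio.h>

    int main(void)
    {
        const double lanes = 4.0;
        const double raw_gt_s[] = { 8.0, 16.0 };          /* per-lane raw rate: Gen 3, Gen 4 */
        const char  *name[]     = { "PCIe 3.0 x4", "PCIe 4.0 x4" };

        for (int i = 0; i < 2; ++i) {
            /* GT/s * lanes * 128/130 encoding efficiency, / 8 bits -> GB/s of payload */
            double gbs = raw_gt_s[i] * lanes * (128.0 / 130.0) / 8.0;
            printf("%s ~ %.2f GB/s\n", name[i], gbs);     /* ~3.94 and ~7.88 GB/s */
        }
        return 0;
    }
    So a 5 GB/s drive is already past what a Gen 3 x4 link can deliver, which is exactly the point.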
     
  19. Kaerar

    Kaerar Master Guru

    Messages:
    357
    Likes Received:
    45
    GPU:
    5700XT
    Why would you stick with them? They aren't your local football team or anything like that.

    I've got both Intel and AMD systems depending on the use case or what's lying around.

    It's a sad fact that my Xeon 1680v2 is equal to any Ryzen 1st gen CPU in single-core performance. It's taken Ryzen 3000 to catch Intel in single-core, and now Ryzen 5000 is going to take the lead. It's just hardware; having a tribal attitude towards it won't benefit you or your use case.
     
    carnivore and sykozis like this.
  20. Syranetic

    Syranetic Master Guru

    Messages:
    599
    Likes Received:
    137
    GPU:
    Strix RTX 3080
    Because I said "when", I will still take my 10700K over any other currently available AMD chip for my primary workload with this system -- gaming.
     
