Intel Core i9-13900K Raptor Lake Engineering Sample Spotted?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 10, 2022.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    44,730
    Likes Received:
    11,391
    GPU:
    AMD | NVIDIA
  2. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    4,582
    Likes Received:
    2,804
    GPU:
    RTX 3090 Strix OC
    Great, so more of the cores that ought to be renamed E-waste. Great for laptops, sh1te for gaming CPUs, which ought to have exactly 0 E-cores and have them replaced with an additional 4 P-cores, which would take up the same amount of die space...
     
    Undying likes this.
  3. Undying

    Undying Ancient Guru

    Messages:
    19,861
    Likes Received:
    8,140
    GPU:
    RTX 2080S AMP
    This won't be any faster for gaming than the 12900K if that's the case.
     
    Dragam1337 likes this.
  4. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
    Everyone will end up with a design like that; Intel was just the first one forced to do it because of how their processes went to crap over the last decade.
     

  5. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    4,582
    Likes Received:
    2,804
    GPU:
    RTX 3090 Strix OC
    Maybe for mainstream mobile designs, but everyone with even a semi-functioning brain can see that they contribute exactly nothing for gaming, whilst more performance cores would. They will simply have to segment it going forward, instead of trying to shove the same mobile design down desktop users' throats...
     
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
    I would rather have a few e-cores do all the background tasks quietly without raising the TDP, instead of P-cores constantly moving in and out of processes.
     
  7. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    4,582
    Likes Received:
    2,804
    GPU:
    RTX 3090 Strix OC
    ...

    The E-waste cores provide NO performance gains for gaming - their only purpose is to reduce TDP in laptops, which is a non-concern for gaming systems...
     
  8. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
    When you "game", the OS doesn't stop doing things in the background, and your cores don't only do game work. It's much better to have 8 performance cores plus 4 E-cores than having only 8, or even 10, performance cores. Them being in a separate CPU complex also means that nothing will disrupt your performance cores. I honestly don't understand the passion. Whatever makes sense on "mobile" makes sense everywhere; it's just that the desktop is a seriously low priority for most companies.

    Also, they're not e-waste: Intel claims they have 10th-gen Core performance, which is great if true.
     
  9. bemaniac

    bemaniac Master Guru

    Messages:
    315
    Likes Received:
    19
    GPU:
    MSI Trio X 3090
    Or why not just get the best of both worlds with a Ryzen?
     
    Dragam1337 likes this.
  10. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    4,582
    Likes Received:
    2,804
    GPU:
    RTX 3090 Strix OC
    No, it isn't...



    You would DEFFO be better off for gaming with an additional 4 P-cores vs 16 E-waste cores...

    And while desktop is obviously a low priority for Intel, it honestly shouldn't be - there are many, many millions of PC gamers worldwide...
     

  11. asturur

    asturur Maha Guru

    Messages:
    1,250
    Likes Received:
    443
    GPU:
    Geforce Gtx 1080TI
    PrMinister's explanation was good.
    They handle your background tasks without consuming as much energy as a performance core would, meaning your game can sit on performance cores that can boost longer because of it.
    You claim they are useless, while you should accept that:
    - performance cores switching threads will impact your performance negatively
    - they don't make money from the PC you buy for gaming once in a while
    - Windows does not revolve around gaming, so the current inefficiency of those cores is down to process-management software still sucking
    - 16 performance cores would just kill your ability to boost high
    - having two separate core complexes with their own cache ensures your game data won't be thrown out of the L2/L3 cache

    Overall they are a great improvement.
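
    For anyone who wants to test the "keep the game on the P-cores" idea from this post themselves, here is a minimal sketch that restricts a process to the assumed P-core ids. It assumes the P-cores enumerate as the first logical CPUs (typical on Alder Lake, e.g. ids 0-15 on a 12900K with HT, E-cores at 16-23 - verify with `lscpu --extended` on your own box); `p_core_set` and `pin_to_p_cores` are made-up helper names, and `os.sched_setaffinity` is Linux-only.

```python
import os

def p_core_set(total_logical: int, p_logical: int) -> set[int]:
    """Logical CPU ids assumed to belong to P-cores.

    Assumes P-cores enumerate first (e.g. ids 0-15 on a 12900K with HT);
    check your actual topology with `lscpu --extended` before relying on it.
    """
    return set(range(min(p_logical, total_logical)))

def pin_to_p_cores(pid: int = 0, p_logical: int = 16) -> None:
    """Restrict a process (0 = current process) to the assumed P-core ids.

    Linux-only: os.sched_setaffinity does not exist on Windows, where
    psutil.Process.cpu_affinity() is the usual equivalent.
    """
    total = len(os.sched_getaffinity(0))
    os.sched_setaffinity(pid, p_core_set(total, p_logical))
```

    On Windows the same experiment is a one-liner in Task Manager (Details > right-click > Set affinity), which is how most of the "E-cores off" gaming benchmarks are done.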
     
  12. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    4,582
    Likes Received:
    2,804
    GPU:
    RTX 3090 Strix OC
    Yeah, except for the fact that they aren't... watch the vid.
     
  13. asturur

    asturur Maha Guru

    Messages:
    1,250
    Likes Received:
    443
    GPU:
    Geforce Gtx 1080TI
    In that whole video there is no benchmark with 8 performance cores.
    Of course 4 P-cores are too few today, and 6 are better, even without any additional E-cores.
     
  14. asturur

    asturur Maha Guru

    Messages:
    1,250
    Likes Received:
    443
    GPU:
    Geforce Gtx 1080TI
    And again, Windows is at fault; you can't blame the technology because software is still tied to the old assumption that all cores are the same.
    It's like taking a DOS game and complaining that your RTX doesn't work as advertised.
     
  15. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    4,582
    Likes Received:
    2,804
    GPU:
    RTX 3090 Strix OC
    Is it Windows' fault? Maybe, though I doubt it... but Windows is what we use, so Intel should adjust to that reality. And the reality is that E-cores are garbage for anything other than saving TDP. For gaming you absolutely are better off swapping 4 E-cores for a P-core...

    But it is sadly as @PrMinisterGR said - desktop is simply not the priority, so we get sh!tty scaled-up mobile chips...
     

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
    Literally the video is about how the scheduler in Windows 11 is killing performance because it refuses to delegate anything to the E-Cores, even when it has to. I don't think it makes the point you think it does.
     
    schmidtbag likes this.
  17. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    4,582
    Likes Received:
    2,804
    GPU:
    RTX 3090 Strix OC
    That is part of the conclusion - that Win11 is garbage (literally the reason you never wanna be an early Windows adopter) - but the result remains the same... E-waste cores are a non-benefit for gaming vs using the same amount of die space for additional P-cores instead.
     
  18. kapu

    kapu Ancient Guru

    Messages:
    5,303
    Likes Received:
    718
    GPU:
    Radeon 6800
    It might be, if they improve the cache the same way AMD is doing.

    But I don't think "gaming performance" is that important to Intel; just looking at their presentation at CES, you can clearly see they want the architecture/CPUs to be flexible and well suited to different devices.
    I could bet it will have a hard time beating the 3D version of Zen 3. That cache is just too good for gaming.
     
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,837
    Likes Received:
    3,207
    GPU:
    HIS R9 290
    It's an early technology in the x86 world, and MS is largely to blame for the scheduler issues; AMD faced the same sort of problems. The E-cores are very powerful for how small they are. That being said, they're not all that efficient, but so long as you aren't using any of the more complex instruction sets (which most software doesn't), these cores are a good way to cram in more performance at a lower cost.
    Games are likely going to be programmed to take advantage of these cores some day, just as they were programmed to take advantage of HT. The experience might not be great now but they're just growing pains.
     
  20. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    6,171
    Likes Received:
    3,384
    GPU:
    RTX 3060 Ti
    the other way round actually
     