
Intel to halt Extreme Edition branding for its processors

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 10, 2018 at 8:40 AM.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    32,635
    Likes Received:
    1,727
    GPU:
    AMD | NVIDIA
  2. alanm

    alanm Ancient Guru

    Messages:
    7,832
    Likes Received:
    400
    GPU:
    1070 AMP!
    Could be to forestall potential embarrassment when lesser chips outperform the Extremes at lower cost, something AMD has shown itself able to do lately.
     
    Embra and fantaskarsef like this.
  3. Belfaborac

    Belfaborac Member

    Messages:
    40
    Likes Received:
    9
    GPU:
    Nvidia GTX 980 Ti
    But, but.....how are us l33t dudes gonna distinguish ourselves from the herd now? Am I going to have to lead with the price tag when people ask me what CPU I'm running?
     
    CalculuS, Clawedge and Keitosha like this.
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    8,512
    Likes Received:
    795
    GPU:
    1080Ti @h2o
    Exactly my thinking; maybe they're not as extreme any more. :D
     
    alanm likes this.

  5. sverek

    sverek Ancient Guru

    Messages:
    3,908
    Likes Received:
    619
    GPU:
    NOVIDIA -0.5GB
  6. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,757
    Likes Received:
    519
    GPU:
    -NDA +AW@240Hz
    That's what happens when you bump clocks on lower-end chips with each generation instead of actually improving IPC in a way that makes sense.
    The difference between regular and XE chips is not exactly extreme.
     
    schmidtbag likes this.
  7. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    8,512
    Likes Received:
    795
    GPU:
    1080Ti @h2o
    Well, the Extreme CPUs had advantages beyond clocks back then, usually more cores, as well as triple/quad-channel RAM support, more PCIe lanes, etc., which most of the time doesn't make much sense for gaming purposes, but maybe for gaming/workstation hybrid builds. I usually bought the "enthusiast" platform because, honestly, I don't need an iGPU and I don't want one. So with Intel I was able to avoid that dead die space by picking CPUs which did not offer it.
     
  8. Embra

    Embra Master Guru

    Messages:
    584
    Likes Received:
    51
    GPU:
    Sapphire Nitro Fury
    Awww... but the "Extreme" sounded special.

    So that price premium is gone now?
     
  9. sverek

    sverek Ancient Guru

    Messages:
    3,908
    Likes Received:
    619
    GPU:
    NOVIDIA -0.5GB
  10. Fediuld

    Fediuld Member

    Messages:
    21
    Likes Received:
    8
    GPU:
    MSI 1080 Armor OC /8GB
    Why doesn't Intel also announce the death of X299?
    Everyone and his dog knows that they won't maintain two HEDT platforms when the new 28-core CPUs come out on a different socket than LGA 2066.
     

  11. Fediuld

    Fediuld Member

    Messages:
    21
    Likes Received:
    8
    GPU:
    MSI 1080 Armor OC /8GB
    If correct, the 8-core mainstream CFL is an i9 now, lol.
     
  12. sverek

    sverek Ancient Guru

    Messages:
    3,908
    Likes Received:
    619
    GPU:
    NOVIDIA -0.5GB
    It even has a huge "X" on the damn box. C'mon people, brighten up!

    [IMG]
     
    lucidus and schmidtbag like this.
  13. DeskStar

    DeskStar Master Guru

    Messages:
    349
    Likes Received:
    8
    GPU:
    4 eVGA GTX TITAN SC
    This guy must have just seen the writing on the wall... unless he's the one who put the writing there!?

    Maybe he's the one who has helped with the "hardware issues" as of late...? Hmmm, an "ex-employee of twenty-some years" dropping hints on things...

    Just kidding, but wouldn't that be crazy... Sabotage at its finest.
     
  14. ubercake

    ubercake Member Guru

    Messages:
    177
    Likes Received:
    35
    GPU:
    Asus GTX 1080 FE
    Right? It's no longer "Extreme" when the competition is doing it on the regular.
     
  15. D3M1G0D

    D3M1G0D Master Guru

    Messages:
    889
    Likes Received:
    338
    GPU:
    2 x GeForce 1080 Ti
    Makes sense. After all, there's little to differentiate a Core i9 7980XE Extreme Edition chip from a Core i9 7960X X-series chip. Or maybe they wanted to reserve the XE label for their new, supposedly 5 GHz, 28-core chip?
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    2,975
    Likes Received:
    360
    GPU:
    HIS R9 290
  17. JamesSneed

    JamesSneed Master Guru

    Messages:
    271
    Likes Received:
    67
    GPU:
    GTX 1070
    Makes total sense if you are going 8 cores on mainstream, non-Extreme parts, like Intel is said to be doing by September. Especially once they land on 10nm, they will have, say, 10 cores/20 threads on mainstream at some point. I think a more mature-sounding name/branding is in order, simply i9, since the only people needing more than 8 cores/16 threads are those doing real work or the top percent of enthusiasts. No point in trying to get average gamers into the XE chips these days.
     
  18. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    5,495
    Likes Received:
    152
    GPU:
    MSI GTX 1070
    Everything from Intel is Extreme nowadays: the leaks, the price...
     
  19. D3M1G0D

    D3M1G0D Master Guru

    Messages:
    889
    Likes Received:
    338
    GPU:
    2 x GeForce 1080 Ti
    XE chips were never for the average gamer. They were used to indicate the top chip for the HEDT or enthusiast class (e.g., 7980XE as the top Core i9). My guess is that they no longer want to give a chip this designation considering the rapid changes that are happening and the competition that they are facing in the HEDT market. AMD has severely disrupted Intel's business model, and they're going to need some time to adjust.
     
  20. FrostNixon

    FrostNixon Member Guru

    Messages:
    117
    Likes Received:
    6
    GPU:
    GT 555M 1GB
    Well, no one needs more than 4 cores anyway, so it doesn't really matter.
     
