Intel to Switch to 7nm Products in 2021

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 9, 2019.

  1. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
    If not for AMD, there'd have been no x86-64 from Core 2 onward, and we'd all be paying through the nose running 64-bit Itanium, assuming Intel was even offering a consumer 64-bit Itanium CPU at all. Yes, we could well be running 2 GHz 32-bit CPUs on our desktops, costing thousands of dollars, if it had been left up to Intel. Intel is the company that actually ran an advertising campaign entitled "You don't need 64 bits on the desktop" years ago, in response to AMD launching the first x86-64 CPU, the A64. Interestingly enough, it was through a cross-licensing deal with AMD that Intel got a license to x86-64.
     
    carnivore likes this.
  2. Warrax

    Warrax Member Guru

    Messages:
    142
    Likes Received:
    30
    GPU:
    GTX 1070Ti
    AMD needs to take advantage of those two years with Zen 2; it's their opportunity to take more market share while Intel is trying to come back with a better arch. Competition is good!
     
    HitokiriX likes this.
  3. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Excellent. We can expect 7nm desktop chips in 2025 after only the tenth delay ;)

    Then how come it failed? :p
     
  4. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Can you not come here with that attitude and bigoted comments?

    But look at the price difference between a 9900K and a 2700X. At that price, the damn thing should perform better. High end is not the problem, and never has been; considering the 9900K is priced like HEDT, it's not really fair to compare them anyway. High end is a niche market with no real value per dollar. The problem for Intel is that AMD has shown you can actually create mainstream processors at a lower cost while still maintaining value. And 1080p at higher refresh rates is one thing, that's fine, but even in the mainstream market you don't see the average consumer buying an i7 or an 8-core Ryzen. A rough perf-per-dollar sketch follows below.

    Threadripper also has its purposes; it doesn't need to be the best at everything, but for a workstation-grade CPU it's decent value for the cost.
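
    To put the value point in rough numbers, here is a minimal Python sketch. The prices are the launch MSRPs of the two chips; the relative performance score for the 2700X is an assumed illustrative placeholder, not a real benchmark result, so swap in numbers from whatever review you trust:

        # Rough perf-per-dollar sketch. Prices are launch MSRPs; the
        # performance scores are illustrative placeholders, not benchmarks.
        cpus = {
            # name: (price_usd, relative_perf)
            "i9-9900K": (488, 100),      # baseline
            "Ryzen 7 2700X": (329, 88),  # assumed ballpark mixed-workload score
        }

        for name, (price, perf) in cpus.items():
            print(f"{name}: {perf / price * 100:.1f} perf points per $100")

    With these assumptions the 2700X delivers noticeably more performance per dollar (roughly 27 vs 21 points per $100), which is the point being made above: the gap in price is larger than any gap in performance.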
     
    Last edited: May 9, 2019

  5. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Yes, because we all know that any benchmark of Zen/Zen+ compared to Core 2 processors somehow magically shows the Core 2 processors on top...

    Do you even think before you type?

    No, it's not. There are plenty of Intel and AMD processors that are faster than the 9900K.

    The only way someone can claim the 9900K is the "fastest and best CPU" is by ignoring cores altogether and only caring about single-threaded performance.

    But that's not reality; that's a narrow-minded thought process that doesn't hold up.

    Now, you can say that for what YOU individually want, it's the fastest processor for the tasks YOU want to do, but you can't claim it's the "fastest and best CPU".
     
    Last edited: May 9, 2019
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    So it's final: 10nm is dead.
     
  7. Dimitrios1983

    Dimitrios1983 Master Guru

    Messages:
    348
    Likes Received:
    114
    GPU:
    RX580
    You sure about that? To be honest, given AMD's small size and their near-bankruptcy, they are neck and neck with Intel, and I find it pretty embarrassing that Intel, with its size and cash, is struggling to compete with little AMD.

    Anyhow, I hope that with Intel grabbing many AMD workers they can finally make a decent GPU; after 30-40 years, it's just sad that they still don't have one. We need Intel to make a good GPU because that sector is getting stale.
     
    HWgeek and Aura89 like this.
  8. Mundosold

    Mundosold Master Guru

    Messages:
    243
    Likes Received:
    108
    GPU:
    RTX 3090 FE
    Good attempt, 8/10.
    OK now you are just trying too hard!
     
  9. Petr V

    Petr V Master Guru

    Messages:
    358
    Likes Received:
    116
    GPU:
    Gtx over 9000
    Really?
     
  10. Mundosold

    Mundosold Master Guru

    Messages:
    243
    Likes Received:
    108
    GPU:
    RTX 3090 FE
    I think if Intel's graphics cards are a failure, Intel and AMD are going to swap market positions.
    x86-64 is slowly dying; the Apple A12X is already biting at the heels of x86 desktop performance. If AMD is the clear leader of x86, where does that leave Intel if Xe fails too?

    If ARM gets fast enough to emulate x86 'fast enough' for legacy apps, x86 will be gone in 10 years.
     

  11. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    The Apple A12X isn't biting at the heels of x86 performance. This idea came from the many articles showing that the newest iPad, with an A12X processor, can almost match an Intel-powered MacBook Pro in Geekbench.

    What many of those articles leave out is any detail about the MacBook and the A12X.

    The A12X is able to match, sometimes beat, but mostly lose to the MacBook Pro. That in itself doesn't sound terribly bad... however, it's not that great when you see what they are comparing.

    They are comparing a quad-core Intel processor to an 8-core A12X. That means the A12X needs double the cores to get relatively close to Intel's 4-core CPU, which isn't as impressive as the articles make it out to be. A clock-normalized sketch follows below.

    I'll grant that the single-core performance is decent, but not all that good either. The A12X has 4 cores running at 2.5 GHz and 4 cores running at 1.6 GHz, with the Intel chip running at 2.7 GHz. That puts the A12X mostly behind, sometimes by as much as half the speed, despite only a 200 MHz difference on the big cores. But again, some cases put them pretty close.

    Overall, from a desktop standpoint, this would likely be a mixed bag: some things would be fast, but most things likely wouldn't. The only thing I can compare it to is Bulldozer, except in reverse: the single-core performance can be good but not always, and the multi-core performance is pretty universally not that great compared to modern desktop CPUs.
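
    To make the core-count caveat concrete, here is a minimal Python sketch of normalizing a multi-core score by core count and clock before calling two chips "close". The scores are hypothetical illustrative numbers, not real Geekbench results, and the A12X is simplified to its big-core clock:

        # Hypothetical, illustrative scores: not real Geekbench results.
        # The point: a multi-core score means little until you divide out
        # core count (and, roughly, clock speed).
        chips = {
            # name: (multicore_score, cores, big_core_ghz)
            "A12X (4x2.5 GHz + 4x1.6 GHz)": (18000, 8, 2.5),
            "Quad-core Intel @ 2.7 GHz": (18500, 4, 2.7),
        }

        for name, (score, cores, ghz) in chips.items():
            per_core = score / cores
            print(f"{name}: {per_core:.0f} per core, "
                  f"{per_core / ghz:.0f} per core per GHz")

    With these made-up numbers the Intel part does roughly twice the work per core, which is exactly the post's argument: matching an 8-core total against a 4-core total is not "biting at the heels" per core.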
     
    Last edited: May 11, 2019
  12. Venix

    Venix Ancient Guru

    Messages:
    3,472
    Likes Received:
    1,972
    GPU:
    Rtx 4070 super
    I can't speak to Xe personally; it's far from release and we have zero credible information, not even ballpark numbers for performance or power consumption. As for the A12X, an ARM CPU has a long, long way to go to catch up to 8-core desktop CPUs, and the jump from x86 to anything else is not as easy as it sounds, especially when everything to date is made for x86; that's why ARM needs a really good translator for the transitional period. Now, about Intel's death: how do people even remotely believe that? Intel is still deep in the green, so they are still a profitable company, and their processors are and will be competitive with AMD's. Even if Ryzen 3000 comes out and the difference is as big as it was with Bulldozer vs Sandy Bridge, AMD managed to survive that, so what makes you think Intel can't? Let's be realists here, people!
     
  13. MegaFalloutFan

    MegaFalloutFan Maha Guru

    Messages:
    1,048
    Likes Received:
    203
    GPU:
    RTX4090 24Gb
    If the 16-core Ryzen is a real CPU and not some slowpoke below 4.5 GHz, I'll be buying AMD for the first time since the 1060T, and I'm an Intel faithful who's been getting a new one every generation: 9900K > 8700K > 5820K > 2600K > don't remember, but it was some Intel.
    Actually, I'll be buying two Ryzen systems. One I wanted to buy now, but I decided to wait for the new CPUs and buy all new gen: a 16-core for my gaming and editing, hopefully overclockable to 5 GHz. I have a brand-new custom water kit that was supposed to go on my 2080 Ti and 9900K, but I'll keep it boxed; thankfully the CPU block can be converted to AMD.
    The second Ryzen will be one of their cheap new-gen 6-cores or even 8-cores, maybe with an iGPU, to be my 4K/HDR capture box and NAS in one. I already got a couple of 5-gigabit Ethernet cards; that's more than enough for HDD speeds, actually more than twice as fast as an HDD, so I could load games and programs from the NAS like from a local HDD, and it's only PCIe x1, so I'll always have a slot free no matter the board. A quick bandwidth check follows below.
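
    For anyone curious, a back-of-the-envelope check of that bandwidth claim, in Python, with assumed typical figures (a 5 Gb/s link and roughly 200 MB/s sequential HDD reads; your drives and protocol overhead may differ):

        # Back-of-the-envelope bandwidth check with assumed typical figures.
        link_gbps = 5.0                   # 5 GbE link rate
        link_mb_s = link_gbps * 1000 / 8  # ~625 MB/s, before protocol overhead
        hdd_mb_s = 200.0                  # assumed 7200 rpm sequential read

        print(f"5 GbE ~{link_mb_s:.0f} MB/s vs HDD ~{hdd_mb_s:.0f} MB/s, "
              f"about {link_mb_s / hdd_mb_s:.1f}x the disk")

    That works out to roughly 3x a single spinning disk with these assumed figures, so the link comfortably outruns the HDD, in the same spirit as the post's "twice as fast" claim.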
     
  14. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    Naah. They aren't releasing desktop chips on 10nm until 2020-2021, so it will have a long life in the server and desktop market before they move to 7nm.

    And of course you have to remember that 10nm was announced in 2014 and was supposed to ship in 2016, and there are still no production chips three years later. So if you think they will suddenly get the kinks out of 7nm and get actual full-production fabs up and running by 2022, when it's just in the planning stages now, I have a bridge to sell you. They will be milking 10nm for a long while yet, and TSMC is already spinning up 5nm for ARM chips and AMD.
     
  15. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    Add to all that the fact that while ARM chips are very speedy in a small environment, with apps optimized for a small memory footprint, they drop behind extremely fast when you throw any substantial compute load at them. There are ARM chips in datacenters, and they have basically one use: presenting a lot of individual instances of a small dataset. In other words, they are used pretty much exclusively as web servers that pull data from x86/POWER-based servers. They usually come with a dozen or more chips on a blade, with each chip acting as its own web server. That's really all they are good for, and in the end they aren't any more efficient than multi-core x86/POWER chips running under virtualization, so pretty much nobody uses them.

    Look no further than the Raspberry Pi if you want more examples. While a much simpler chip than Apple's A-series, its limitations are similar: how much RAM can be addressed cripples its use outside a niche set of work, and scaling up to more "big" cores doesn't work any better than it does on the Intel side. This will probably change as time goes on, but ARM in any form is still a decade or more from being more than a glorified netbook chip for light browsing and task work.
     
    HandR and Aura89 like this.
