
Intel to Switch to 7nm products in 2021

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 9, 2019.

  1. HWgeek

    HWgeek Master Guru

    Messages:
    417
    Likes Received:
    302
    GPU:
    Gigabyte 6200 Turbo Force @500/600 8x1p
    Look at Intel's stock today: it seems shareholders and investors also aren't buying these marketing slides. The stock price fell over 25% in the last 30 days!
     
  2. waltc3

    waltc3 Master Guru

    Messages:
    961
    Likes Received:
    278
    GPU:
    XFX 590 8GB XFire
    If not for AMD, there'd have been no x86-64 Core 2 onward, and we'd all be paying through the nose running 64-bit Itanium--assuming Intel was even offering a consumer 64-bit Itanium CPU at all. Yes, we could well be running 2GHz 32-bit CPUs on our desktops, costing thousands of dollars, if it had been left up to Intel. Intel is the company that actually ran an advertising campaign titled "You don't need 64 bits on the desktop" years ago, in response to AMD launching the first x86-64 CPU, the A64. Interestingly enough, it was through a cross-licensing deal with AMD that Intel got a license to x86-64.
     
    carnivore likes this.
  3. karma777police

    karma777police Active Member

    Messages:
    69
    Likes Received:
    34
    GPU:
    1080

    64-bit Itanium was a far better idea than x86-64.
     
  4. karma777police

    karma777police Active Member

    Messages:
    69
    Likes Received:
    34
    GPU:
    1080
    Speaking of the AMD 2700X vs. the 9900K: the 9900K is the fastest and best CPU. Is the AMD 2700X good? Yes it is, but there is a better CPU, and the price of the 9900K justifies it. People forget that the fastest Pentium II used to be $2,000 back in the day, and now you get the best and fastest CPU for under $500. Let's cut the bullshit. We will see what the Ryzen 3xxx series brings, but just to clarify some things: there won't be a 12/16-core Ryzen 3xxx on AM4 over dual-channel memory. People need to stop spreading that misinformation on the Internet. Even if it were possible, 16 cores over dual-channel memory would literally choke. That is why AMD released Threadripper, which comes with quad-channel memory, and it still sucks compared to its Intel counterpart.

    If you really want to see the difference between the 2700X and the 9900K: the 2700X bottlenecks a 2080 Ti at 1080p and 1440p, and the 9900K does not. That's why you pay extra $$$. If there were an even faster card than the 2080 Ti, that bottleneck would also show up at 4K.

    Now, if you compare the 2700X and the 9900K with an AMD RX 580 card, there would be no difference at 2K and 4K, though 1080p would still favor the 9900K.

    I am not trying to bash AMD or anything, but I am sick of hearing that Intel is done, they are screwed, AMD is the best, blah blah blah. You people sound like those Eastern Europeans who bash Intel just because they live in a shitty country and pay 20% tax on everything, so Intel CPUs will always be out of reach for them. That is not Intel's problem...
     
    Last edited: May 9, 2019

  5. Warrax

    Warrax Member Guru

    Messages:
    128
    Likes Received:
    17
    GPU:
    Gigabyte 970 G1
    AMD needs to take advantage of those two years with Zen 2; it's their opportunity to take more market share while Intel tries to come back with a better arch. Competition is good!
     
    HitokiriX likes this.
  6. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,730
    Likes Received:
    1,072
    GPU:
    2 x GeForce 1080 Ti
    Excellent. We can expect 7nm desktop chips in 2025 after only the tenth delay ;)

    Then how come it failed? :p
     
  7. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,445
    Likes Received:
    994
    GPU:
    RTX 2070FE
    Can you not come here with that attitude and those bigoted comments?

    But look at the price gap between a 9900K and a 2700X. At that price, the damn thing should perform better. High end is not the problem, and never has been; considering the 9900K is practically HEDT, it's not really fair to compare them anyway. High end is a niche market with no real value per dollar. The problem is that AMD has shown you can actually create mainstream processors at a lower cost while still maintaining value. And 1080p at higher refresh rates is one thing, that's fine. But even in the mainstream market, you don't see the average consumer buying an i7 or an 8-core Ryzen.

    Threadripper also has its purposes; it doesn't need to be the best at everything, but as a workstation-grade CPU it's decent value for the cost.
     
    Last edited: May 9, 2019
  8. Aura89

    Aura89 Ancient Guru

    Messages:
    7,477
    Likes Received:
    791
    GPU:
    -
    Yes, because we all know that any benchmark of Zen/Zen+ compared to Core 2 processors somehow magically shows the Core 2 processors on top....


    Do you even think before you type?

    No, it's not. There are plenty of Intel and AMD processors that are faster than the 9900K.

    The only way someone can claim the 9900k is the "fastest and best cpu" is by ignoring cores altogether, only caring about single-threaded performance.

    But that's not reality; that's a narrow-minded thought process that doesn't hold up in reality.

    Now, you can say that for the tasks YOU individually want to do it's the fastest processor, but you can't claim it's the "fastest and best CPU".
     
    Last edited: May 9, 2019
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    2,035
    Likes Received:
    491
    GPU:
    GTX 1080ti
    So it's final: 10nm is dead.
     
  10. Dimitrios1983

    Dimitrios1983 Master Guru

    Messages:
    207
    Likes Received:
    49
    GPU:
    AMD RX560 4GB
    You sure about that? To be honest, given AMD's small size and near-bankruptcy, they are neck and neck with Intel, and I find it pretty embarrassing that Intel, with all its size and cash, is struggling to compete with little AMD.

    Anyhow, I hope that with Intel grabbing many AMD workers they can finally make a decent GPU; after 30-40 years it's just sad not having one. We need Intel to make a good GPU because that sector is getting stale.
     
    HWgeek and Aura89 like this.

  11. Mundosold

    Mundosold Active Member

    Messages:
    85
    Likes Received:
    27
    GPU:
    eVGA GTX 560
    Good attempt, 8/10.
    OK now you are just trying too hard!
     
  12. angelgraves13

    angelgraves13 Maha Guru

    Messages:
    1,167
    Likes Received:
    244
    GPU:
    RTX 2080 Ti FE
    Intel's slowly dying. Oh wait...let's try to spin this into positive news...

    That's all I read.
     
    Last edited: May 11, 2019
  13. Petr V

    Petr V Member Guru

    Messages:
    197
    Likes Received:
    37
    GPU:
    Gtx over 9000
    Really?
     
  14. Mundosold

    Mundosold Active Member

    Messages:
    85
    Likes Received:
    27
    GPU:
    eVGA GTX 560
    I think if Intel's graphics cards are a failure, Intel and AMD are going to swap market positions.
    x86-64 is slowly dying; the Apple A12X is already biting at the heels of x86 desktop performance. If AMD is the clear leader of x86, where does that leave Intel if Xe fails too?

    If ARM gets fast enough to emulate x86 'fast enough' for legacy apps, x86 will be gone in 10 years.
     
  15. Aura89

    Aura89 Ancient Guru

    Messages:
    7,477
    Likes Received:
    791
    GPU:
    -
    The Apple A12X isn't biting at the heels of x86 performance. This idea came from the many articles showing that the newest iPad, with an A12X processor, can almost match an Intel-powered MacBook Pro in Geekbench.

    What many of those articles leave out is any information on the MacBook and the A12X.

    The A12X is able to match, sometimes beat, but mostly lose to the MacBook Pro, which in itself doesn't sound terribly bad... however, it's not that great when you see what they are comparing.

    They are comparing a quad-core Intel processor to an 8-core A12X processor. That means the A12X needs double the cores to get relatively close to Intel's 4-core CPU. This isn't as impressive as the articles make it out to be.

    I'll grant that the single-core performance is decent, but not especially good either. The A12X has 4 cores running at 2.5GHz and 4 cores running at 1.6GHz, with the Intel chip running at 2.7GHz. That puts the A12X mostly behind, sometimes as much as half the speed, with only a 200MHz clock difference. But again, some cases put them pretty close.

    Overall, from a desktop standpoint, this would likely be a mixed bag: some things would be fast, while most things likely wouldn't. The only thing I can compare it to is Bulldozer, except in reverse: the single-core performance can be good but not always, and the multi-core performance is pretty universally not that great compared to modern desktop CPUs.
     
    Last edited: May 11, 2019

  16. Venix

    Venix Master Guru

    Messages:
    863
    Likes Received:
    291
    GPU:
    Palit 1060 6gb
    I can't speak to Xe personally; it's far from release and we have zero credible information, even ballpark numbers, about performance or power consumption. As for the A12X, ARM CPUs have a long, long way to go to catch up to 8-core desktop CPUs, and the jump from x86 to anything else is not as easy as it sounds, especially when everything to date is made for x86; that's why ARM needs a really good translator for the transitional period. Now, about Intel's death: how do people even remotely believe that? Intel is still deep in the green, so they are still a profitable company, and their processors are and will remain competitive with AMD's. Even if Ryzen 3000 comes out and the difference is as big as it was with Bulldozer vs. Sandy Bridge: AMD managed to survive that, so what makes you think Intel can't? Let's be realists here, people!
     
  17. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    507
    Likes Received:
    54
    GPU:
    RTX 2080Ti 11Gb
    If the 16-core Ryzen is a real CPU and not some slowpoke below 4.5GHz, I'll be buying AMD for the first time since the 1060T, and I'm an Intel faithful who's been getting a new one every generation: 9900K > 8700K > 5820K > 2600K > don't remember, but it was some Intel.
    Actually, I'll be buying two Ryzen systems; I wanted to buy one now but decided to wait for the new CPUs and go all new gen. The first, for my gaming and editing, will be the 16-core, hopefully overclockable to 5GHz. I have a brand-new custom water kit that was supposed to go on my 2080 Ti and 9900K, but I'll keep it boxed; thankfully the CPU block can be converted to AMD.
    The second Ryzen will be one of their cheap new-gen 6- or even 8-cores, maybe with a GPU, to be my 4K/HDR capture box and NAS in one. I already got a couple of 5-gigabit Ethernet cards; that's more than enough for HDD speeds, actually about twice as fast as an HDD, so I could load games and programs from the NAS like from a local HDD, and it's only PCIe x1, so I'll always have a slot free no matter the board.
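    The bandwidth claim in the post above roughly checks out; here is a quick back-of-the-envelope sketch (the ~200 MB/s HDD throughput figure is an assumption for illustration, not from the post; with a faster ~300 MB/s drive the ratio lands closer to the 2x the poster mentions):

```python
# Sanity check: 5GbE line rate vs. a typical HDD's sequential throughput.
ETHERNET_GBPS = 5   # 5GBASE-T line rate, in gigabits per second
HDD_MB_S = 200      # assumed sequential HDD throughput, in megabytes per second

# Convert gigabits/s to megabytes/s: x1000 (giga -> mega), /8 (bits -> bytes).
ethernet_mb_s = ETHERNET_GBPS * 1000 / 8   # = 625 MB/s
ratio = ethernet_mb_s / HDD_MB_S

print(f"5GbE ~{ethernet_mb_s:.0f} MB/s, about {ratio:.1f}x a {HDD_MB_S} MB/s HDD")
```

    So a single 5GbE link does comfortably exceed what one hard drive can deliver, supporting the point that the NAS would not be network-bound for HDD-backed storage.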
     
  18. illrigger

    illrigger Member Guru

    Messages:
    122
    Likes Received:
    32
    GPU:
    GTX 1080 Ti
    Naah. They aren't releasing desktop chips on 10nm until 2020-2021, so it will have a long life in the server and desktop market before they move them to 7nm.

    And of course you have to remember that 10nm was announced in 2014 and supposed to ship in 2016 and there are still no production chips 3 years later, so if you think they will suddenly get the kinks out of 7nm and get actual full-production fabs up and running by 2022 when it's just in the planning stages now, I have a bridge to sell you. They will be milking 10nm for a long while yet, and TSMC is already spinning up 5nm for ARM chips and AMD.
     
  19. illrigger

    illrigger Member Guru

    Messages:
    122
    Likes Received:
    32
    GPU:
    GTX 1080 Ti
    Add to all that the fact that while ARM chips are very speedy in a small environment with apps optimized for a small memory footprint, they fall behind extremely fast when you throw any substantial compute load at them. There are ARM chips in datacenters, and they have basically one use: presenting a lot of individual instances of a small dataset; in other words, they are used pretty much exclusively as web servers that pull data from x86/Power-based servers. They usually come with a dozen or more chips on a blade, and each chip is its own web server. That's really all they are good for, and in the end they aren't any more efficient than multi-core x86/Power chips running under virtualization, so pretty much nobody uses them.

    Look no further than the Raspberry Pi if you want more examples. While a much simpler chip than the A-series chips from Apple, the limitations are similar - how much RAM can be addressed cripples their use outside of a niche set of work, and scaling up to more "big" cores doesn't work any better than it does on the Intel side. This will probably change as time goes on, but ARM in any form is still a decade or more from being more than a glorified Netbook chip for light browsing and task work.
     
    HandR and Aura89 like this.
