Apple M1 chip outperforms Intel Core i7-11700K in PassMark (single-thread)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 26, 2021.

  1. Strijker

    Strijker Active Member

    Messages:
    52
    Likes Received:
    12
    GPU:
    Gigabyte 3060Ti
    Slight note: single- (and multi-) core performance of the AMD 5xxx processors got slightly better after a BIOS update (where the cache runs much faster).
    But damn, that new M1 chip is impressive. Wish we had that chip for PCs (though that would most likely be RIP Intel & AMD). I wonder if, with kickass cooling & more power, you could get much higher still...
     
    Last edited: Mar 26, 2021
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,563
    Likes Received:
    2,909
    GPU:
    HIS R9 290
    A trend in what, specifically?
    These days, computer hardware is becoming less and less interesting. The days of overclocking are nearing an end (now it's basically just providing a cooling solution capable of maintaining boost clocks). All motherboards are pretty much the same thing with ever-so-slightly different variations, and even then, they're mostly just black with RGB heatsinks. I wouldn't be surprised if in a few years, all motherboards come with soldered-on CPUs.

    It wouldn't matter if the chip came to PCs - most of the performance enhancements are in Apple's software. The CPU itself is otherwise mediocre.
    ARM chips are also notorious for overclocking poorly.
     
  3. Strijker

    Strijker Active Member

    Messages:
    52
    Likes Received:
    12
    GPU:
    Gigabyte 3060Ti
    Really, just software? I'm not buying that part. C'mon, mediocre? It's amazing. I'd rather bash Apple whenever I can, but right now I just can't. MS could make Windows run on ARM CPUs; they already made a build for that recently, if I'm not mistaken. But yeah, all the other software would have to be adapted for it as well, though taking that road it's possible. It could take some time, but Apple is definitely more hardcore about big changes.

    Yeah, ARM chips are mostly in phones, without proper coolers, and Apple's main concern has historically been silence rather than heat: let it burn and throttle, but keep it quiet (if possible). But this thing is 15 watts.
     
    Last edited: Mar 26, 2021
  4. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,885
    Likes Received:
    1,135
    GPU:
    EVGA 1070Ti Black
    I hope this leads to desktop parts that draw 15 watts with that performance; the single-core numbers would be amazing if they could be applied to multi-core.

    Gonna hope in one hand and crap in the other and see which fills up first :rolleyes:
     

  5. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,563
    Likes Received:
    2,909
    GPU:
    HIS R9 290
    Yes, mostly software. I assume Apple basically just copied the instructions from Intel that they actually used, excluded the rest, and ended up with a chip that could do pretty much the same thing at 10% the wattage.
    Software is absolutely critical to how well something performs. I mentioned Clear Linux earlier as a real-world example of how much performance you can squeeze into a CPU if you actually care to optimize it. Apple's closed ecosystem makes it much easier for them to optimize things, because they don't have to retain compatibility with a wide array of platforms.

    Windows does run on ARM. I'm using the Lenovo C630 as I write this - an 8-core ARM-based laptop that comes with Windows 10. Windows gets the job done, but Linux has better software support, and macOS is better optimized.
    EDIT:
    Windows also has a compatibility layer to run x86 software, but it's basically just emulation and runs like crap. Mac's Rosetta 2 seems to go a lot further, where you're not going to get native-level performance but it isn't horribly slow either. Linux takes a similar route to Windows but the driver layer is less complex so it seems to take less of a performance hit.
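    To illustrate the difference between the two approaches in very rough terms, here's a toy sketch (this is NOT how Windows' x86 layer or Rosetta 2 actually work internally; the mini instruction set and function names are made up). An emulator pays a dispatch cost on every instruction, every time; a translator converts a block to native code once and then runs it at full speed:

    ```python
    # A tiny "foreign" program: each tuple is (opcode, operand).
    PROGRAM = [("push", 2), ("push", 3), ("add", None), ("push", 4), ("mul", None)]

    def interpret(program):
        """Emulation: dispatch on every instruction, on every run."""
        stack = []
        for op, arg in program:
            if op == "push":
                stack.append(arg)
            elif op == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "mul":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack.pop()

    def translate(program):
        """Translation: convert the whole block to native code once,
        then run it with no per-instruction dispatch overhead."""
        lines = ["def _translated():", "    stack = []"]
        for op, arg in program:
            if op == "push":
                lines.append(f"    stack.append({arg})")
            elif op == "add":
                lines.append("    b, a = stack.pop(), stack.pop()")
                lines.append("    stack.append(a + b)")
            elif op == "mul":
                lines.append("    b, a = stack.pop(), stack.pop()")
                lines.append("    stack.append(a * b)")
        lines.append("    return stack.pop()")
        namespace = {}
        exec("\n".join(lines), namespace)
        return namespace["_translated"]

    print(interpret(PROGRAM))   # (2 + 3) * 4 = 20
    native = translate(PROGRAM)
    print(native())             # same result, no dispatch loop
    ```

    Real translators (like Rosetta 2 reportedly does) go much further with caching and ahead-of-time translation, which is why they land closer to native speed than a pure interpreter does.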

    Again, the M1 isn't any ordinary ARM chip. Even if you port Windows 10 to the new M1 Macs (which in theory should be possible, Linux is already bootable on them), you're not going to get the same level of performance, because Windows isn't built to be optimized.
     
    Last edited: Mar 26, 2021
    HandR likes this.
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,000
    Likes Received:
    856
    GPU:
    Inno3D RTX 3090
    A beefed-up M1 Air is probably the best laptop I've used. Really responsive, pretty much zero lag on anything, ten-plus hours of battery life.
    It also feels much faster than the 2020 13" Air on clean Big Sur installations. They have definitely done things with its I/O, and as a package it's great.
    A colleague has the M1 Mini at home, and he has nothing but praise for it.

    The M1 is anything but mediocre. If anything, macOS is actually slower than Windows.
     
    moo100times and Undying like this.
  7. OldManRiver

    OldManRiver Member

    Messages:
    44
    Likes Received:
    23
    GPU:
    Nvidia
    Pretty much every modern processor is internally RISC with a front-end CISC decoder. Surprising to see so many believe they are still primarily CISC-based. That has not been true for a long time.
     
  8. Raserian

    Raserian Master Guru

    Messages:
    212
    Likes Received:
    124
    GPU:
    HD7750
    Care to explain more? I think x86 hasn't really changed all that much since its invention and remains a CISC system.
     
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    13,421
    Likes Received:
    5,320
    GPU:
    GTX 1080ti

    To the user, x86 architecture is *very* CISC. Inside a chip it can be a different story.

    A modern high-end Intel chip takes those CISC instructions and translates (decodes) them into an internal RISC-like instruction set and executes those in parallel, often "executing" (retire is Intel's term for when an instruction completes execution) as many as 4 at a time (that was already possible in 2015 on Haswell-class machines).

    Today's Intel and AMD x86 processors do not execute x86 directly: its instructions get decoded and translated into micro-instructions in the processor frontend, and the backend that executes those micro-instructions looks a lot more like a RISC processor.
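    A hypothetical sketch of that frontend decode step: one "CISC-style" instruction with a memory operand gets broken into several simple, RISC-like micro-ops. The opcode and micro-op names here are made up for illustration; real Intel/AMD micro-op encodings are internal and undocumented.

    ```python
    def decode(op, dst, src):
        """Translate one x86-style 'add' into a list of micro-op tuples."""
        assert op == "add"
        uops = []
        if src.startswith("["):                        # memory source operand
            uops.append(("uop_load", "tmp0", src.strip("[]")))
            src = "tmp0"
        if dst.startswith("["):                        # read-modify-write form
            addr = dst.strip("[]")
            uops.append(("uop_load", "tmp1", addr))    # read memory
            uops.append(("uop_add", "tmp1", "tmp1", src))  # modify in a temp
            uops.append(("uop_store", addr, "tmp1"))   # write back
        else:
            uops.append(("uop_add", dst, dst, src))    # plain register add
        return uops

    # 'add rax, [rbx]' splits into a load plus a register-register add:
    print(decode("add", "rax", "[rbx]"))
    # [('uop_load', 'tmp0', 'rbx'), ('uop_add', 'rax', 'rax', 'tmp0')]

    # The read-modify-write form 'add [rbx], rcx' needs three micro-ops:
    print(decode("add", "[rbx]", "rcx"))
    ```

    The backend then schedules and retires these simple micro-ops out of order, which is why it "looks a lot more like a RISC processor" even though the programmer-visible ISA stays CISC.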
     
    Last edited: Mar 27, 2021
    geogan and Richard Nutman like this.
  10. fredgml7

    fredgml7 Member Guru

    Messages:
    142
    Likes Received:
    42
    GPU:
    Sapphire RX 570 4GB
    Specifically, SoCs (which implies non-modular parts, unlike today's desktops) and non-x86 processors.
     
    Last edited: Mar 27, 2021

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,563
    Likes Received:
    2,909
    GPU:
    HIS R9 290
    You say that as though it changes anything discussed to any significant degree, because as Astyanax pointed out, the end result of many modern architectures is something that, to the user, is very much CISC.
    By your logic, that's like saying "seaweed isn't a plant, it's a protist, so it's not a vegetable". The things that separate seaweeds from being plants don't separate them from being vegetables.
    All that being said, MIPS is a true RISC architecture that is still fairly modern. Even today, the floating point instructions aren't integral to the architecture, and yet, MIPS can still run a full OS.

    Desktop and laptop PCs have become more and more of a SoC. For one big example, there is no longer a separate northbridge and southbridge. I wouldn't be surprised if within the next decade, there will no longer be a discrete motherboard chipset anymore. I think it'll still be a long while until RAM becomes fully integrated, like it is in many ARM platforms.
    As for non-x86 processors, I think that's very possible. Most tablets run on ARM and have obsoleted the need for desktops or laptops for most people. Apple is moving away from x86. Many tools we use have become cloud-based, where it doesn't matter what architecture you use. The Chinese government is trying to move to MIPS. For Linux users, there's a rising interest in POWER, ARM, and RISC-V. MS's 2nd attempt at ARM is actually usable, and I'm sure they're just trying to prepare for a future where ARM becomes a serious contender.
    So yeah, I think it's very possible x86 may lose quite a lot of popularity. It will basically just remain popular with those who want raw CPU processing power, and that's a dying trend thanks to things like OpenCL or CUDA.
     
    fredgml7 likes this.
  12. HARDRESET

    HARDRESET Master Guru

    Messages:
    844
    Likes Received:
    374
    GPU:
    1080Ti / 290X CFX
    Magic power of ASUS DARK HERO DOCS (attached: Capture54377.PNG)
     


  13. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,780
    Likes Received:
    391
    GPU:
    RTX3080ti Founders
    I've seen loads of videos about the M1, both previews and post-release. As a home recording enthusiast, it's very tempting, especially as more software is starting to support it (although there's still a while to go yet). I'm sure a year from now it will be very compelling. It runs emulated Windows apps pretty decently too. I'll definitely be watching its progress in the coming years.
     
  14. OldManRiver

    OldManRiver Member

    Messages:
    44
    Likes Received:
    23
    GPU:
    Nvidia
    You meant RISC.
    I said that because of ridiculous backward claims like that. I think it's funny you cite Sora, for once, clearly based on my post, got it more or less correct. You got it wrong. Look elsewhere to incite conflict. Thanks.
     
    Last edited: Mar 27, 2021
  15. rl66

    rl66 Ancient Guru

    Messages:
    3,395
    Likes Received:
    592
    GPU:
    Sapphire RX 6700 XT
    Yes, most benchmarks and programs.
    Everyone was sceptical about it (don't forget that the first version is a tablet CPU), but the more cores they put in the computer versions, the more I like it...
    The good news is that they will make even more powerful versions with more cores.
    The bad point is that it's Apple-exclusive... :(
     
    Undying and PrMinisterGR like this.

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,000
    Likes Received:
    856
    GPU:
    Inno3D RTX 3090
    All this talk about CISC and RISC is completely irrelevant. All modern CPUs just translate incoming command streams to whatever they use internally.

    It's really a pity it's an Apple exclusive. I can even see Apple going to RISC-V in 20 years, just to not be at the mercy of Nvidia.
     
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,563
    Likes Received:
    2,909
    GPU:
    HIS R9 290
    I meant what I said.
    Who the hell is Sora?
    How can I be "more or less correct" while also getting it wrong?
    You're the one getting pedantic about what is actually RISC. If you don't want to incite conflict, you're barking up the wrong tree.
     
  18. OldManRiver

    OldManRiver Member

    Messages:
    44
    Likes Received:
    23
    GPU:
    Nvidia
    You're still arguing with me because? "More or less correct" was not addressed to you. I am not arguing about "what is RISC" at all. Learn to read. You're still seeking conflict, duh.

    Well no, it is not actually irrelevant what the internal processor structure is. If it were, there would be no differences in uarch... 'cause it would be irrelevant.
    Man... poor-quality posting around here.
     
  19. Venix

    Venix Ancient Guru

    Messages:
    2,464
    Likes Received:
    1,207
    GPU:
    Palit 1060 6gb
    May I suggest you scream at 'em to get off your lawn?
     
    schmidtbag and moo100times like this.
  20. kapu

    kapu Ancient Guru

    Messages:
    5,293
    Likes Received:
    713
    GPU:
    Radeon 6800
    I'd say at least on Zen 2/3 there is a lot of headroom for OC/UV and RAM tweaking. Much more than there ever was on Intel's side. Was quite fun for me :) Zen 3 especially is the most tweakable CPU in recent history.
     

Share This Page