Apple M1 chip outperforms Intel Core i7-11700K in PassMark (single-thread)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 26, 2021.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,541
    Likes Received:
    18,853
    GPU:
    AMD | NVIDIA
  2. rl66

    rl66 Ancient Guru

    Messages:
    3,931
    Likes Received:
    840
    GPU:
    Sapphire RX 6700 XT
    And don't forget that the graphics part of the M1 is above the GTX 1050 at worst... too bad it's Apple-only.
     
    Keitosha and PrMinisterGR like this.
  3. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    It would also cost quite a bit more than any Intel or AMD processor, being as massive as it is.

    The M1 is a great chip, wide and big. And on 5nm.
     
  4. rl66

    rl66 Ancient Guru

    Messages:
    3,931
    Likes Received:
    840
    GPU:
    Sapphire RX 6700 XT
    On Apple's side, the equivalent Intel chip is more expensive and eats more energy than the M1.
     

  5. StevieSleep

    StevieSleep Member

    Messages:
    36
    Likes Received:
    6
    GPU:
    Nvidia GTX 1080
    It's great to see RISC-type architectures getting some real love.
    After I saw that "Breaking the x86 Instruction Set" video, I really understood why someone would want to move away from a CISC-style architecture.
     
  6. Andy Watson

    Andy Watson Master Guru

    Messages:
    304
    Likes Received:
    177
    GPU:
    960
    Why are they not running the Intel chips at their single-core max GHz? It should be over 5, not 3.xx.
    What is the Apple chip's maximum GHz?
    How much does the Apple chip cost compared to the Intel chip for someone buying it?

    Until those are known, this is like comparing oranges to, er, Apples...
     
  7. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,883
    GPU:
    XFX RX6800XT 16GB
    Is there anything else the Apple chip excels at besides PassMark?
     
    moo100times, Airbud and SamuelL421 like this.
  8. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,517
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
    I agree on your other points, but the price is not an issue because we are talking about an Apple product. It doesn't matter how much it costs (within reason); Apple's loyal customers will pay and think it's worth it. That's just how it goes.
     
    Undying likes this.
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,396
    GPU:
    Asrock 7700XT
    ARM hasn't been a true RISC architecture for several years now, and Apple has taken it even further away. Even POWER is hardly RISC at this point.

    Yes, and that's the weird thing: there are plenty of ways the M1 really impresses, and here we are looking at a synthetic benchmark, as though that suggests anything meaningful. I wouldn't say the M1 would win against an 11700K in most tests, but it would fall only slightly behind, which is impressive when you consider the difference in power draw.
     
  10. geogan

    geogan Maha Guru

    Messages:
    1,271
    Likes Received:
    472
    GPU:
    4080 Gaming OC
    Doesn't make sense - how can a chip perform the same amount of "work" at only 15W? If that's the case, why can't desktop CPUs run at 15W? Or do 10x the amount of work at 150W?
     

  11. zaku49

    zaku49 Guest

    Messages:
    2
    Likes Received:
    1
    GPU:
    1080ti
    It runs on a different architecture and instruction set than our normal Intel and AMD CPUs. For example, the new Apple chip is "ARM-based", which means it cannot run anything made using 32-bit/64-bit code; that would have to be done through emulation or by completely rewriting the program. Since Apple isn't like Microsoft, which supports a huge number of unique programs, they can rewrite all of their own stuff to run on ARM. ARM-based processors are basically the future for portable devices, as they run much cooler and consume less electricity.

    Intel is in big trouble here as they do not have anything to counter it; Chromebooks are also moving in this direction.
     
    Last edited: Mar 26, 2021
  12. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,396
    GPU:
    Asrock 7700XT
    Same reason why game consoles can use mediocre hardware yet yield much better performance than the PC ports of games (think more like the PS2 and GC era). Or consider how a few Arduinos are more powerful than the computer that brought the first astronauts to walk on the moon. Apple created hardware instructions tailor-made for their purposes. No transistor goes to waste, and they can minimize the number of operations it takes to accomplish the same task.
    There are certain things the M1 would do worse, simply because it has fewer transistors at lower clock speeds. It's not necessarily that the CPU itself is better, but rather that all the software taking advantage of it is so finely tuned. Meanwhile, x86-64 carries a lot of baggage for the sake of backward compatibility, and this alone makes it substantially less efficient.
    In case you're wondering, it is possible to micro-optimize x86-64 software too. Take Clear Linux (which is developed by Intel), for example: for years it has had results ranging from 5% faster to more than twice as fast on the exact same CPU. All they did was optimize libraries and programs to use newer instructions. Since AMD shares many of the same instructions, they too saw a performance increase, though it often wasn't as significant.
    As a result of Intel's efforts, more [open source] programs have baked in those optimizations. It really goes to show how much performance modern software leaves on the table, often because devs are too lazy.
    Doing this does increase peak wattage (because more transistors are being used per clock), but overall power consumption goes down because fewer clock cycles are needed to accomplish the same task.
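    To make that concrete, here's a toy sketch of the "same source, newer instructions" idea. The file name and compiler flags are illustrative examples, not Clear Linux's actual build recipe:

    /* dot.c - the same C source built two ways on the same CPU:
     *   baseline build:   gcc -O3 dot.c -o dot
     *   wider-ISA build:  gcc -O3 -march=x86-64-v3 dot.c -o dot_v3
     * The second build allows the compiler to use AVX2/FMA when it
     * vectorizes the loop, so the same work can take fewer clock cycles. */
    #include <stdio.h>
    #include <stdlib.h>

    #define N (1 << 24)

    int main(void) {
        float *a = malloc(N * sizeof *a);
        float *b = malloc(N * sizeof *b);
        if (!a || !b) return 1;

        for (long i = 0; i < N; i++) {
            a[i] = (float)(i % 7);
            b[i] = (float)(i % 13);
        }

        /* Plain scalar C, no intrinsics: whether this runs as scalar or
         * SIMD code is decided entirely by the compiler flags above. */
        double sum = 0.0;
        for (long i = 0; i < N; i++)
            sum += (double)a[i] * b[i];

        printf("dot = %f\n", sum);
        free(a);
        free(b);
        return 0;
    }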

    Not quite. ARM CPUs come in both 32-bit (armhf, armel, and anything older than armv8) and 64-bit (armv8) flavors, but they're not binary compatible with x86 or x86-64. You also don't have to rewrite anything; you just have to compile your code for whatever architecture you want to use. This is why open-source OSes like FreeBSD, Haiku, or Linux can be found on various architectures with common programs you'd recognize, like Firefox, VLC, or Blender. It's practically effortless to port software between architectures and OSes (which are also binary incompatible with each other). The hard part is micro-optimizing your software to complement the hardware, which is what Apple did.
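    As a rough sketch of what "just recompile it" means in practice (the cross-compiler name is an example; exact toolchain packages vary by distro):

    /* portable.c - nothing in this source cares which CPU it targets.
     *   native x86-64 build:    gcc -O2 portable.c -o portable_x86_64
     *   cross-build for ARM64:  aarch64-linux-gnu-gcc -O2 portable.c -o portable_arm64
     * Same source, two binaries for two mutually incompatible instruction sets. */
    #include <stdio.h>

    int main(void) {
        /* The compiler resolves target details like pointer size;
         * the programmer doesn't rewrite anything. */
        printf("pointer size on this build: %zu bytes\n", sizeof(void *));
        return 0;
    }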
    Also, Apple very much does have a huge software collection, and much of their software is arguably more complex than much of what MS makes. The software Macs come with is actually useful, whereas much of what Windows comes with is the bare minimum and should be replaced with something better.
    Apple also has to worry about retaining backward compatibility with Intel Macs, which is what Rosetta 2 is for. Personally, I'm amazed Apple managed to squeeze so much performance out of a compatibility layer like that. You're better off using an Intel CPU in such cases, but Rosetta 2 is a well-made transition tool.

    Where I definitely do agree is that Intel is in trouble. Their efforts for Linux are paying off, but there's not much they can do to improve Windows, which is most of the desktop market. AMD is not really in a better situation either, but their performance-per-watt is currently better than Intel's.
     
    Airbud likes this.
  13. SamuelL421

    SamuelL421 Master Guru

    Messages:
    271
    Likes Received:
    198
    GPU:
    RTX 4090 / RTX 5000
    For now? It excels at synthetic benchmarks, Apple's first-party macOS software, and getting unwarranted headlines about being "X"% faster.

    Don't get me wrong, I'm behind x86 alternatives all the way. But it's tiring to see so many headlines praising synthetic performance while so many applications still require Rosetta 2 and perform the same as (or slower than) comparable Intel Macs. The hardware is impressive, but the software isn't quite there yet.
     
    moo100times and Undying like this.
  14. Fergutor

    Fergutor Member Guru

    Messages:
    113
    Likes Received:
    19
    GPU:
    MSI 1660 Super GX
    Wait wait...
    Tell me if this isn't an extremely biased way to put things:

    "Apple's M1 processor has been sighted in PassMark benchmarking software, compared to the Intel Core i7-11700K the single-core performance crown seems to go to Apple...."

    But the first chip in the table above the M1 is another Intel of the same generation.
    And below the 11700K there are all five Ryzen 5000 chips!!!

    And then:

    "Here's the remarkable thing, the i7-11700K has a TDP of 125W, the M1 15W for."

    When the same applies to all those below and above!!

    Seriously, why the obvious bias!?

    You could have said "it is above most Intel and all AMD offerings". But that would be fair and would put AMD on display rather than blasting exclusively Intel... which is first on the table!

    Terrible.
     
  15. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    So what would be the point in saying that when the 11700k is a smidge higher than the AMD counterpart? That would basically just be restating the obvious.
     

  16. golfdk

    golfdk Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    MSI RTX 2700 SUPER
    Does anybody really believe that Apple engineered a new chip from the ground up... or did they just steal a design and then throw lawyers at it? Apple d.....
     
  17. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    They did buy a semiconductor company years ago, so.... Not sure what you're trying to get at.
     
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,396
    GPU:
    Asrock 7700XT
    Neither statement is true. Apple didn't engineer their own chip from the ground up, but they didn't steal anything either. For one thing, they license the architecture from ARM, but setting that aside, Apple added a lot of their own in-house instructions. That's where all the "magic" happens, because ARM is otherwise woefully mediocre in performance (though it is great in performance-per-watt). The M1 isn't any ordinary ARM chip.
     
    mentor07825 likes this.
  19. Fergutor

    Fergutor Member Guru

    Messages:
    113
    Likes Received:
    19
    GPU:
    MSI 1660 Super GX
    No. And I think that I made myself clear enough.
     
  20. fredgml7

    fredgml7 Master Guru

    Messages:
    246
    Likes Received:
    87
    GPU:
    Sapphire RX 7600
    I find the M1 very impressive, and I wonder if that's going to become a trend in the home computing market.
    I would gladly switch to RISC (or something more complex based on it) in the near future, but I despise soldered stuff (CPU, RAM, etc.) not allowing upgrades, especially on desktops but also on notebooks/laptops.
    Coincidentally, I started a discussion about Apple's M1 here :D: https://forums.guru3d.com/threads/let’s-talk-about-apple-m1.437313/#post-5899427
     
