
But here I am, with a pretty thin and very durable laptop that has a 6 core Xeon in it. It gets hot, it has huge fans, and it completely obliterates any M1 laptop. I don't mean it's twice as fast. I mean things run at 5x or faster vs an M1.

Probably not faster than an M1 Pro and definitely not faster than the M1 Max.

Your machine doesn't have a 512-bit wide memory interface running at over 400GB/s.
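The quoted figure checks out as back-of-the-envelope arithmetic, assuming LPDDR5 running at 6400 MT/s (the rate generally reported for these chips):

```python
# Memory bandwidth = bus width in bytes * transfers per second.
# Assumed: 512-bit bus, LPDDR5-6400 (6400 MT/s).
bus_width_bits = 512
transfers_per_sec = 6_400_000_000  # 6400 MT/s

bandwidth_gb_s = (bus_width_bits / 8) * transfers_per_sec / 1e9
print(bandwidth_gb_s)  # 409.6 GB/s, i.e. "over 400GB/s"
```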

Does the Xeon processor in your laptop have 192KB of instruction cache and 24MB of L2 cache?

Every ARM instruction is the same size, enabling many instructions to be in flight all at once, unlike the x86-64 architecture where instructions vary in size and you can't have nearly as many instructions in flight at once.
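A rough sketch of why fixed-width encodings help wide decode (illustrative only, not how real decoders are built): with 4-byte ARM64 instructions, every instruction boundary is known up front, so many can be decoded in parallel; with 1-to-15-byte x86-64 instructions, each boundary depends on decoding the one before it.

```python
# Fixed-width (ARM64): the i-th instruction starts at i * 4.
# All boundaries are independent, so decode can fan out in parallel.
def arm64_boundaries(n):
    return [i * 4 for i in range(n)]

# Variable-width (x86-64): each boundary depends on the previous
# instruction's length, so finding them is inherently sequential.
def x86_boundaries(lengths):
    offsets, pos = [], 0
    for length in lengths:
        offsets.append(pos)
        pos += length
    return offsets

print(arm64_boundaries(4))          # [0, 4, 8, 12]
print(x86_boundaries([1, 3, 2, 7]))  # [0, 1, 4, 6]
```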

Apples-to-apples: at the same clock frequency, an M1 has higher throughput than a Xeon and most any other x86 chip. This is basic RISC vs. CISC stuff that's been true forever. It's especially true now that increases in clock speed have dramatically slowed and the only way to get significantly more performance is by adding more cores.

On just raw performance, I'd take the 8 high-performance cores in an M1 Pro vs. the 6 cores in your Xeon any day of the week and twice on Sunday.

And of course, when it comes to performance per watt, there's no comparison and that's really the story here.

Now, this is a new version of the M1, but it's an incremental 1-year improvement.

If you read AnandTech [1] on this, you'll see this is not the case—there have been huge jumps in several areas.

Incremental would have resulted in the same memory bandwidth with faster cores. And 6 high-performance cores vs. the 4 in the original M1.

Except Apple didn't do that: they doubled the count to 8 high-performance cores and doubled the memory width, etc. The original M1 had 8 GPU cores, and now you can get up to 32!

Apple stated the Pro and the Max have 1.7x the CPU performance of Intel's 8-core Core i7-11800H with 70% lower power consumption. There's nothing incremental about that.
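Taking that claim at face value, the two numbers combine into a striking performance-per-watt ratio:

```python
# Apple's claim: 1.7x the performance at 70% lower power.
perf_ratio = 1.7
power_ratio = 1 - 0.70  # i.e. 30% of the Intel part's power draw

perf_per_watt_gain = perf_ratio / power_ratio
print(round(perf_per_watt_gain, 1))  # roughly 5.7x the perf per watt
```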

By ditching Intel, what Apple did was make sure their pro line, which is about power, not mobility, is no longer a competitor, and never will be.

Pro can mean different things to different people. For professional content creators, these new laptops are very much pro. Someone could take off from NYC and fly all the way to LA working on a 16-inch MacBook Pro, with a 120 Hz mini-LED screen of 7.7 million pixels that can display a billion colors, editing 4K or 8K video on battery alone.

If you were on the same flight working on the same content, you'd be out of power long before you crossed the Mississippi while the Mac guy is still working, on a machine half the weight of your laptop with a dramatically better display and better performance when it comes to video editing and rendering multiple streams of HDR video.

The 16-inch model has 21 hours of video playback which probably comes in handy in many use cases.

Here's a video of a person using the first generation, 8 GB RAM M1 Mac to edit 8K video; the new machines are much more capable: https://youtu.be/HxH3RabNWfE.

[1]: https://www.anandtech.com/show/17019/apple-announced-m1-pro-...



Sorry but this entire post reads (skims) like you're playing top trumps with chip specs.


[flagged]


>arm cpus are faster than server-grade processors

They are way more efficient as server cores. https://aws.amazon.com/ec2/graviton/


Only if you define efficiency as compute per watt. I don't give a flying crap about watts. Efficiency is measured as amount of work done per hour, because I get paid for the work, and then I pay the two dollars a week for the watts. I don't care if it's five dollars a week for the watts or two. I do care if it's two hours of waiting time versus five.
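The trade-off above can be made concrete with a quick sketch. The $2 vs. $5 per week and the two-vs-five hours of waiting come from the comment; the hourly rate is an assumed, hypothetical number:

```python
# Assumed: an engineer-hour is worth $100 (hypothetical rate).
# From the comment: $2/week vs. $5/week in electricity, and
# 5 hours of waiting cut down to 2 hours per week.
hourly_rate = 100.0
cheap_machine_power = 2.0
fast_machine_power = 5.0
hours_saved_per_week = 5.0 - 2.0

extra_power_cost = fast_machine_power - cheap_machine_power
value_of_time_saved = hours_saved_per_week * hourly_rate
print(value_of_time_saved - extra_power_cost)  # 297.0 net gain per week
```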


Compute per watt is compute per dollar. If you want more compute, spawn more cores. In this case, it will be cheaper with ARM.


lol no. The $20/month cost of electricity is a rounding error next to my $6k laptop and the $50k of software licenses for it. It's even less of a rounding error for the datacenter, where a $500k ESX farm with several million dollars of software on it uses $5k of electricity per month, including cooling.

Have you noticed almost no one uses ARM? There's a reason for that, including software being licensed per core, so fewer, faster, hotter cores win.



