Asahi is awesome!
But this also proves that laptops outside the MacBook realm still need to improve a lot. I wish there were a Linux machine with the hardware quality of a MacBook.
* x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance and power efficiency
* Qualcomm kinda fumbled the Snapdragon X Elite launch with nonexistent Linux support and shoddy Windows stability, but here's to hoping that they "turn over a new leaf" with the X2.
Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
On the build quality side, basically all the PCs are still lagging behind Apple, e.g. yesterday's rant post about the Framework laptop [2] touched on a lot of important points.
Of course, there are the Thinkpads, which are still built decently but are quite expensive. Some of the Chinese laptops like the Honor MagicBooks could be attractive and some reddit threads confirm getting Linux working on them, but they are hard to get in the US. That said, at least many non-Apple laptops have decent trackpads and really nice screens nowadays.
[1] https://www.phoronix.com/review/snapdragon-x-elite-linux-eoy...
[2] https://news.ycombinator.com/item?id=46375174
I have no faith in Qualcomm to even make basic gestures towards the Linux community.
All I want is an easy way to install Linux on one of the numerous Snapdragon laptops. I think the Snapdragon Thinkpad might work, but none of the others really do.
A $400 Arm laptop with good Linux support would be great, but it's never ever going to happen.
The fact is, Linux support has accelerated heavily, from both Qualcomm and, on their behalf, Linaro. Anyone who watches the Linux ARM mailing lists can attest to that.
The hardware has already been out for a year. Outside a custom spin by the Ubuntu folks, even last year's notebooks aren't well supported out of the box on Linux. I have a Yoga Slim 7x and I tried the Ubuntu spin at some point - it required me to first extract the firmware from the Windows partition because Qualcomm had not upstreamed it into linux-firmware. Hard to take Qualcomm seriously when the situation is like this.
Qualcomm _does_ upstream all their firmware, but vendors usually require a copy signed with their keys burned into the SoC, so you need the very same firmware as provided by the vendor, otherwise it won't load. This is an actual security feature, believe it or not. Besides, chances are it wasn't even Qualcomm's firmware, but rather Cirrus firmware for sound, or display firmware, or similar.
I get the hate on Qualcomm, but you're really one LLM question away from understanding why they do this. I should know, I was also getting frustrated before I read up on this.
Can you please let me know if there is an ISO to get any mainstream Linux distro working on this Snapdragon laptop?
ASUS - Vivobook 14 14" FHD+ Laptop - Copilot+ PC - Snapdragon X
It's on sale for $350 at Best Buy, and if I can get Linux working on it, it would definitely be an awesome gift for myself.
Even if there's some progress being made, it's still nearly impossible to install a typical Linux distro on one of these. I've been watching this space since the Snapdragon laptops were announced. Tuxedo giving up and canceling their Snapdragon Linux laptop doesn't instill much confidence.
This sounds a lot like the story of how AMD's approach on Linux had changed, and yet everyone I know who wants to use their GPU fully used Nvidia. For a decade or more I've heard how AMD has turned over a new leaf and their drivers are so much better. Even geohot was going to undercut Nvidia by just selling tinygrad boxes on AMD.
Then it turned out this was the usual. Nothing had changed. It was just that people online have this desire to express that “the underdog” is actually better. Not clear why because it’s never true.
AMD is still hot garbage on Linux. Geohot primarily sells “green boxes”. And the MI300x didn’t replace H100s en masse.
Google has previously delivered good Linux support on Arm Chromebooks and is expected to launch unified Android+ChromeOS on Qualcomm X2 Arm devices in 2026.
The closest laptop to MacBook quality is surprisingly the Microsoft Surface Laptop.
As to x86, Zen 6 will be AMD's first major architecture rework since Apple demonstrated what is possible with wide decode. (Well, more accurately it should be "since the world took notice", because it happened long before the M1.) It still likely won't match the M5 or even the M4 in single-threaded performance per watt, but hopefully it will be close.
> x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance
Nodding along with the rest but isn't this backwards? Are M series actually outperforming an Intel i9 P-core or Ryzen 9X in raw single-threaded performance?
Not in raw performance, no, but they're only beat out by i9s and the like, which are very power hungry. If you care even a little bit about performance per watt, the M series are far superior.
Have a look at Geekbench's results.[1] Ignore the top ones, since they're invalid and almost certainly cheated (click to check). The iPads and such lower down are all legit, but the same goes for some of the i9s in between.
And honestly, the fact that you have to go up to power hungry desktop processors to even find something to compete with the chip that goes in an (admittedly high-end) iPad is somewhat embarrassing on its face, and not for Apple.
[1] https://browser.geekbench.com/v6/cpu/singlecore
Yes, the M4 is still outperforming the desktop 9950X in single-threaded performance on several benchmarks like Geekbench and Cinebench 2024 [1]. Compared to the 9955HX, which is the same physical chip as the 9950X but lower clocked for mobile, the difference is slightly larger. But the 16 core 9950X is obviously much better than the base M4 (and even the 16 core M4 Max, which has only 12 P cores and 4 E cores) at multithreaded applications.
However, the M2 in the blog post is from 2022 and isn't quite as blazingly fast in single-threaded performance.
[1] https://nanoreview.net/en/cpu-compare/apple-m4-8-cores-vs-am...
> Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
Ohh, thanks for that link; I was thinking about updating to the latest on my Asusbook S15, but I think I'll stick with the current Ubuntu Concept for now... saved me some trouble!
My personal beef with Thinkpads is the screen. Most of the thinkpads I’ve encountered in my life (usually pretty expensive corporate ones) had shitty FHD screens. I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
FWIW if you buy new from Lenovo, getting a more high-res display has been an option for years.
I'm on the other side where I've been buying Thinkpads partly because of the display. Thinkpads have for a long time been one of the few laptop options on the market where you could get a decent matte non-glare display. I value that, battery life and performance above moar pixels. Sure I want just one step above FHD so I can remote 1080p VMs and view vids in less than fullscreen at native resolution but 4K on a 14" is absolute overkill.
I think most legit motivations for wanting very high-res screens (e.g. photo and video editing, publishing, graphics design) also come with wanting or needing better quality and colors etc too, which makes very-highly-scaled mid-range monitors a pretty niche market.
> I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
Did you make a serious effort while having an extended break from retina screens? I'd think you would get used to it pretty quickly if you allow yourself to readjust. Many people do multi-DPI setups without issues - a 720p and a 4k side-by-side for example. It just takes acclimatizing.
I have a 14” FHD panel (158 dpi) on an old (7-year) laptop and there are more issues with low-resolution icons and paddings than with font rendering. I wouldn’t mind more, but it’s not blurry.
I just learned on Reddit the other day that people replace those screens with third party panels, bought from AliExpress for peanuts. They use panelook.com to find a compatible one.
Old Thinkpads are great! I used to have a Lenovo Thinkpad X1 Carbon Gen 6 with Intel Core i7 8640U, 16 GB of RAM, and 1 TB SSD. I installed Arch Linux on it with Sway.
Looking at a Thinkpad 16" P1 Gen 8 with 2X 1TB SSD, 64GB RAM, QHD+ screen, centered keyboard like MBP (i.e. no numpad), integrated Intel GPU, lightweight (4 lbs) for a little under $2.5K USD.
Closest I've found to an MBP 16" replacement.
Have been running Dell Precision laptops for many years on Linux, not sure about Lenovo build quality and battery life, but hoping it will be decent enough.
Would run Asahi if it supported the M4, but it looks like that's a long way away...
I'm using T14s Gen 4 Intel and sleep works for me. I'm using it in clamshell mode connected to external display 99% of the time, so I don't really use sleep all the time, but the few times I tested it, it worked. Actually every hardware peripheral, including fingerprint sensor, worked out of the box. I was pleasantly surprised by that kind of support.
I've got a relatively new p16s with a hybrid Nvidia/Intel GPU, and a p14s gen 5 with an AMD GPU, and I was able to get both of them to suspend by closing the lid. Not sure if the issue you speak of is unique to the P1 or not, but all my ThinkPads have been decent with Linux.
I’ve had issues with the T14s for a couple of gens where the machine wakes up while the lid is closed and runs the battery down. I’ve tried the usual troubleshooting.
This has been a non issue on Dell machines for almost 20 years.
Oh some kernel params and other settings can help with that. These are mine, and it's been working great:
Kernel params
## Seems to be needed for suspend to S0 (s2idle) without hanging (only needed on p16s)
acpi_osi="Windows 2022"
# Prevent spurious wakeups from a firmware bug where the EC or SMU generates spurious "heartbeat" interrupts during sleep
acpi.ec_no_wakeup=1
# Prevents dock from waking up laptop right after suspend
usbcore.autosuspend=-1
Other settings (executed with a systemd service) (also only needed on p16s, not on my p14s)
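The actual service contents aren't quoted above, so purely as an illustration (not the parent's real config): a oneshot systemd unit is one common way to toggle a spurious wakeup source listed in /proc/acpi/wakeup at boot. The unit name and the XHC0 device below are hypothetical placeholders; check cat /proc/acpi/wakeup for the names your machine actually exposes.
sudo tee /etc/systemd/system/disable-wakeups.service <<'EOF'
[Unit]
Description=Disable spurious ACPI wakeup sources
[Service]
Type=oneshot
# Writing a device name to /proc/acpi/wakeup toggles its wakeup capability.
# XHC0 (a USB host controller) is only an example name.
ExecStart=/bin/sh -c 'echo XHC0 > /proc/acpi/wakeup'
[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable --now disable-wakeups.service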
I've moved completely to EliteBooks and am very happy with my decision. The build quality is superb, they're upgradeable, everything is replaceable and there's an excellent market and after market for parts, and HP has codepaths in their firmware for Linux support, meaning even Modern Standby works well.
Price points for refurb and used hardware are great, too.
I am giving my MacBook Air M2 15” to my wife and bought a Lenovo E16 with a 120 Hz screen to run Kubuntu last night. She needed a new laptop, and I have had enough of macOS and just need some stuff to work that will be easier on Intel and Linux. Also I do bookwork online, so a bigger screen and dedicated numpad will be nice.
It reviews well and seems like good value for money with current holiday sales but I don’t expect the same hardware quality or portability just a little more freedom. I hope I’m not too disappointed.
https://www.notebookcheck.net/Lenovo-ThinkPad-E16-G3-Review-...
If you're running desktop Linux, you will have a better experience with a rolling release than being stuck with whatever state the software was frozen in for Debian/Ubuntu, especially when it comes to multimedia, graphics, screen sharing, etc.
Modern desktop Linux relies on software that's being fixed and improving at a high velocity, and ironically, can be more stable than relying on a distro's fixed release cycles.
KDE Plasma, Wayland support, Pipewire, etc all have had recent fixes and improvements that you will not get to enjoy for another X months/years until Canonical pulls in those changes and freezes them for release.
Similarly, newer kernels are a must when using relatively recent hardware. Fixes and support for new hardware lands in new kernels, LTS releases might not have the best support for your newer hardware.
> can be more stable than relying on a distro's fixed release cycles
Stability for a distro means “doesn’t change” not “doesn’t crash”.
Debian/ubuntu are stable because they freeze versions so you can even create scripts to work around bugs and stuff and be sure that it will keep working throughout that entire release.
Arch Linux is not stable because you get updates every day or whatever. Maybe you had some script or patch to work around a bug and tomorrow it won’t work anymore.
This does not say _anything_ about crashing or bugs, except that if you find a bug/crash on a stable system then it is likely you can rely on this behaviour.
Agree. If you use a rolling release you definitely need a strategy for stability. I turn off automatic updates and schedule planned full updates that I can easily roll back from. I've had two breakages over the years that required snapper rollback. (Rolling back from a major distro upgrade isn't that easy)
It's a tradeoff that I'm happy with. I get to have a very up to date system.
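For anyone unfamiliar with that workflow: assuming a btrfs root with snapper configured (as openSUSE sets up by default), the rollback is roughly the following, where 42 is a placeholder for whichever snapshot was last known good.
# list snapshots and note the number of the last known-good one
sudo snapper list
# make that snapshot the new default subvolume, then reboot into it
sudo snapper rollback 42
sudo reboot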
That’s an interesting comment. I didn’t think about that. I’ve only ever used Ubuntu flavours, so I’ll search through what the popular rolling releases are out of interest.
Is this actually such a big point? I feel like (subjectively) on Ubuntu everything gets updated just as fast, and even if not, there's a new full release every 6 months. Or is this actually rather slow in comparison to Fedora?
I've also only used Debian-based stuff my whole life, and even moving from apt to dnf or whatever causes too much friction for me, haha, though it's not that bad obviously, if I really wanted to see the positives.
I outfitted our 10 person team with the E16 g2 and it’s been great.
Two minor issues: it’s HEAVY compared to T models.
Because of the weight, try not to walk around with the lid up while holding it by one of the front corners. I’ve noticed one of them is kind of warped from walking around the office holding it that way.
That’s great news thanks. I got the gen 3 so maybe some improvements. Weight is ok as I really just move it around the house. I buy used Panasonics for the workshop.
Been a kubuntu user since .. 2006? 2007? Don't remember when kubuntu became a thing, but as soon as I tried Ubuntu, I went kubuntu. I believe it was 5.10 or 6.04 or something. :-)
Am growing tired of Ubuntu though. Just not sure where I should turn. I want a .deb based system. Ubuntu is pushing snaps too heavily for my liking.
I was a very long time Debian user who got burned by Ubuntu and derivatives far too many times, personally and professionally. I moved to Fedora a few years back and it was a great decision. No regrets.
I liked Ubuntu and variants back when it first came out and I was newer to Linux, but it didn't take long for me to realise there always seemed to be a better option for me as a daily driver. To me it's like a new-Linux-user OS where a lot of stuff is chosen for you to use basically as is. Even the name Kubuntu, where the K is for KDE, but on other distros you would just choose your DE when you install.
I agree. It feels like a combination of peak Windows UI with the ease of Ubuntu baked in. Then the little mobile app they have that gives you a shared clipboard with iOS is cool.
> I wish there were a Linux machine with the hardware quality of a MacBook
It really depends what you mean by "quality". To me first and foremost quality I look for in a laptop is for it to not break. As I'm a heavy desktop user, my laptop is typically with me on the couch or on vacation. Enter my MacBook Air M1: after 13 months, and sadly no extended warranty, the screen broke for no reason overnight. I literally closed it before going to bed and when I opened the lid the next day: screen broken. Some refer to that phenomenon as the "bendgate".
And every time I see a Mac laptop I can't help but think "slick and good looking but brittle". There's a feeling of brittleness with Mac laptops that you don't have with, say, a Thinkpad.
My absolute best laptop is a MIL-SPEC (I know, I know, there are many different types of military specs) LG Gram. Lighter than a MacBook too. And every single time I demo it to people, I take the screen and bend it left and right. This thing is rock solid.
I happen to have this laptop (not my vid) and look at 34 seconds in the vid:
The guy literally throws my laptop (well, the same) down concrete stairs and the thing still just works fine.
The friend who sold it to me (I bought it used) one day stepped on it when he woke up. No problemo.
To me that is quality: something you can buy used and that is rock solid.
Where are the vids of someone throwing a MacBook Air down the stairs and the thing keeps working?
I'm trading a retina display any day for a display that doesn't break when it accidentally falls on the ground.
Now I love the look and the incredible speed of the MacBook Air laptops (I still have my M1, but after its screen broke, I turned it into a desktop) but I really wish they were not desk queens: we've got desktops for that.
I don't want a laptop that requires exceptional care and mad packaging skills when putting it inside a backpack (and which then requires the backpack to be handled with extreme care).
So: bring me the raw power and why not the nice look of a MacBook Air, but make it sturdy (really the most important for me) and have it support Linux. That I'd buy.
Notice how much the screen wobbles after opening the laptop, around the one-minute mark. That does not happen even with the cheapest MacBook Air; that’s the kind of design quality people refer to.
As for light and sturdy, the Netbook era had it all. A shame the world moved on from that.
I've owned two LG gram laptops. Neither were milspec, but both were really nice. Sure, the screen quality isn't going to win any awards, nor will the speakers, but the light weight, fantastic battery life and snappy performance always get a recommendation from me.
I adore my Linux setup and have switched back to it after using M1 Pro for 3 years.
But through all the Dells, Thinkpads and Asus laptops I've had (~10), none were remotely close to the full package that the MBP M1 Pro was.
- Performance - outstanding
- Fan noise - non-existent 99% of the time, cannot compare to any other laptop I had
- Battery - not as amazing as people claim for my usage, but still at least 30% better
- Screen, touchpad, speakers, chassis - all highest tier; some PC laptops do screen (Asus OLED), keyboard and chassis (Thinkpad) better, but nothing groundbreaking...
It's the only laptop I've ever had that gave me a feeling that there is nothing that could come my way, and I wouldn't be able to do on it, without any drama whatsoever.
It's just too bad that I can't run multiple external displays on Asahi...
(For posterity, currently using Asus Zenbook S16, Ryzen HX370, 32GB RAM, OLED screen, was $1700 - looks and feels amazing, screen is great, performance is solid - but I'm driving it hard, so fan noise is constant, battery lasts shorter, and it's just a bit more "drama" than with MBP)
Excellent power efficiency in apple silicon - good battery life and good performance at the same time. The aluminum body is also very rigid and premium feeling, unlike so many creaky bendy pc laptops. Good screen, good speakers.
Aluminum and magnesium non-Apple laptops are just as stiff. There's just a wider spectrum of options, including $200 plastic ARM Chromebooks available.
I’ve never heard someone describe the aluminum body as bad.. what do you not like about it?
The number one benefit is the Apple Silicon processors, which are incredibly efficient.
Then it’s the trackpad, keyboard and overall build quality for me. Windows laptops often just feel cheap by comparison.
Or they’ll have perplexing design problems, like whatever is going on with Dell laptops these days with the capacitive function row and borderless trackpad.
The keyboard and body are not bad at all - rather, they're best in class, and so is the rest of the hardware. It is a premium hardware experience, and has been since Jony Ive left, which is what makes the software so disappointing.
I believe there are a few all-metal laptops competing in the marketplace, but I was unaware they were actually better than the Apple laptops... which all-aluminum laptops are better, and how are they better?
I just turn off trackpads; I'm not interested in that kind of input device, and any space dedicated to one is wasted on me. I use nibs exclusively (which essentially restricts me to Thinkpads).
My arms rest on the body, the last thing I want is for it to be a material that leeches heat out of my body or that is likely to react with my hands' sweat and oils.
Strawman. Because Apple designed it well. Metal’s not an issue. My legacy 2013 MacBook Air still looks and feels and opens like new.
I was looking at Thinkpad Auras today. There are unaligned jutting design edges all over the thing. From a design perspective, I’ll take the smooth oblong squashed egg.
Every PC laptop I’ve touched feels terrible to hold and carry. And they run Windows, and Linux only okay.
Apple MacBooks are a long mile better than everything else, and so I don’t care about upgraded memory: buy enough RAM at purchase time and you don’t have to think about it again.
Memory upgrades aren’t priced super well, granted, but I could never buy HP, Dell, or Lenovo ever again. They’re terrible. I’ve had all of them. Ironically the best device I’ve had from the other side was a Surface Laptop. But I don’t do Microsoft anymore. And I don’t want to carry squeaky, squishy, bendy plastic.
Most of all, I’m never getting on a customer support call with the outsourced vendors that do the support for those companies ever ever ever again. I’ll take a visit to an Apple store every day of the week.
If the Macbook has a bad keyboard (ignoring the Butterfly switches, which aren't on any of the M series machines, which are the ones people actually recommend and praise), then the vast majority of Windows machines have truly atrocious keyboards. I prefer the keyboard on my 2012 Macbook to the newer ones, but it's still better than the Windows machines I can test in local stores.
I prefer the aluminium to the plastic found on most Windows machines. The Framework is made from some aluminium alloy from what I know, and I see that as a good thing.
The soldered RAM sucks, but it's a trade-off I'm willing to make for a touchpad that actually works, a pretty good screen, and battery life that doesn't suck.
> "I never understood why people claim the Macbook is so good."
Apple's good enough for the average consumer, just like a 16-bit home computer back in the day. Everyone who looks for something bespoke/specialized (e. g. certified dual- or multi-OS support, ECC-RAM, right-to-repair, top-class flicker-free displays, size, etc.) looks elsewhere, of course.
Hey! I actually wrote a thing to make the Swaybar a little more "complete" (e.g. battery status, currently selected program, clock, inspirational quote from ChatGPT, etc): https://git.sr.ht/~tombert/swaybar3
Not going to claim it will change the world or anything, but this runs perpetually with Sway and according to System Monitor it hovers at a little less than a megabyte of RAM. You can set how often you want things to update, and add as many sections as you'd like, and it's easy to create extra modules if you are so inclined (though not as easy as the Clojure version since I haven't found an implementation of multimethods for Rust that I like as much).
Is there a difference from 2024? Is the M2 still a good choice for Linux? I don’t mind older generations, I’m used to be a bit behind in terms of hardware as a tradeoff for good Linux support.
I used to enjoy the X line of ThinkPads but nowadays I don’t see a point going for them anymore, as the things I appreciated about them are slowly being phased out.
There is no support really for Linux on the M3+, nor should anyone expect the situation to change now that the main devs have moved on.
If you would be happy with a M1/M2 laptop knowing full well that it is a dead end and you will never have another Mac laptop with Linux support (the default assumption at this point), then yes it is a great machine.
How confident are you in this statement? I have no particular knowledge of Asahi. But I do know this narrative emerged about Rust-for-Linux after a couple of high-profile individuals quit.
In that case it was plainly bogus but this was only obvious if you were somewhat adjacent to the relevant community. So now I'm curious if it could be the same thing.
(Hopefully by now it's clear to everyone that R4L is a healthy project, since the official announcement that Rust is no longer "experimental" in the kernel tree).
I know Asahi is a much smaller project than R4L so it's naturally at higher risk of losing momentum.
I would really love Asahi to succeed. I recently bought a Framework and, while I am pretty happy with it in isolation... when I use my partner's M4 Macbook Air I just think... damn. The quality of this thing is head and shoulders above the rest of the field. And it doesn't even cost more than the competition. If you could run Linux on it, it would be completely insane to use anything else.
I've been an Asahi user since the early stages of the project when it used Arch. Today, I run Fedora Asahi Remix on a Mac Studio M1 Ultra with the Sway desktop, and it truly has been the perfect Linux workstation in every way.
>I am very impressed with how smooth and problem-free Asahi Linux is. It is incredibly responsive and feels even smoother than my Arch Linux desktop with a 16 core AMD Ryzen 7945HX and 64GB of RAM.
Hmmm, I still have an issue with the battery in sleep mode on the M1. It drains a lot of battery in sleep mode compared to the Mac's sleep mode.
Seems like a crazy hobby to me though! Photography is inconvenient enough without having to make your own mounts and use an sdk to do it! History is filled with inconvenient hobbies though.
I would agree with the sentiment about the lack of good bright screens for lenovo's hacker laptops like the X1 carbon.
Each controller and subcomponent on the motherboard needs a driver that correctly puts it into low power and sleep states to get battery savings.
Most of those components are proprietary and don't use the standard drivers available in Linux kernel.
So someone needs to go and reverse engineer them, upstream the drivers and pray that Apple doesn't change them in next revision (which they did) or the whole process needs to start again.
In other words: get an actually Linux supported laptop for Linux.
One of my favorite machines was the MacBook Air 11 (2012). This was a pure Intel machine, except for a mediocre Broadcom wireless card. With a few udev rules, I squeezed out the same battery performance from Linux I got from OS X, down to a few minutes of advantage in favor of Linux. And all this despite Safari being a marvel of energy efficiency.
The problem with Linux performance on laptops boils down to i) no energy tweaks by default and ii) poor device drivers due to the lack of manufacturer cooperation. If you pick a machine with well supported hardware and you are diligent with some udev rules, which are quite trivial to write thanks to powertop suggestions, performance can be very good.
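To make that concrete (a generic sketch, not tuned for any particular machine): two of the most common powertop suggestions, SATA link power management and PCI runtime power management, can be made persistent with a small udev rule like this. The filename is arbitrary.
sudo tee /etc/udev/rules.d/90-powersave.rules <<'EOF'
# Enable SATA link power management (a typical powertop "tunable")
ACTION=="add", SUBSYSTEM=="scsi_host", KERNEL=="host*", ATTR{link_power_management_policy}="med_power_with_dipm"
# Enable runtime power management for PCI devices
ACTION=="add", SUBSYSTEM=="pci", ATTR{power/control}="auto"
EOF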
I am getting a bit more than 10 hours from a cheap ThinkPad E14 Gen7, with a 64 Wh battery and light coding use. That's less than a MacBook Air, where I would be getting around 13-14 hours, but it's not bad at all. The difference comes mainly from the cheap screen, which consumes more power, and ARM's superior efficiency when idling.
But I prefer not to trade the convenience and openness of x86_64 plus NixOS for a bit more battery range. IMHO, the gap is not sufficiently wide to make a big difference in most usage scenarios.
The need to tweak that deeply just to get “baseline” performance really stings, though, particularly if you’re not already accustomed to having to do that kind of thing.
It’d be a gargantuan project, but there should probably be some kind of centralized, cross-distro repository for power configuration profiles that allows users to rate them with their hardware. Once a profile has been sufficiently user-verified and is well rated, distro installers could then automatically fetch and install the profile as a post-install step, making for a much more seamless and less fiddly experience for users.
> 40% battery for 4hrs of real work is better than pretty much any linux supported laptop I've ever used
Not sure what "real work" is for you, but I regularly get more than 12 hours of battery life on an old Chromebook running Linux and the usual IDEs/dev tooling (in a Crostini VM). All the drivers just work, and sleep has no detectable battery drain. It's not a workstation by any means, but dual-core Intels are great for Python/Go/TypeScript.
Out of curiosity, does Google contribute the drivers for Chromebook hardware to Linux upstream or do they keep it for themselves? Could it be that they just choose the hardware that works very well out of the box with Linux?
I have no idea if there's upstreaming, but the Chromium OS repo is open source so you could check.
I don't know if that would help the wider Linux laptop community, because Chromebook OEMs can only select from a small list of CPU & chipset hardware combinations blessed by Google.
What's the bar here? My Thinkpad X270 gets about 16 hours under Ubuntu with swaywm.
If we really want to get pedantic, its internal battery means the external pack is hot-swappable, so I can actually get several days on a "single charge." Good machine for camping trips.
Yet. Plenty of people have with Intel ones - I’m one of them. My first experience with Linux was on a 2016 MBpro. And inevitably people will do the same with the silicon Macs, likely using Asahi it seems.
It's not inevitable. That's not what that word means.
Intel Macs supported Linux because they used Intel's Linux drivers and supported bog-standard UEFI. There are no preexisting drivers or DeviceTree files published by Apple for Linux. There is no UEFI implementation, just a proprietary bootloader that can be updated post hoc to deny booting into third-party OSes.
> Why are some of y'all so hostile to this idea?
I would love for Linux to support as many ARM devices as possible. Unfortunately, it requires continuous effort from the OEM to be viable. I've bought Qualcomm, Rockchip and Broadcom boards before, none of them have been supported for half as long as my x86 machines are. Nevermind how fast ARM architectures become obsolete.
It feels like Apple is really the only hostile party here, and they coincidentally decide whether or not you get to use third-party OSes.
It is inevitable. I guarantee you there will be people who run Linux on their silicon Macs. I don’t know how you could possibly hold a stance that no one ever will.
Apple is very hostile to it. It won’t stop everyone though. It’ll continue to be niche but it’s happening.
It's not inevitable. It's fragile. Go boot up your old iPad; that should be well-studied, right? We ought to know how to boot into Linux on an ARM machine that old, it's only fair.
Except, you can't. The bootloader is the same iBoot process that your Apple Silicon machine uses, with mitigations to prevent unsigned OSes or persistent coldboot. All the Cydia exploits in the world won't put Linux back on the menu for iPhone or iPad users. And the same thing could happen to your Mac with an OTA update.
It is entirely possible for Apple to lock down the devices further. There's no guarantee they won't.
Apple cannot lockdown the Mac. You can’t have a development machine that is incapable of running arbitrary code. Back when they still did WWDC live they said that software development was the biggest professional bloc of Mac users. I’m certain that these days development is the biggest driver of the expensive Macs. No one has ever made a decent argument as to why Apple would lock down the Mac that would also explain why they haven’t done it yet.
Passivity isn’t hostility. There isn’t any evidence that Apple is considering locking down the Mac. They could have easily done that with the transition to their own silicon but they didn’t despite the endless conspiracy theories.
That's an admirable goal, but, depending on the hardware, it can run into that pesky thing called reality.
It's getting very tiresome to hear complaints about things that don't work on Linux, only to find that they're trying to run it on hardware that's poorly supported, and that's something they could have figured out by doing a little research beforehand.
Sometimes old hardware just isn't going to be well-supported by any OS. (Though, of course, with Linux, older hardware is more likely to be supported than bleeding-edge kit.)
This is very true. I've been asked by lots of people "how do I start with Linux" and, despite being a 99.9% Linux user for everything every day, my advice was always:
1. Use VirtualBox. Seriously, it won't look cool, but it will 100% work after maybe 5 mins mucking around with installing guest additions (rough steps sketched after this list). Also snapshots. Also no messing with WiFi drivers or graphics card drivers or such.
2. Get a used beaten down old Thinkpad that people on Reddit confirm to be working with Linux without any drivers. Then play there. If it breaks, reinstall.
3. If the above didn't yet make you lose interest, THEN dual boot.
Also, if you don't care about GUI, then use the best blessing Microsoft ever created - WSL, and look no further.
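For step 1, the guest additions part usually amounts to something like the following inside an Ubuntu guest, after picking "Insert Guest Additions CD image" from VirtualBox's Devices menu (mount point and package names may differ by distro):
# build tools and headers needed to compile the guest additions modules
sudo apt install -y build-essential dkms linux-headers-$(uname -r)
# mount the inserted Guest Additions ISO and run its installer, then reboot
sudo mount /dev/cdrom /mnt
sudo sh /mnt/VBoxLinuxAdditions.run
sudo reboot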
I've never gotten along too well with virtualization, but would second the ThinkPad idea, or something similar. Old/cheap machine for tinkering is a good way to ease in, and I think bare metal feels more friendly.
I'd probably recommend against dual booting, but I understand it's controversial. I like to equate it to having two computers, but having to fully power one off to do anything* on the other one. Torrents stop, music collection may be inaccessible depending on how you stored it, familiar programs may not be around anymore. I dual booted for a few years in the past and I found it miserable. People who expected me to reboot to play a game with them didn't seem to understand how big of an ask that really was. Eventually things boiled over and I took the Windows HDD out of that PC entirely. Much more peaceful. (Proton solves that particular issue these days also)
That being said, I've had at least two friends who had a dual boot due to my influence (pushing GNU/Linux) who ended up with some sort of broken Windows install later on and were happy to already have Ubuntu as an emergency backup to keep the machine usable.
*Too old might be a problem these days with major distros not having 32bit ISOs anymore
I've tried this once for IntelliJ to work around slow WSL access for Git repos. Was greeted by missing fonts and broken scaling on the intro screen. Oops. But probably I was just unlucky, it might work well for most.
It's a common use-case for x86 machines that implement UEFI. Taking the iPhone and iPad into account, it is a nonexistent use-case for mobile ARM chipset owners.
Apple does tons of optimizations for every component to improve battery life.
Asahi Linux, which is reverse engineered, doesn't have the resources to figure out each of those tricks, especially for undocumented proprietary hardware, so it's a "death by a thousand cuts" as each of the various components is always drawing a couple of milliwatts more than on macOS.
Eh it's pretty awful. I get 8 hours, yes, but in Linux, those 8 hours are ticking whether my laptop is sleeping in my bag or on my desk with the lid closed or I'm actively using it. 8 hours of active use is pretty good, but 8 hours in sleep is absolutely dreadful.
Exactly. This myth keeps being perpetuated, for some reason.
I'm typing this from a ThinkPad X1 Carbon Gen 13 running Void Linux, and UPower is reporting 99% battery with ~15h left. I do have TLP installed and running, which is supposed to help. Realistically, I won't get around 15h with my usage patterns, but I do get around 10-12 hours. It's a new laptop with a fresh battery, so that plays a big role as well.
This might not be as good as the battery life on a Macbook, but it's pretty acceptable to me. The upcoming Intel chips also promise to be more power efficient, which should help even more.
For optimal battery life you need to tweak the whole OS stack for the hardware. You need to make sure all the peripherals are set up right to go into the right idle states without causing user-visible latency on wake-up. (Note that often just one peripheral being out of tune here can mess up the whole system's power performance. Also the correct settings here depend on your software stack). You need to make sure that cpufreq and cpuidle governors work nicely with the particular foibles of your platform's CPUs. Ditto for the task scheduler. Then, ditto for a bunch of random userspace code (audio + rendering pipeline for example). The list goes on and on. This work gets done in Android and ChromeOS.
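To pick just one knob from that list as an example: the cpufreq governor is exposed through sysfs, so you can inspect and change it per core with nothing more than a shell (the available governor names depend on the driver in use).
# show the governor on the first core
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
# switch every core to the power-saving governor
echo powersave | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor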
This doesn't match my experience. My previous three laptops (two AMD Lenovo Thinkpads, one Intel Sony VAIO) had essentially the same battery life running Linux as running Windows.
I also have an X13 Gen2 AMD. My idle power consumption is 2.5W to 4W depending on brightness. This ends up at 12h-15h (machine/battery is 2 years old, I think).
Why? Lots of people more or less use their computer as a glorified web browser, with some Zoom calls and document editing thrown in for good measure. 256GB seems overkill. My girlfriend is somehow still rocking a 2011 MacBook Air. She mostly just uses it for internet banking and managing her finances. Why would she want more than 256GB?
A 1TB M.2 SSD cost $70 in summer 2025, and probably much less when bought in bulk as a chip. It doesn't make sense to install anything less than 1TB in an expensive premium laptop. Or it should be upgradeable.
Apple's pricing is one of the reasons I am not going to buy their laptops. Expensive, and with no upgradeable or replaceable parts. And closed-source OS with telemetry.
> Lots of people more or less use their computer as a glorified web browser
For this purpose they can buy a $350 laptop with a larger screen.
Because the price tag is quite high to get as much storage as you would 15 years ago for about the same money.
I agree that many people use them as glorified internet machines, but even then, when they occasionally decide to back up some photos or edit a few videos, the 256GB non-upgradable storage quickly becomes a limitation.
Price matters. 256GB is fine on a $500 web browsing laptop, but on a $1000+ one it's just a bad deal in 2025, even ignoring the fact that you cannot upgrade it later (it's soldered in place).
Possibly, but I don't see why those people would buy a new MacBook rather than a used $100 laptop (which would be better both for their finances and for the planet...)
Have you ever used Windows on a $100 second-hand laptop?
Imagine for a second that you don't know much about computers. You buy something crap like that and turn it on. Windows is of course already installed. Along with 18 antivirus programs and who knows what other junk. The computer will run dog slow. Even if you get rid of all the preinstalled programs, it'll run horribly slowly.
My mum has a computer from her work. It's pretty recent - worth way more than $100. It takes about 5-10 seconds for Zoom or Google Chrome to start. And about 15 seconds for Outlook to open. It's an utterly horrible experience.
If you can afford it, you'll have a way better experience on a macbook air from the last few years. In comparison, everything starts instantly. The experience is fantastic. Premium, even.
Personally I think it's criminal that cheap laptops run modern software so poorly. It's just laziness. There's no reason for the experience to be so horrible. But the world being what it is, there are plenty of reasons to spring for a $1000 MacBook Air over a $100 second-hand Windows crapbook if you can afford it. Even if you don't do much with the computer.
I don’t know, I kind of like 10 hrs on battery with normal usage and screen fully lit on a 15” screen while not being bulky. Virtually no contenders in that space.
I think they mean that in 2025, 256GB is unreasonably small. Which is true; Apple wants to up-charge hundreds of dollars just to get to the otherwise standard 1TB drive.
From a supply perspective, 256GB seems ridiculous because you can get way more capacity for not very much money, and because 256GB is now nowhere close to enough flash chips operating in parallel to reach what is now considered high performance.
But from a demand perspective, there are a lot of PC users for whom 256GB is plenty of capacity and performance. Most computers sold aren't gaming PCs or professional workstations; mainstream consumer storage requirements (aside from gaming) have been nearly stagnant for years due to the popularity of cloud computing and streaming video.
A new Wayland protocol is in the works that should support screen cutout information out of the box: https://phosh.mobi/posts/xdg-cutouts/ Hopefully this will be extended to include color information whenever applicable, so that "hiding" the screen cutout (by coloring the surrounding area deep black) can also be a standard feature and maybe even be active by default.
You can't be serious. Wayland is the opposite of modular, and the concept of an extensible protocol only creates fragmentation.
Every compositor needs to implement the giant core spec, or, realistically, rely on a shared library to implement it for them. Then every compositor can propose and implement arbitrary protocols of their own, which should also be supported by all client applications.
It's insanity. This thing is nearly two decades old, and I still have basic clipboard issues[1]. This esoteric cutouts feature has no chance of seeing stable real-world use for at least a decade.
Shh... you're not supposed to mention these things, lest you be downvoted to death.
I also have tremendous issues with Plasma. Things such as graphics glitching in the alt+tab task switcher or Firefox choking the whole system when opening a single 4k PNG image. This is pre-alpha software... So back to X11 it is. Try again in another decade or two.
YMMV and all, but my experience is that Wayland smoothness varies considerably depending on hardware. On modernish Intel and AMD iGPUs for example I’ve not had much trouble with Wayland whereas my tower with an Nvidia 3000 series card was considerably more troublesome with it.
Generally true, though this particular case is due to a single company deciding to not play ball and generally act in a manner that's hostile to the FOSS world for self-serving reasons (Nvidia).
I don't think it's even that. These bugs seem like bog-standard bugs related to correct sharing of graphics resources between processes and accessing them with correct mutual exclusion. Blaming NV is likely just a convenient excuse.
> my tower with an Nvidia 3000 series card was considerably more troublesome with it.
I think you're describing a driver error from before Nvidia really supported Wayland. My 3070 exhibited similar behavior but was fixed with the 555-series drivers.
The Vulkan drivers are still so/so in terms of performance, but the smoothness is now on-par with my Macbook and Intel GNOME machine.
The thing is that I'm not experiencing this clipboard issue on Plasma, but on a fresh installation of Void Linux with niri. There are reports of this issue all over[1][2][3], so it's clearly not an isolated problem. The frustrating thing is that I wouldn't even know which project to report it to. What a clusterfuck.
I can't go back to X11 since the community is deliberately killing it. And relying on a fork maintained by a single person is insane to me.
Far from it. The recent XLibre release[1] has a long list of bugfixes and new features.
Besides, isn't the main complaint from the Wayland folks that X11 is insecure and broken? That means there's still a lot of work to be done. They just refuse to do it.
To be fair, X11 has worked great for me for the past ~20 years, but there are obvious improvements that can be made.
Because one property doesn't guarantee the other. A modular system may imply that it can be extended. An extensible system is not necessarily modular.
Wayland, the protocol, may be extensible, but the implementations of it are monolithic. E.g. I can't use the xdg-shell implementation from KWin on Mutter, and so on. I'm stuck with whatever my compositor and applications support. This is the opposite of modularity.
So all this protocol extensibility creates in practice is fragmentation. When a compositor proposes a new protocol, it's only implemented by itself. Implementations by other compositors can take years, and implementations by client applications decades. This is why it's taken 18 years to get close to anything we can refer to as "stable".
You can see the same problem in the XMPP world, with a lot of the extensions implemented only by a few applications. But at least most XMPP extensions are designed to be backwards-compatible with clients that don't support them.
You know what OS doesn’t handle the notch? OSX. It happily throws the system tray icons right back there, with an obscure work around to bring them back. Software quality at Apple these days…
The idea that a group of people would spend so much of their time trying to get Linux to work on Apple hardware through reverse engineering always seemed absolutely crazy to me. I would never consider buying Apple hardware precisely because it doesn't support Linux, and the work they put in achieves nothing because the risk will always remain that they will lock the hardware down further. Never mind the fact that they will likely never fully reverse engineer all the components.
It just seems like a completely pointless endeavor... perhaps some people buy into it? Why would anyone buy overpriced hardware with partial support that may one day be gone? The enhanced battery life doesn't really hold much appeal to me, and the ARM architecture, if anything, is just another signal to stay away.
The only thing that makes sense to me is that they wanted the achievement on their resume, and in that, given recent developments, they succeeded?
You overlooked the UTM app on the App Store (also available as open source), which wraps Apple Silicon virtualization excellently, or you can use QEMU (which I don't).
I used to use Asahi, but the sleep modes power drain was tedious.
With UTM, I install the latest Fedora ISO (declaring it a "Linux", which exposes the option to skip QEMU and use native Apple Silicon virtualization).
It's fantastic. I mention this only because it's been super useful, way better than Asahi, with minimal effort.
Asahi is all reverse engineering. It’s nothing short of a miracle what has already been accomplished, despite, not because of, Apple.
That said some of the prominent developers have left the project. As long as Apple keeps hoarding their designs it’s going to be a struggle, even more so now.
If you care about FOSS operating systems or freedom over your own hardware there isn’t a reason to choose Apple.
To be clear, the work the asahi folks are doing is incredible. I’m ashamed to say sometimes their documentation is better than the internal stuff.
I’ve heard it’s mostly because there wasn’t an M3 Mac mini, which is a much easier target for CI since it isn’t a portable. Also, there have been a ton of hardware changes internally between M2 and M3. M4 is a similar leap. More coprocessors, more security features, etc.
For example, PPL was replaced by SPTM and all the exclave magic.
This is what ruffles my jimmies about this whole thing:
> I’m ashamed to say sometimes their documentation is better than the internal stuff.
The reverse engineering is a monumental effort, this Sisyphean task of trying to keep up with never-ending changes to the hardware. Meanwhile, the documentation is just sitting there in Cupertino. An enormous waste of time and effort from some of the most skilled people in the industry. Well, maybe not so much anymore since a bunch of them left.
I really hope this ends up biting Apple in the ass instead of protecting whatever market share they are guarding here.
I strongly support a project's stance that you shouldn't ask when it will be done. But the time between the M1 launch and a good experience was less than the time since the M3 launched. I would love to know what is involved.
That's an email from James Calligeros. All this patch says is that the author is Hector Martin (and Sven Peter). The code could have been written a long time ago.
The new project leadership team has prioritized upstreaming the existing work over reverse engineering on newer systems.
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
> Last time, we announced that the core SMC driver had finally been merged upstream after three long years. Following that success, we have started the process of merging the SMC’s subdevice drivers which integrate all of the SMC’s functionality into the various kernel subsystems. The hwmon driver has already been accepted for 6.19, meaning that the myriad voltage, current, temperature and power sensors controlled by the SMC will be readable using the standard hwmon interfaces. The SMC is also responsible for reading and setting the RTC, and the driver for this function has also been merged for 6.19! The only SMC subdevices left to merge is the driver for the power button and lid switch, which is still on the mailing list, and the battery/power supply management driver, which currently needs some tweaking to deal with changes in the SMC firmware in macOS 26.
> Also finally making it upstream are the changes required to support USB3 via the USB-C ports. This too has been a long process, with our approach needing to change significantly from what we had originally developed downstream.
Hard disagree: try the UTM app on the App Store (or build it from open source) and you get Apple Silicon native virtualization and super simple installation of AArch64 Linuxes from an ISO.
I've been doing this for maybe a year, after frustration with power draw and sleep modes (and dual boot) with Asahi.
It's been great... and Apple silicon is still super efficient, which is why I said hard disagree.
Given the speed of the progress that Apple has made on their hardware (from M1 to M5), I think the project was doomed from the very beginning. Reverse engineering per se is a huge talent drain that wastes a tremendous amount of man-hours on a closed problem. Also, the strong SW-HW integration of the Mac is sophisticated and fragile, which makes it difficult to analyze and replicate. Nailing all those details is not only time-consuming but also limited in scope, and never yields anything beyond the status quo.
I’m quite glad that those talented folks finally escaped from the pit of reverse engineering. It may be fun and interesting, but its future was already capped by Apple. I hope they find another passion, hopefully something more original and progressive. Stop chasing and push forward.
Very little progress made this year after high profile departures (Hector Martin, project lead, Asahi Lina and Alyssa Rosenzweig - GPU gurus). Alyssa's departure isn't reflected on Asahi's website yet, but it is in her blog. I believe she also left Valve, which I think was sponsoring some aspects of the Asahi project. So when people say "Asahi hasn't seen any setbacks" be sure to ask them who has stepped in to make up for these losses in both talent and sponsorship.
I have no insight into the Asahi project, but the LKML link goes to an email from James Calligeros containing code written by Hector Martin and Sven Peter. The code may have been written a long time ago.
Without official support, the Asahi team needs to RE a lot of stuff. I’d expect it to lag behind by a couple of generations at least.
I blame Apple for pushing out new models every year. I don’t get why they do that. An M1 is perfectly fine after a few years, but Apple treats it like an iPhone. I think one new model every 2-3 years is good enough.
The M1 is indeed quite adequate for most, but each generation has brought substantial boosts in single-threaded, multi-threaded, and, with the M5 generation in particular, GPU-bound performance. These advancements are required to keep pace with the industry and in a few aspects stay ahead of competitors, plus there exist high-end users whose workloads greatly benefit from these performance improvements.
I agree. But Apple doesn’t sell new M1 chip laptops anymore AFAIK. There are some refurbished ones but most likely I need to go into a random store to find one. I only saw M4 and M5 laptops online.
That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger. Sure it is probably better for Apple to move forward quickly though.
In the US, Walmart is still selling the M1 MacBook Air new, for $599 (and has been discounted to $549 or better at times, such as Black Friday).
In general, I don't think it's reasonable to worry that Apple's products aren't thoroughly achieving economies of scale. The less expensive consumer-oriented products are extremely popular, various components are shared across product lines (eg. the same chip being used in Macs and iPads) and across multiple generations (except for the SoC itself, obviously), and Apple rather famously has a well-run supply chain.
From a strategic perspective, it seems likely that Apple's long history of annual iteration on their processors in the iPhone and their now well-established pattern of updating the Mac chips less often but still frequently is part of how Apple's chips have been so successful. Annual(ish) chip updates with small incremental improvements compounds over the years. Compare Apple's past decade of chip progress against Intel's troubled past decade of infrequent technology updates (when you look past the incrementing of the branding), uneven improvements and some outright regressions in important performance metrics.
> That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger.
Why would this be true? An M5 MacBook Air today costs the same as an M1 MacBook Air cost in 2020 or whenever they released it, and is substantially more performant. Your dollar per performance is already better.
If they kept selling the same old stuff, then you spread production across multiple different nodes and the pricing would be inherently worse.
If you want the latest and greatest you can get it. If an M1 is fine you can get a great deal on one and they’re still great machines and supported by Apple.
The author mentions he paid $750 for a MacBook Air M2 with 16GB, while on Amazon an M4 Air with 16GB is usually $750-800. I get that the M4/M3 aren't supported to boot Asahi yet, but still.
I really wanted this to work, and it WAS remarkably good, but palm rejection on the (ginormous) Apple trackpad didn't work at all, rendering the whole thing unusable if you ever typed anything.
That was a month ago, this article is a year old. I'd love to be wrong, but I don't think this problem has been solved.
Yeah what is up with that? When I've tried to look into it I've just been met with statements that palm rejection should pretty much just work, but it absolutely doesn't and accidental inputs are so bad it's unusable without a disable/enable trackpad hotkey.
All Firefox users should switch to LibreWolf. In the short term it’s for telling Mozilla to go f**, in the long term it’s a browser fork with really good anti-fingerprinting.
Note that LibreWolf relies on Mozilla's tech infra for account synchronization and plugin distribution. If you are truly hostile to this organization, is there another browser you can recommend?
Things have definitely changed, a lot.
Then it turned out this was the usual. Nothing had changed. It was just that people online have this desire to express that “the underdog” is actually better. Not clear why because it’s never true.
AMD is still hot garbage on Linux. Geohot primarily sells “green boxes”. And the MI300x didn’t replace H100s en masse.
As to x86, Zen 6 will be AMD's first major architecture rework since Apple demonstrated what is possible with wide decode (well, more accurately, since the world took notice; wide decode happened long before the M1). It still likely won't match the M5, or even the M4, in single-threaded performance per watt, but hopefully it will be close.
Nodding along with the rest but isn't this backwards? Are M series actually outperforming an Intel i9 P-core or Ryzen 9X in raw single-threaded performance?
Have a look at Geekbench's results.[1] Ignore the top ones, since they're invalid and almost certainly cheated (click to check). The iPads and such lower down are all legit, but the same goes for some of the i9s in between.
And honestly, the fact that you have to go up to power hungry desktop processors to even find something to compete with the chip that goes in an (admittedly high-end) iPad, is somewhat embarrassing on its face, and not for Apple.
https://browser.geekbench.com/v6/cpu/singlecore
However, the M2 in the blog post is from 2022 and isn't quite as blazingly fast in single thread performance.
[1] https://nanoreview.net/en/cpu-compare/apple-m4-8-cores-vs-am...
Installed Arch, set up some commands to underclock the processor on login and easily boost it when I'm compiling.
Battery life is great but I'm not running a GUI either. Good machine for when I want to avoid distractions and just code.
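For anyone wanting to replicate the underclock-on-login / boost-for-compiles idea, a minimal sketch using cpupower (the frequencies are illustrative and depend on your CPU; not necessarily the exact commands used above):

    # cap the max clock at login (e.g. from ~/.bash_profile or a systemd user unit)
    sudo cpupower frequency-set --max 1.2GHz

    # lift the cap for a long compile, then put it back afterwards
    alias boost='sudo cpupower frequency-set --max 3.5GHz'
    alias unboost='sudo cpupower frequency-set --max 1.2GHz'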
I'm on the other side: I've been buying Thinkpads partly because of the display. Thinkpads have long been one of the few laptop options on the market where you could get a decent matte non-glare display. I value that, battery life, and performance above moar pixels. Sure, I want just one step above FHD so I can remote into 1080p VMs and watch videos at native resolution in less than fullscreen, but 4K on a 14" is absolute overkill.
I think most legit motivations for wanting very high-res screens (e.g. photo and video editing, publishing, graphics design) also come with wanting or needing better quality and colors etc too, which makes very-highly-scaled mid-range monitors a pretty niche market.
> I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
Did you make a serious effort while having an extended break from retina screens? I'd think you would get used to it pretty quickly if you allow yourself to readjust. Many people do multi-DPI setups without issues - a 720p and a 4k side-by-side for example. It just takes acclimatizing.
Closest I've found to an MBP 16" replacement.
Have been running Dell Precision laptops for many years on Linux, not sure about Lenovo build quality and battery life, but hoping it will be decent enough.
Would run Asahi if it supported the M4, but it looks like that's a long way away...
I've had issues with the T14s for a couple of generations where the machine wakes up while the lid is closed and runs the battery down. I've tried the usual troubleshooting.
This has been a non issue on Dell machines for almost 20 years.
Kernel params
Other settings (executed with a systemd service) (also only needed on the P16s, not on my P14s)

I've moved completely to EliteBooks and am very happy with my decision. The build quality is superb, they're upgradeable, everything is replaceable, there's an excellent market and aftermarket for parts, and HP has codepaths in their firmware for Linux support, meaning even Modern Standby works well.
Price points for refurb and used hardware are great, too.
[1] https://ubuntu.com/certified
Modern desktop Linux relies on software that's being fixed and improving at a high velocity, and ironically, can be more stable than relying on a distro's fixed release cycles.
KDE Plasma, Wayland support, Pipewire, etc all have had recent fixes and improvements that you will not get to enjoy for another X months/years until Canonical pulls in those changes and freezes them for release.
Similarly, newer kernels are a must when using relatively recent hardware. Fixes and support for new hardware lands in new kernels, LTS releases might not have the best support for your newer hardware.
Stability for a distro means “doesn’t change” not “doesn’t crash”.
Debian/Ubuntu are stable because they freeze versions, so you can even create scripts to work around bugs and be sure they will keep working throughout that entire release.
Arch Linux is not stable because you get updates every day or whatever. Maybe you had some script or patch to work around a bug and tomorrow it won’t work anymore.
This does not say _anything_ about crashing or bugs, except that if you find a bug/crash on a stable system then it is likely you can rely on this behaviour.
It's a tradeoff that I'm happy with. I get to have a very up to date system.
The problem with Ubuntu, as others mentioned, is that you get ancient versions of some packages. Fedora is nicely up to date.
I've also only used Debian-based stuff my whole life, and even moving from apt to dnf (or whatever it was) causes too much friction for me, haha. It's not that bad, obviously, if I could really see the positives.
Two minor issues: it's HEAVY compared to the T models.
Because of the weight, try not to walk around with the lid up while holding it by one of the front corners. I've noticed one of them is kind of warped from walking around the office holding it that way.
Are you running Windows?
Am growing tired of Ubuntu though. Just not sure where I should turn. I want a .deb based system. Ubuntu is pushing snaps too heavily for my liking.
It really depends what you mean by "quality". To me first and foremost quality I look for in a laptop is for it to not break. As I'm a heavy desktop user, my laptop is typically with me on the couch or on vacation. Enter my MacBook Air M1: after 13 months, and sadly no extended warranty, the screen broke for no reason overnight. I literally closed it before going to bed and when I opened the lid the next day: screen broken. Some refer to that phenomenon as the "bendgate".
And every time I see a Mac laptop I can't help but think "slick and good looking but brittle". There's a feeling of brittleness with Mac laptops that you don't have with, say, a Thinkpad.
My absolute best laptop is a MIL-SPEC (I know, I know, there are many different types of military specs) LG Gram. Lighter than a MacBook too. And every single time I demo it to people, I take the screen and bend it left and right. This thing is rock solid.
I happen to have this laptop (not my vid) and look at 34 seconds in the vid:
https://youtu.be/herYV5TJ_m8
The guy literally throws my laptop (well, the same model) down concrete stairs and the thing still works fine.
The friend who sold it to me (I bought it used) one day stepped on it when he woke up. No problemo.
To me that is quality: something you can buy used and that is rock solid.
Where are the vids of someone throwing a MacBook Air down the stairs and the thing keeps working?
I'm trading a retina display any day for a display that doesn't break when it accidentally falls on the ground.
Now I love the look and the incredible speed of the MacBook Air laptops (I still have my M1, but since its screen broke, I turned it into a desktop), but I really wish they were not desk queens: we've got desktops for that.
I don't want a laptop that requires exceptional care and mad packaging skills when putting it inside a backpack (and which then requires the backpack to be handled with extreme care).
So: bring me the raw power and why not the nice look of a MacBook Air, but make it sturdy (really the most important for me) and have it support Linux. That I'd buy.
As for light and sturdy, the Netbook era had it all. A shame the world moved on from that.
Bad keyboard, bad aluminium body, soldered ram...
Is it just the Apple Silicon that somehow makes it worth it? It's ARM, most software is still written and optimized for x86.
But through all the Dells, Thinkpads, and Asus laptops I've had (~10), none were remotely close to the full package that the MBP M1 Pro was.
- Performance - outstanding
- Fan noise - non-existent 99% of the time, cannot compare to any other laptop I had
- Battery - not as amazing as people claim for my usage, but still at least 30% better
- Screen, touchpad, speakers, chassis - all highest tier; some PC laptops do screen (Asus OLED), keyboard and chassis (Thinkpad) better, but nothing groundbreaking...
It's the only laptop I've ever had that gave me a feeling that there is nothing that could come my way, and I wouldn't be able to do on it, without any drama whatsoever.
It's just too bad that I can't run multiple external displays on Asahi...
(For posterity, currently using an Asus Zenbook S16, Ryzen HX370, 32GB RAM, OLED screen, was $1700 - looks and feels amazing, screen is great, performance is solid - but I'm driving it hard, so fan noise is constant, the battery doesn't last as long, and it's just a bit more "drama" than with the MBP)
A modern M4 should tho
I am very much a Linux person. But the battery life with macOS on the Apple Silicon is absolutely insane.
The number one benefit is the Apple Silicon processors, which are incredibly efficient.
Then it’s the trackpad, keyboard and overall build quality for me. Windows laptops often just feel cheap by comparison.
Or they’ll have perplexing design problems, like whatever is going on with Dell laptops these days with the capacitive function row and borderless trackpad.
Would you elaborate?
I believe there are a few all-metal laptops competing in the marketplace, but I was unaware they were actually better than the Apple laptops... which all-aluminum laptops are better, and how are they better?
It's a stylistic choice, not a logical one.
That alone is already very compelling for me (no noise, no fan to wear out). Then on top of that it has:
* Amazing battery life
* Great performance
* The best trackpad in the world
* Bright, crisp screen
The only downsides are the lack of upgradability and the annoying OS, but at least it's UNIX.
My arms rest on the body, the last thing I want is for it to be a material that leeches heat out of my body or that is likely to react with my hands' sweat and oils.
"...It's just a flesh wound..."
I was looking at Thinkpad Auras today. There are unaligned jutting design edges all over the thing. From a design perspective, I’ll take the smooth oblong squashed egg.
Every PC laptop I’ve touched feels terrible to hold and carry. And they run Windows, and Linux only okay. Apple MacBooks are a long mile better than everything else and so I don’t care about upgraded memory — buy enough ram at purchase time and you don’t have to think about it again.
Memory upgrades aren’t priced super well, granted, but I could never buy HP Dell Lenovo ever again. They’re terrible. I’ve had all of them. Ironically the best device I’ve had from the other side was a Surface Laptop. But I don’t do Microsoft anymore. And I don’t want to carry squeaky squishy bendy plastic.
Most of all, I’m never getting on a customer support call with the outsourced vendors that do the support for those companies ever ever ever again. I’ll take a visit to an Apple store every day of the week.
Not sure about anything else, have ONLY used those.
I prefer the aluminium to the plastic found on most Windows machines. The Framework is made from some aluminium alloy from what I know, and I see that as a good thing.
The soldered RAM sucks, but it's a trade-off I'm willing to make for a touchpad that actually works, a pretty good screen, and battery life that doesn't suck.
Apple's good enough for the average consumer, just like a 16-bit home computer back in the day. Everyone who looks for something bespoke/specialized (e. g. certified dual- or multi-OS support, ECC-RAM, right-to-repair, top-class flicker-free displays, size, etc.) looks elsewhere, of course.
Not going to claim it will change the world or anything, but this runs perpetually with Sway and according to System Monitor it hovers at a little less than a megabyte of RAM. You can set how often you want things to update, and add as many sections as you'd like, and it's easy to create extra modules if you are so inclined (though not as easy as the Clojure version since I haven't found an implementation of multimethods for Rust that I like as much).
I used to enjoy the X line of ThinkPads but nowadays I don’t see a point going for them anymore, as the things I appreciated about them are slowly being phased out.
If you would be happy with a M1/M2 laptop knowing full well that it is a dead end and you will never have another Mac laptop with Linux support (the default assumption at this point), then yes it is a great machine.
How confident are you in this statement? I have no particular knowledge of Asahi. But I do know this narrative emerged about Rust-for-Linux after a couple of high-profile individuals quit.
In that case it was plainly bogus but this was only obvious if you were somewhat adjacent to the relevant community. So now I'm curious if it could be the same thing.
(Hopefully by now it's clear to everyone that R4L is a healthy project, since the official announcement that Rust is no longer "experimental" in the kernel tree).
I know Asahi is a much smaller project than R4L so it's naturally at higher risk of losing momentum.
I would really love Asahi to succeed. I recently bought a Framework and, while I am pretty happy with it in isolation... when I use my partner's M4 Macbook Air I just think... damn. The quality of this thing is head and shoulders above the rest of the field. And it doesn't even cost more than the competition. If you could run Linux on it, it would be completely insane to use anything else.
https://github.com/jasoneckert/sway-dotfiles/blob/main/Asahi...
Hmmm, still have an issue with the battery in sleep mode on the M1. It drains a lot of battery in sleep mode compared to macOS sleep.
For those curious about the Alkeria line-scan camera, he wrote a blog about 3d printing a lens mount etc. https://daniel.lawrence.lu/blog/2024-08-31-customizing-my-li...
Seems like a crazy hobby to me though! Photography is inconvenient enough without having to make your own mounts and use an SDK to do it! History is filled with inconvenient hobbies, though.
I would agree with the sentiment about the lack of good bright screens for lenovo's hacker laptops like the X1 carbon.
Most of those components are proprietary and don't use the standard drivers available in Linux kernel.
So someone needs to go and reverse engineer them, upstream the drivers and pray that Apple doesn't change them in next revision (which they did) or the whole process needs to start again.
In other words: get an actually Linux supported laptop for Linux.
40% battery for 4 hrs of real work is better than pretty much any Linux-supported laptop I've ever used.
The problem with Linux performance on laptops boils down to i) no energy tweaks by default and ii) poor device drivers due to the lack of manufacturer cooperation. If you pick a machine with well supported hardware and you are diligent with some udev rules, which are quite trivial to write thanks to powertop suggestions, performance can be very good.
I am getting a bit more than 10 hours from a cheap ThinkPad E14 Gen7, with a 64 Wh battery and light coding use. That's less than a MacBook Air, where I would be getting around 13-14 hours, but it's not bad at all. The difference comes mainly from the cheap screen, which consumes more power, and from ARM's superior efficiency when idling.
But I prefer not to trade the convenience and openness of x86_64 plus NixOS for a bit more battery range. IMHO, the gap is not sufficiently wide to make a big difference in most usage scenarios.
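To make the udev part concrete, the rules powertop's suggestions translate into look roughly like this (illustrative; the exact attributes and values depend on your hardware, check powertop's Tunables tab):

    # /etc/udev/rules.d/50-powersave.rules (illustrative example)
    # runtime power management for PCI devices
    ACTION=="add", SUBSYSTEM=="pci", ATTR{power/control}="auto"
    # SATA link power management
    ACTION=="add", SUBSYSTEM=="scsi_host", KERNEL=="host*", ATTR{link_power_management_policy}="med_power_with_dipm"

One rule per tunable you want made permanent, and they apply automatically at boot.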
It’d be a gargantuan project, but there should probably be some kind of centralized, cross-distro repository for power configuration profiles that allows users to rate them with their hardware. Once a profile has been sufficiently user-verified and is well rated, distro installers could then automatically fetch and install the profile as a post-install step, making for a much more seamless and less fiddly experience for users.
It's generally the most optimized system, down to the fact that Apple controls everything about its platform.
If that's considered baseline, then nothing but full vertical integration can compete
I agree that in case of Linux, a udev rule generator would be a fantastic step ahead in terms of usability.
Not sure what "real work" is for you, but I regularly get more than 12 hours of battery life on an old Chromebook running Linux and the usual IDEs/dev tooling (in a Crostini VM). All the drivers just work, and sleep has no detectable battery drain. It's not a workstation by any means, but dual-core Intels are great for Python/Go/TypeScript.
I don't know if that would help the wider Linux laptop community, because Chromebook OEMs can only select from a small list of CPU & chipset hardware combinations blessed by Google
If we really want to get pedantic, its internal battery means the external pack is hot-swappable, so I can actually get several days on a "single charge." Good machine for camping trips.
I see how the GP comment could be provocative, but on this site we want responses that dampen provocation, not amplify it.
For a lot of people the point is to extend the life of their already-purchased hardware.
If your vendor is hostile like Apple, it will be hard to make it keep on working.
Why are some of y'all so hostile to this idea?
Intel Macs supported Linux because they used Intel's Linux drivers and supported bog-standard UEFI. There are no preexisting drivers or DeviceTree files published by Apple for Linux. There is no UEFI implementation, just a proprietary bootloader that can be updated post-hoc to deny booting into third-party OSes.
> Why are some of y'all so hostile to this idea?
I would love for Linux to support as many ARM devices as possible. Unfortunately, it requires continuous effort from the OEM to be viable. I've bought Qualcomm, Rockchip and Broadcom boards before, none of them have been supported for half as long as my x86 machines are. Nevermind how fast ARM architectures become obsolete.
It feels like Apple is really the only hostile party here, and they coincidentally decide whether or not you get to use third-party OSes.
Apple is very hostile to it. It won’t stop everyone though. It’ll continue to be niche but it’s happening.
Except, you can't. The bootloader is the same iBoot process that your Apple Silicon machine uses, with mitigations to prevent unsigned OSes or persistent coldboot. All the Cydia exploits in the world won't put Linux back on the menu for iPhone or iPad users. And the same thing could happen to your Mac with an OTA update.
It is entirely possible for Apple to lock down the devices further. There's no guarantee they won't.
Apple cannot lock down the Mac. You can't have a development machine that is incapable of running arbitrary code. Back when they still did WWDC live, they said that software development was the biggest professional bloc of Mac users. I'm certain that these days development is the biggest driver of the expensive Macs. No one has ever made a decent argument as to why Apple would lock down the Mac that would also explain why they haven't done it yet.
Passivity isn’t hostility. There isn’t any evidence that Apple is considering locking down the Mac. They could have easily done that with the transition to their own silicon but they didn’t despite the endless conspiracy theories.
It's getting very tiresome to hear complaints about things that don't work on Linux, only to find that they're trying to run it on hardware that's poorly supported, and that's something they could have figured out by doing a little research beforehand.
Sometimes old hardware just isn't going to be well-supported by any OS. (Though, of course, with Linux, older hardware is more likely to be supported than bleeding-edge kit.)
This is very true. I've been asked by lots of people "how do I start with Linux" and, despite being 99.9% Linux user for everything everyday, my advice was always:
1. Use VirtualBox. Seriously, it won't look cool, but it will 100% work after maybe 5 mins mucking around with installing guest additions. Also snapshots. Also no messing with WiFi drivers or graphics card drivers or such.
2. Get a used beaten down old Thinkpad that people on Reddit confirm to be working with Linux without any drivers. Then play there. If it breaks, reinstall.
3. If the above didn't make you yet disinterested, THEN dual boot.
Also, if you don't care about GUI, then use the best blessing Microsoft ever created - WSL, and look no further.
I'd probably recommend against dual booting, but I understand it's controversial. I like to equate it to having two computers, but having to fully power one off to do anything* on the other one. Torrents stop, music collection may be inaccessible depending on how you stored it, familiar programs may not be around anymore. I dual booted for a few years in the past and I found it miserable. People who expected me to reboot to play a game with them didn't seem to understand how big of an ask that really was. Eventually things boiled over and I took the Windows HDD out of that PC entirely. Much more peaceful. (Proton solves that particular issue these days also)
That being said, I've had at least two friends who had a dual boot due to my influence (pushing GNU/Linux) who ended up with some sort of broken Windows install later on and were happy to already have Ubuntu as an emergency backup to keep the machine usable.
*Too old might be a problem these days with major distros not having 32bit ISOs anymore
2. If your priority is system lifespan, you are already using OEM macOS.
2. By all means start with macOS, but eventually Apple will stop supporting your machine. And y'know what will still work and get updates then? Linux.
Which old hardware? You're circling around to the grandparent's point again; Linux support is hardware dependent.
> And y'know what will still work and get updates then?
No, I don't. Deprecated iPads lie dead in piles, and they don't run Linux for shit. You want me to believe the M4 will graduate to the big leagues?
In every thread about Linux, someone inevitably says "it gave new life to my [older computer model]." We've all seen it countless times.
This post is about the MacBook Air M2. The discussion has been about silicon MacBooks - laptops - from the start.
I'm typing this from a ThinkPad X1 Carbon Gen 13 running Void Linux, and UPower is reporting 99% battery with ~15h left. I do have TLP installed and running, which is supposed to help. Realistically, I won't get around 15h with my usage patterns, but I do get around 10-12 hours. It's a new laptop with a fresh battery, so that plays a big role as well.
This might not be as good as the battery life on a Macbook, but it's pretty acceptable to me. The upcoming Intel chips also promise to be more power efficient, which should help even more.
I think you can improve your power settings.
[1] https://wiki.archlinux.org/title/TLP
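TLP's defaults are already decent, but a small drop-in for battery operation usually helps; something along these lines (values are illustrative, compare tlp-stat output before and after):

    # /etc/tlp.d/01-battery.conf -- illustrative values
    CPU_SCALING_GOVERNOR_ON_BAT=powersave
    CPU_ENERGY_PERF_POLICY_ON_BAT=power
    PLATFORM_PROFILE_ON_BAT=low-power
    RUNTIME_PM_ON_BAT=auto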
Apple's pricing is one of the reasons I am not going to buy their laptops. Expensive, and with no upgradeable or replaceable parts. And closed-source OS with telemetry.
> Lots of people more or less use their computer as a glorified web browser
For this purpose they can buy $350 laptop with larger screen.
I agree that many people use them as glorified internet machines but even then when they occasionally decide to back up some photos or edit a few videos the 256GB non-upgradable storage quickly becomes a limitation.
Price matters. 256GB is fine on a $500 web browsing laptop, but on a $1000+ one it's just a bad deal in 2025, even ignoring the fact that you cannot upgrade it later (it's soldered in place).
Imagine for a second that you don't know much about computers. You buy something crap like that and turn it on. Windows is of course already installed. Along with 18 antivirus programs and who knows what other junk. The computer will run dog slow. Even if you get rid of all the preinstalled programs, it'll run horribly slowly.
My mum has a computer from her work. It's pretty recent - worth way more than $100. It takes about 5-10 seconds for Zoom or Google Chrome to start, and about 15 seconds for Outlook to open. It's an utterly horrible experience.
If you can afford it, you'll have a way better experience on a macbook air from the last few years. In comparison, everything starts instantly. The experience is fantastic. Premium, even.
Personally I think it's criminal that cheap laptops run modern software so poorly. It's just laziness. There's no reason for the experience to be so horrible. But the world being what it is, there are plenty of reasons to spring for a $1000 MacBook Air over a $100 second-hand Windows crapbook if you can afford it. Even if you don't do much with the computer.
Realistically, it is reasonable to expect 2TB drives, based on normal progression https://blocksandfiles.com/2024/05/13/coughlin-associates-hd...
But from a demand perspective, there are a lot of PC users for whom 256GB is plenty of capacity and performance. Most computers sold aren't gaming PCs or professional workstations; mainstream consumer storage requirements (aside from gaming) have been nearly stagnant for years due to the popularity of cloud computing and streaming video.
Every compositor needs to implement the giant core spec, or, realistically, rely on a shared library to implement it for them. Then every compositor can propose and implement arbitrary protocols of their own, which should also be supported by all client applications.
It's insanity. This thing is nearly two decades old, and I still have basic clipboard issues[1]. This esoteric cutouts feature has no chances of seeing stable real-world use in at least a decade from now.
[1]: https://bugs.kde.org/show_bug.cgi?id=466041
I also have tremendous issues with Plasma. Things such as graphics glitching in the alt+tab task switcher or Firefox choking the whole system when opening a single 4k PNG image. This is pre-alpha software... So back to X11 it is. Try again in another decade or two.
If my Ferrari has an issue with the brakes and I go to my dealer I don't care if the brakes were by Brembo.
Blaming the vendor and their drivers is just trying to shift the blame.
I think you're describing a driver error from before Nvidia really supported Wayland. My 3070 exhibited similar behavior but was fixed with the 555-series drivers.
The Vulkan drivers are still so/so in terms of performance, but the smoothness is now on-par with my Macbook and Intel GNOME machine.
I can't go back to X11 since the community is deliberately killing it. And relying on a fork maintained by a single person is insane to me.
Besides, isn't the main complaint from the Wayland folks that X11 is insecure and broken? That means there's still a lot of work to be done. They just refuse to do it.
To be fair, X11 has worked great for me for the past ~20 years, but there are obvious improvements that can be made.
Wayland, the protocol, may be extensible, but the implementations of it are monolithic. E.g. I can't use the xdg-shell implementation from KWin on Mutter, and so on. I'm stuck with whatever my compositor and applications support. This is the opposite of modularity.
So all this protocol extensibility creates in practice is fragmentation. When a compositor proposes a new protocol, it's only implemented by itself. Implementations by other compositors can take years, and implementations by client applications decades. This is why it's taken 18 years to get close to anything we can refer to as "stable".
Bloody Wayland.
It just seems like a completely pointless endeavor... perhaps some people buy into it? Why would anyone buy overpriced hardware with partial support that may one day be gone? The enhanced battery life doesn't really hold much appeal to me, and the ARM architecture, if anything, is just another signal to stay away.
The only thing that makes sense to me is that they wanted the achievement on their resume, and in that given recent developments they succeeded?
I used to use Asahi, but the sleep-mode power drain was tedious.
With UTM, I install the latest Fedora ISO (declaring it a "Linux", which exposes the option to skip QEMU and use native Apple Silicon virtualization).
It's fantastic. I mention this only because it's been super useful, way better than Asahi, with minimal effort.
That said some of the prominent developers have left the project. As long as Apple keeps hoarding their designs it’s going to be a struggle, even more so now.
If you care about FOSS operating systems or freedom over your own hardware there isn’t a reason to choose Apple.
I've heard it's mostly because there wasn't an M3 Mac mini, which is a much easier target for CI since it isn't a portable. Also, there have been a ton of hardware changes internally between M2 and M3. M4 is a similar leap. More coprocessors, more security features, etc.
For example, PPL was replaced by SPTM and all the exclave magic.
https://randomaugustine.medium.com/on-apple-exclaves-d683a2c...
As always, opinions are my own
> I’m ashamed to say sometimes their documentation is better than the internal stuff.
The reverse engineering is a monumental effort, this Sisyphean task of trying to keep up with never-ending changes to the hardware. Meanwhile, the documentation is just sitting there in Cupertino. An enormous waste of time and effort from some of the most skilled people in the industry. Well, maybe not so much anymore since a bunch of them left.
I really hope this ends up biting Apple in the ass instead of protecting whatever market share they are guarding here.
https://lore.kernel.org/asahi/20251215-macsmc-subdevs-v6-4-0...
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
https://asahilinux.org/2025/02/passing-the-torch/
For instance, in this month's progress report:
> Last time, we announced that the core SMC driver had finally been merged upstream after three long years. Following that success, we have started the process of merging the SMC’s subdevice drivers which integrate all of the SMC’s functionality into the various kernel subsystems. The hwmon driver has already been accepted for 6.19, meaning that the myriad voltage, current, temperature and power sensors controlled by the SMC will be readable using the standard hwmon interfaces. The SMC is also responsible for reading and setting the RTC, and the driver for this function has also been merged for 6.19! The only SMC subdevices left to merge is the driver for the power button and lid switch, which is still on the mailing list, and the battery/power supply management driver, which currently needs some tweaking to deal with changes in the SMC firmware in macOS 26.
> Also finally making it upstream are the changes required to support USB3 via the USB-C ports. This too has been a long process, with our approach needing to change significantly from what we had originally developed downstream.
https://asahilinux.org/2025/12/progress-report-6-18/
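("Standard hwmon interfaces" here just means the SMC sensors show up under /sys/class/hwmon like on any other machine, so the usual tools pick them up once you're on a kernel that includes the merged driver, e.g.:

    sensors                                   # from lm-sensors
    cat /sys/class/hwmon/hwmon*/temp*_input   # raw readings in millidegrees Celsius

No Asahi-specific tooling needed for that part.)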
Stop buying Apple laptops to run Linux.
I've been doing this for maybe a year, after frustration with power draw and sleep modes (and dual boot) with Asahi.
It's been great... and Apple Silicon is still super efficient, which is why I said hard disagree.
I'm quite glad that those talented folks finally escaped from the pit of reverse engineering. It may be fun and interesting, but its future was always capped by Apple. I hope they find another direction, hopefully something more original and progressive. Stop chasing and push forward.
https://rosenzweig.io/blog/asahi-gpu-part-n.html
https://lore.kernel.org/asahi/20251215-macsmc-subdevs-v6-4-0...
Asahi Lina, who did tons of work on the Asahi Linux GPU development, also quit, as she doesn't feel safe doing Linux GPU work anymore [1].
[0] https://marcan.st/2025/02/resigning-as-asahi-linux-project-l...
[1] https://asahilina.net/luna-abuse/
They are more common than you would think. There just aren't many willing to work on a shoestring salary.
You explained it well by yourself.
I blame Apple for pushing out new models every year. I don't get why it does that. An M1 is perfectly fine after a few years, but Apple treats it like an iPhone. I think one new model every 2-3 years is good enough.
I've got a few ideas