I switched all the machines at https://lanparty.house over to Linux a couple months ago. So far, we've experienced noticeably fewer problems on Linux compared to Windows. Stability and performance are better. I can't think of one game we tried that didn't work. And wow is it nice not to have all the ads and crapware in our faces anymore.
(I'm aware that the Battlefield series and League of Legends won't work due to draconian anti-cheat -- but nobody in my group cares to play those, I guess.)
On a similar note, performance is sometimes better. As a direct comparison, the SteamOS version of the Lenovo Legion Go S handheld is significantly more performant than the Windows version. Like 20% better FPS and double the battery life. Literally the only difference between the two is the OS.
Though from what I've read, Microsoft could fix that relatively quickly if they made some tweaks to Windows (and called it a special 'handheld gaming edition' or something).
For some reason, the Lenovo Legion Go S's Windows install still comes with a lot of baggage, background services, etc.
If LTT is to be believed, this is in the works.
Maybe SteamOS managed to ruffle enough feathers to start moving the inertial colossus that is Microsoft -- not that I have much trust in their willingness to let a good idea remain good in the long term.
As an aside...
I went down a mini-rabbit hole learning about the LAN Party House, read your website and about Sandstorm[0] and how that ended up with you at Cloudflare leading Workers. That’s a really cool and honestly inspirational path. Would love to learn more if you’ve written elsewhere…!
I was also impressed by his wife's Chez JJ work. I suspect that she has done much more impressive stuff, but that kind of thing is a dime a dozen in SV. The hacker housing stuff speaks to her humanity, and I like humans.
Yes, on Linux I was able to move the copy-on-write overlays to use local disks, which is one reason it performs much better (admittedly not a reason that would affect most people).
I am just using dm-snapshot for this -- block device level, no fancy filesystems.
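For anyone curious what that looks like in practice, here's a minimal sketch of building such an overlay with dmsetup driven from Python. The device paths are placeholders I made up, not the actual lanparty.house configuration, and it assumes a shared read-only base image plus a local scratch partition:

    # Sketch: put a copy-on-write overlay over a shared read-only base image,
    # with writes landing on a local disk (dm-snapshot, no fancy filesystems).
    import subprocess

    BASE = "/dev/mapper/game-base"  # shared read-only origin (placeholder path)
    COW = "/dev/nvme0n1p3"          # local partition used as the copy-on-write store (placeholder path)

    def sectors(dev: str) -> int:
        """Size of a block device in 512-byte sectors."""
        return int(subprocess.check_output(["blockdev", "--getsz", dev]).strip())

    def create_overlay(name: str = "game-overlay") -> None:
        # dm table format: <start> <length> snapshot <origin> <cow-device> <P|N> <chunk-size-in-sectors>
        # 'N' = non-persistent COW, which is fine if the overlay is rebuilt on every boot.
        table = f"0 {sectors(BASE)} snapshot {BASE} {COW} N 8"
        subprocess.run(["dmsetup", "create", name, "--table", table], check=True)

    create_overlay()  # the writable view then shows up as /dev/mapper/game-overlay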
Yep, I've been gaming exclusively on Ubuntu (mainly because I want my desktop to match my servers) for several years. If you aren't playing the latest AAA FPS, then everything pretty much works.
I also game on Ubuntu and snaps have never been in my way. I actually like them and wish more non-game software was distributed this way, but Canonical has a brown thumb when it comes to growing their weird little side projects.
Not that much. To be honest, I have a few installed (Heroic Games Launcher for one), but the main one I wanted to avoid was Firefox -- which is easily doable. It is annoying that we have yet another way of packaging apps -- it would have been better if they had just supported Flatpak.
Do you ever find it "updated" to the snap version? I have Ubuntu on my work laptop and every so often after an update Firefox will suddenly be the snap version and I'll have to reinstall it.
As someone else says, for Firefox (and Thunderbird) I just uninstalled the package manager version entirely and dropped Mozilla's regular distro-agnostic binary tarballs in my home folder. Using Mozilla's built-in update system also avoids the problem with .deb versions where updating the package can make the browser yell at you that it needs to be restarted when you try to open a tab.
I no longer remember all the exact steps I did, but I only googled them in the first place, so presumably they are there to be googled still. But it's possible to fully remove snapd and all snap support and then taboo it so that it never comes back. Or at least, it's been a few years and it hasn't come back. FF has remained a real .deb from the mozillateam PPA. It was a few different steps though: not just uninstalling a few packages but also editing some apt config files, I think. Sorry, that sounds useless, but like I say, I just googled it up at the time, did 15-20 minutes of reading and poking, and never had to touch it again since then. It's been several version bumps.
..edit..
I installed a dummy package that displaces the nagware about the pro version too so I never get those messages during apt update any more.
Taking a quick definitely incomplete look I see at least:
and removed ubuntu-pro-esm-apps and ubuntu-pro-esm-infra from that same dir
but also there is a mozillateam ppa in sources.list.d, and I don't see any installed package name that looks like it might be that dummy ubuntu-pro-esm thing, so maybe it got removed during a version upgrade and I never noticed because ubuntu stopped that nonsense and it isn't needed any more? Or there is some other config somewhere I'm forgetting that is keeping that hole plugged.
Anyway, it WAS a little bit of fiddling around one day, but at least it was only a one and done thing so far.
I kind of expected to be off of Ubuntu by now because once someone starts doing anything like that, it doesn't matter if you can work around it; the real problem is that they want to do things like that at all in the first place. Well, they still want what they want and that problem is never going away. They will just keep trying some other thing and then some other thing. So rather than fight them forever, it's better to find someone else who you don't want to fight. I mean, that's why we're on Linux at all in the first place, right? But so far it's been a few version bumps since then and still more or less fine.
I recommend downloading the executable-in-a-tarball form of Firefox and running that. I personally do that with Nightly, and I find it works quite well.
I see not being able to install invasive kernel level anti-cheat as a positive. I uninstalled all Riot games before they rolled it out. I would’ve been pretty miffed if I had accidentally gotten their kernel modules simply because I wasn’t reading tech news before the auto update.
I used multi-seat in Linux with systemd: I just threw some old graphics cards and sound cards into my gaming PC so that the children could play on separate monitors while I worked. Multi-seat is very cool. When upgrading to a new gaming PC, though, it was much cheaper to build 4 separate machines, because CPUs and motherboards with enough PCIe lanes are very expensive. (A rough loginctl sketch of the seat setup is below.)
GPUs still run at decent performance with half the PCIe lanes available, so if you already have a gaming PC with many slots and don't need top performance, it could still be worth it to get two more cheap GPUs and use multi-seat -- for those building a mini LAN gaming room at home.
One annoying thing is that Linux can't run many different GPU drivers at the same time, so you have to make sure the cards work with the same driver.
Proprietary 3rd-party multi-seat also exists for Windows, but Linux has built-in support and it's free.
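For anyone wanting to try this, a rough sketch of the seat assignment using systemd-logind's loginctl, wrapped in Python. The sysfs paths are placeholders; the real ones come from running "loginctl seat-status seat0" on your own machine:

    # Sketch: assign a spare GPU, a USB controller (keyboard/mouse), and a sound
    # card to a second logind seat so another person gets their own session.
    import subprocess

    SEAT = "seat1"
    DEVICES = [
        "/sys/devices/pci0000:00/0000:00:03.0/0000:02:00.0/drm/card1",   # placeholder: second GPU
        "/sys/devices/pci0000:00/0000:00:14.0/usb3",                      # placeholder: USB hub for that seat
        "/sys/devices/pci0000:00/0000:00:03.0/0000:02:00.1/sound/card2",  # placeholder: its audio device
    ]

    for dev in DEVICES:
        # `loginctl attach` writes persistent udev rules tagging the device with
        # ID_SEAT, so the assignment survives reboots.
        subprocess.run(["loginctl", "attach", SEAT, dev], check=True)

    subprocess.run(["loginctl", "seat-status", SEAT], check=True)  # show the result
    # To undo everything: loginctl flush-devices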
I am super curious about your setup. I played with multi-seat years ago, but I lost the need. It is a super cool tech that I'd love to see embraced in some way for its efficiencies.
League of Legends is basically the only thing holding me back from switching to Linux for myself :/ really want to just swap over to Linux fully. Love your website + house!
Does this mean the GitHub repo linked with the scripts now includes up-to-date Linux versions? Last time I looked it was all Windows-specific, but I'd love to set up something similar with (much lower power) stations.
Sorry, I haven't gotten around to updating it yet, although it basically works to follow the same instructions except replace Windows with Linux and skip all the workarounds for Windows-specific bugs.
I'm sorry if you hear this a lot, but your house is so cool, and I must admit I am more than a little jealous.
I've also said it here before but I will just give up on PC gaming wholesale before I go back to Windows. It's crazy how much gaming on Linux has improved in just the past couple years.
I find that Fedora hits the right balance of stability while being up to date for anything desktop and specifically gaming focused, Debian has different priorities and packages can be a bit too old. And it’s less of a faff than Arch.
Archlinux can be a pretty good choice for gaming. Not necessarily because of anything Archlinux does: most distros can do anything, if you configure them.
No, just because the Steam Deck's distro is built on Arch, and so you can piggyback on what they are doing.
Eh, aside from GPU drivers -- which I download directly from nvidia anyway -- I don't feel like gaming is much affected by the distro packages being a couple years old. We pretty much just run Steam, Discord, and Chrome on these things, and those all have their own update schedule independent of the distro.
I download the nvidia drivers directly from nvidia. Their installer script is actually pretty decent and then I don't have to worry about whether the distro packages are up-to-date.
Battlefield 4's anticheat runs fine on Linux, if you end up needing one. It definitely scratches my BF itch, in the same way Deadlock is filling the LoL-shaped hole in my contemptible subsistence.
Your house sounds like a great place to hold a fighting game local tournament (or something like the old Smash Summit series for Smash Bros Melee and Ultimate before Beyond The Summit shut down)
I think it's interesting that mainstream PC gaming press is now talking about Linux. We have the benchmark YouTube channels doing some benchmarks of it as well and plenty of reports of "it just works", which is pretty promising, at least for the games that aren't intentionally excluded by DRM. For me it's still controller and equipment incompatibility due to my VR headset and sim wheel/pedals setup; I use Linux everywhere else, in my router and home servers. I just hope that Nvidia notices that there does appear to be a swing happening and improves their driver situation.
I'd say there are two remaining roadblocks. First and biggest is kernel level anti-cheat frameworks as you point out. But there's also no open source HDMI 2.1 implementation allowed by the HDMI cartel so people like me with an AMD card max out at 4K60 even for open source games like Visual Pinball (unless you count an adapter with hacked firmware between the card and the display). NVidia and Intel get away with it because they implement the functionality in their closed source blobs.
It's a blocker if you want to use a TV; there are almost zero TVs with DP. This HDMI licensing crap is also the reason a Steam Deck can't output more than 4K@60 over HDMI unless you install Windows on it.
Yup, this works, but there's as yet no UHBR13.5-or-better input, so you're not getting a full HDMI 2.1 equivalent. But if you don't mind DSC instead of uncompressed 24 bits per pixel, you can have an otherwise flawless 4K@120Hz experience.
Up until a year or two ago, the majority of monitors (and graphics cards) used DisplayPort 1.4 and HDMI 2.1, with HDMI 2.1 (~42 Gbps effective) having more bandwidth than DisplayPort 1.4 (~26 Gbps effective).
This is my case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240hz 10-bit HDR signal @ 30 Gbps.
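Rough back-of-the-envelope numbers (active pixels only, ignoring blanking and DSC) show why DP 1.4 doesn't cut it for that signal:

    $2560 \times 1440 \times 240\,\mathrm{Hz} \times 30\,\mathrm{bpp} \approx 26.5\ \mathrm{Gbit/s}$

which is already above DisplayPort 1.4's roughly 25.9 Gbit/s of usable payload (32.4 Gbit/s raw minus 8b/10b encoding overhead), while HDMI 2.1's ~42 Gbit/s leaves comfortable headroom.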
I had said I wouldn’t upgrade from my RTX 3080 until I could run “true 4K”.
I finally got the 240hz 4K uncompressed but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult, Microcenter didn’t have them in April and the only one that worked was the one that came with the monitor.
TVs don't support DisplayPort, so it makes Linux PCs like the Steam Machine inferior console replacements if you want high refresh rates. A lot of TVs now support 4K/120Hz with VRR; the PS5 and Xbox Series X also support those modes.
(Some games support 120, but it's also used to present a 40hz image in a 120hz container to improve input latency for games that can't hit 60 at high graphics quality.)
Correction: you can get 4K@120Hz with HDMI 2.0, but you won't get full 4:4:4 chroma; instead 4:2:0 will be forced.
In my case I have an HTPC running Linux and a Radeon 6600 connected via HDMI to a 4K @ 120Hz capable TV, and honestly, at that sitting distance/TV size and using 2x DPI scaling you just can't tell any chroma sub-sampling is happening. It is of course a ginormous problem in a desktop setting, and even worse if you try using 1x DPI scaling.
What you will lose, however, are the newer forms of VRR, and it may be unstable with lots of dropouts.
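The rough arithmetic (active pixels only, blanking ignored) shows why 4:2:0 gets forced on HDMI 2.0: full 8-bit 4:4:4 needs

    $3840 \times 2160 \times 120\,\mathrm{Hz} \times 24\,\mathrm{bpp} \approx 23.9\ \mathrm{Gbit/s}$

which is well beyond HDMI 2.0's ~14.4 Gbit/s payload (18 Gbit/s TMDS with 8b/10b encoding), whereas 4:2:0 averages 12 bits per pixel and halves that to ~11.9 Gbit/s, which fits.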
Competent cheat makers don't have much difficulty in defeating in-kernel anticheats on Windows. With the amount of insight and control available on Linux anticheat makers stand little chance.
The best Valve could do is offer a special locked down kernel with perhaps some anticheat capabilities and lock down the hardware with attestation. If they offer the sources and do verified builds it might even be accepted by some.
Doubt it would be popular or even successful on non-Valve machines. But I'm not an online gamer and couldn't care less about anticheats.
Anticheat is one of those things where I probably sound really old, but man, it's just a game. If you hate cheating, don't play on pub servers with randoms; find a group of people you can play with, like how real life works.
For competitive gaming, I think attested hardware & software actually is the right way to go. Don’t force kernel-level malware on everyone.
Sorry but you're just old IMO :) PUBG or Arc Raiders have over 100 players in a game. Even Valorant or League have 10 players in a match. It's definitely not easy to find 9 friends to play the same game at the same time as you. And playing any of these games with a cheater can completely wreck the match. If the cheaters go unchecked, over time they start to dominate games, where like 30% might be cheaters who can see through walls and insta-headshot you, and the entire multiplayer mode of the game is ruined. Even worse, some cheaters are sneaky: they might have a wallhack or a map showing all players but use it cautiously, and it can be quite hard to prove they're cheating, but they build up a huge advantage nonetheless. Most of us are happy to have effective anti-cheat, and it's not forced upon us. I understand the tradeoff to having mostly cheater-free games is having to trust the game maker more and am fine with that. Riot for example is quite transparent about what their anti-cheat does and how it works, and I don't consider it "malware" any more than I consider a driver for my graphics card to be "malware", even if they do operate in kernel mode.
This was never an issue 20 years ago when we had 64 player servers, but the 64 player servers also generally had a few people online with referee access to kick/ban people at any given time. That seemed like it worked well to me.
It was still an issue enough that some developers made BattlEye for anti-cheat 20 years ago for Battlefield games. It's still one of the more popular anticheats today.
Other games did similarly. Quake 3 Arena added Punkbuster in a patch. Competitive 3rd party Starcraft 1 server ICCUP had an "anti-hack client" as a requirement.
That's really the paradigm shift - communities were self-organizing and self-moderating before. Now game publishers want to control all aspects of the online experience so they can sell you content and skins, so that means matchmaking and it means they have to shoulder the moderation burden.
> Most of us are happy to have effective anti-cheat
I could almost get on board with the idea of invasive kernel anti-cheat software if it actually was effective, but these games still have cheaters. So you get the worst of both worlds--you have to accept the security and portability problems as a condition for playing the game AND there are still cheaters!
Worst of both worlds? In theory this is accurate; in practice, it isn't. The crux of why people are fine with it, as far as I can identify, is "but these games still have cheaters" - people aren't looking for 0 cheaters so much as < X% cheaters, keeping the odds low that any given match they are in has a cheater.
> I don't consider it "malware" any more than I consider a driver for my graphics card to be "malware", even if they do operate in kernel mode.
The bloggers/journalists calling it malware are doing the conversation a disservice. The real problem is the risk of bugs or other issues with kernel-level anti-cheat, which _could_ be exploited in the worst case, and in the best case merely cause outages.
The classic example recently is the CrowdStrike-triggered outage of computers worldwide due to kernel-level antivirus/malware scanning. Anti-cheat could potentially have the exact same outcome (but perhaps smaller in scale, as only gamers would have it).
If Windows provided a better framework, it's feasible that such errors could be recovered from and fixed without outages.
I play a lot of Dota 2 and never really notice anything that is obviously cheat-related. IMO League would probably be fine with Valve-level anti-cheat; it's even a less twitchy game than Dota.
FPSs can just say "the console is the competitive ranked machine", add mouse + keyboard support, and call it a day. But in those games cheaters can really ruin things with aimbots, so maybe it is necessary for the ecosystem, I dunno.
Nobody plays RTSs competitively anymore, and low-twitch MMOs need better data hiding for what they send clients, so 'cheating' is not relevant.
We are at the point where camera + modded input devices are cheap and easy enough that I dunno if anti-cheat matters anymore.
You clearly don’t play competitive shooters and thus aren’t qualified to opine on the matter.
Competition vs other human beings is the entire point of that genre, and the intensity when you’re in the top .1% of the playerbase in Overwatch/Valorant/CSGO is really unmatched.
I think the problem comes when someone makes a cool, fun, silly little game that is otherwise great when played with randoms, and cheating just sorta spoils it.
Case in point from a few years back - Fall Guys. Silly fun, sloppy controls, a laugh. And then you get people literally flying around because they've installed a hack, so other players can't progress as they can't make the top X players in a round.
So to throw it back - it is just a game, it's so sad that a minority think winning is more important than just enjoying things, or think their own enjoyment is more important than everyone else's.
As an old-timer myself, we thought it was despicable when people replaced downloaded skins in QuakeWorld with all-fullbright versions in their local client, so they could get an advantage spotting other players... I suppose that does show us that multiplayer cheating is almost as old as internet gaming.
Not a gamer, but it seems like super competitive games should be played on locked down consoles not custom-built PCs where the players have full control?
Also, for more casual play, don't players have rankings so that you play with others about your level? Cheaters would all end up just playing with other cheaters in that case, wouldn't they?
This seems both semi-probable but also like maybe a bit of a critical moral hazard for Valve. Right now folks love Valve. They do good things for Linux.
Making a Valve-only Linux solution would take a lot of the joy of this moment away for many. But it would also help Valve significantly. It's very uncomfortable to consider, imo.
You don't have to play these specific games though. I mean, what's your privacy, what's not being bombarded by ads in your OS, worth to you? Have you given this an honest thought?
If you want to play games with friends, you have to play whatever the group plays. This is especially problematic as the group tries out new games, increasing the chance you can’t join because you’re not on Windows.
Personally I'd be interested to see what would happen if Sony/MS did what they could to make the keyboard/mouse experience as good as possible on their consoles (I'm writing from a position of ignorance on the state of mouse/keys with current consoles) and encouraged developers to offer a choice of inputs, so that the locked-down machines can become the place for highest confidence in no/low cheaters. If other people want to pay through the nose to go beyond what consoles offer on the detail/resolution/framerate trifecta then I'm sure they could do so, but I really don't see how you lock down an open platform. That challenge has been going on for decades.
> I'm writing from a position of ignorance on the state of mouse/keys with current consoles
I'm far from an authority on this topic but from my understanding both Sony/MS have introduced mkb support, but so far it looks to be an opt-in kind of thing and it's still relatively new.
This really depends on the friends you have. I've never encountered this limitation because no one in my friend group plays competitive ranked games. Basically anything with private sessions doesn't require anticheat, so Valheim, RV There Yet, Deep Rock Galactic, etc. all work fine.
Yes, but Linux really has gotten a lot better in recent years. At least whatever runs on Steam. I almost never had any problems with newer indie games.
My friends are understanding that I don't play games with rootkit anti cheat (whether on Linux or Windows). There are enough games that we can play other games together still, and when they want to play the games with such anti-cheat (e.g. Helldivers 2) they simply play without me. No big deal.
Yes, but sometimes it is nice to socialize with other people and they might play these types of games. I don’t enjoy Call of Duty, but I’ll play it from time to time so I can chat with my brother (this is the only way to get him on the phone/microphone for some reason). I value the time I am spending with him more than a bit of privacy (in that context).
I am very pro-Linux and pro-privacy, and hope that the situation improves so I don’t have to continue to compromise.
Besides ads and privacy concerns, it's been such a delight not having to deal with unwanted updates, hunting phantom processes that take up CPU time, or the file explorer that takes forever to show ten files in the Downloads folder. I cannot be paid to use Windows at this point.
Another unresolved roadblock is Nvidia cards seriously underperforming in DX12 games under Proton compared to Windows. Implementing DX12 semantics on top of Vulkan runs into some nasty performance cliffs on their hardware, so Khronos is working on amending the Vulkan spec to smooth that over.
What percentage of games require DX12? From what I recall, a surprisingly large percentage of games support DX11, including Arc Raiders, BF6 and Helldivers 2, just to name a few popular titles.
At the same time, Vulkan support is also getting pretty widespread, I think notably idTech games prefer Vulkan as the API.
DX12 is overwhelmingly the default for AAA games at this point. The three titles you listed all officially require DX12, what DX11 support they have is vestigial, undocumented and unsupported. Many other AAAs have already stripped their legacy DX11 support out entirely.
Id Software do prefer Vulkan but they are an outlier.
DX12 is less and less the default, most gamedev that I’ve seen is surrounding Vulkan now.
DX12 worked decently better than OpenGL before, and all the gamedevs had Windows, and it was required for Xbox… but now those things are less and less true.
The playstation was always “odd-man-out” when it came to graphics processing, and we used a lot of shims, but then Stadia came along and was a proper linux, so we rewrote a huge amount of our render to be better behaved for Vulkan.
All subsequent games on that engine have thus had a Vulkan-friendly renderer by default, which is implemented more cleanly than the DX12 one and works natively pretty much everywhere. So it's the new default.
I always wondered: isn't this exactly what eBPF would allow you to do?
Assuming that cheats work by reading (and modifying) the memory of the game process, you can attach a kprobe to the sys_ptrace system call. Every time any process uses it, your eBPF program triggers. You can then capture the PID and UID of the requester and compare it against a whitelist (e.g. only the game engine can mess with the memory of that process). If the requester is unauthorized, the eBPF program can even override the return value to deny access before the kernel finishes the request.
Of course there are other attack vectors (like spoofing PID/process name), but eBPF covers them also.
All of this to say that Linux already has sane primitives to allow that, but that, as long as devs don't prioritize Linux, we won't see this happening.
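A rough, detection-only sketch of that idea using bcc, with a syscall tracepoint rather than a raw kprobe since the tracepoint exposes the arguments directly. The PID is a made-up placeholder, and the blocking half is left out because overriding the return value needs error-injection support (or a BPF LSM hook such as ptrace_access_check):

    # Sketch: log every ptrace() attempt aimed at a protected game process.
    from bcc import BPF

    GAME_PID = 4242  # hypothetical PID of the game process to protect

    bpf_text = r"""
    #define GAME_PID %d

    // Fires on every entry to the ptrace() syscall.
    TRACEPOINT_PROBE(syscalls, sys_enter_ptrace) {
        u32 caller = bpf_get_current_pid_tgid() >> 32;
        long target = args->pid;
        if (target == GAME_PID && caller != GAME_PID) {
            bpf_trace_printk("ptrace attempt on game process by pid %%d\n", caller);
        }
        return 0;
    }
    """ % GAME_PID

    b = BPF(text=bpf_text)
    print("watching ptrace() calls against pid %d, Ctrl-C to stop" % GAME_PID)
    b.trace_print()  # stream the kernel-side log lines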
but how does the anti-cheat know that the kernel is not modified such that it disables certain eBPF programs (or misreports cheats/spoofs data etc)?
This is the problem with anti-cheat in general (and the same exists with DRM) - the machine is (supposedly) under the user's total control and therefore, unless your anti-cheat is running at the lowest level, outside of the reach of the user's tampering, it is not trustworthy. This leads to TPM requirements and other anti-user measures that are dressed up as pro-user in Windows.
There's no such thing in Linux, which makes it unworkable as one of these anti-cheat platforms IMHO.
Great point. As I mentioned there are other attack vectors and you can mitigate them. For mitigating what you are mentioning for instance you don't just run one eBPF program, but you run a cluster of them that watch each other:
(The following was refined by an LLM because I didn't remember the details of when I was pondering this a while back)
All your anti cheats are eBPF programs hooked to the bpf() syscall itself.
Whenever any process tries to call BPF_PROG_DETACH or BPF_LINK_DETACH, your monitors check if the target is one of the anti cheats in your cluster of anti-cheats.
If an unauthorized process (even Root) tries to detach any of your anti-cheat processes, the eBPF program uses bpf_override_return to send an EPERM (Permission Denied) error back to the cheat.
(End LLM part)
Of course, you can always circumvent this by modifying and compiling the kernel so that those syscalls when targeting a specific PID/process name/UID aren't triggered. But this raises the difficulty of cheating a lot as you can't simply download a script, but you need to install and boot a custom kernel.
So this would solve the random user cheating in an online match. Pro users that have enough motivation can and will cheat anyway, but that is true on Windows too. Finally, at top gaming events there is so much scrutiny -- you need to play on stage on vetted PCs -- that this is a non-issue.
You can't, but circumventing anti-cheats already happens on Windows with all their fancy kernel-level anti-cheats.
I believe the goal is to make it so uncomfortable and painful that 99.999% of the users will say fuck it and they won't do it. In this case users need to boot a custom kernel that they download from the internet which might contain key-loggers and other nasty things. It is not just download a script and execute it.
For cheat developers, instead, this implies doing the modifications to allow those sys-calls to fly under the radar while keeping the system bootable and usable. This might not be trivial.
Well yeah but then eBPF would not work and then the anti cheat could just show that it's not working and lock you out.
This isn't complicated.
Even the CrowdStrike Falcon agent has switched to eBPF because it lowers the risk that a kernel driver will brick downstream machines, like what happened with Windows that one time. I recently configured a corporate single sign-on to simply not work if the BPF component was disabled.
Well but then attackers just compile a kernel with a rootkit that hides the hack and itself from the APIs of the BPF program, so it has to deal with that too or it's trivially bypassed.
Anticheat and antivirus are two similar but different games. It's very complicated.
The BPF API isn't the only telemetry source for an anti-cheat module. There's a lot of other things you can look at. A BPF API showing blanks for known PID descendant trees would be a big red flag. You're right that it's very complicated, but the toolchain is there if someone wanted to do the hard work of making an attempt. It's really telemetry forensics -- and what can you do if the cheat is external to the system?
I'd be less antianticheat if I could just select the handcuffs at boot time for the rare occasion where I need them.
Although even then I'd still have qualms about paying for the creation of something that might pave the path for hardware vendors to work with authoritarian governments to restrict users to approved kernel builds. The potential harms are just not in the same league as whatever problems it might solve for gamers.
Once a slave, always a slave. Running an explicitly anti-user proprietary kernel module that does god-knows-what is not something I'd ever be willing to do, games be damned. It might just inject exploits into all of your binaries and you'd be none the wiser. Since it wouldn't work in VMs you'd have to use a dedicated physical machine for it. Seems too high a price to pay for just a few games.
Being able to snapshot and restore memory is a pretty common feature across all decent hypervisors. That in and of itself enables most client-side cheats. I doubt they'd bother to provide such a hypervisor for the vanishingly small intersection of people who:
- Want to play these adversarial games
- Don't care about compromising control of hypervisor
>Being able to snapshot and restore memory is a pretty common feature across all decent hypervisors
A hypervisor that protects against this already exists for Linux with Android's pKVM. Android properly enforces isolation between all guests.
Desktop Linux distros are way behind in terms of security compared to Android. If desktop Linux users ever want L1 DRM to work to get access to high resolution movies and such they are going to need such a hypervisor. This is not a niche use case.
It "protects" against this given the user already does not control the hypervisor, at which point all bets are off with regard to your rights anyway. It's actually worse than Windows in this regard.
I would never use a computer I don't have full control over as my main desktop, especially not to satisfy an external party's desire for control. It seems a lot more convenient to just use a separate machine.
Even mainstream consumers are getting tired of DRM crap ruining their games and movies. I doubt a significant number of Linux users would actually want to compromise their ownership of the computer just to watch movies or play games.
I do agree that Linux userland security is lackluster though. Flatpak seems to be a neat advancement, at least in regard to stopping things from basically uploading your whole filesystem. There are already a lot of kernel interfaces that can do this, like user namespaces. I wish someone would come up with something like QubesOS, but making use of containers instead of VMs and Wayland proxies for better performance.
You already don't control the firmware on the CPU. Would you be okay with this if the hypervisor was moved into the firmware of the CPU and other components instead?
I honestly think you would be content as long as the computer offered the ability to host an arbitrary operating system just like has always been possible. Just because there may be an optional guest running that you can't fully control that doesn't take away from the ability to have an arbitrary guest you can fully customize.
>to satisfy an external party's desire for control.
The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
>It seems a lot more convenient to just use a separate machine.
It really isn't. It's much more convenient to launch a game on the computer you are already using than going to a separate one.
Ah, I see, you're talking about Intel ME/AMD PSP? That's unfortunate and I'm obviously not happy with it, but so far there seems to be no evidence of it being abused against normal users.
It's a little funny that the two interests of adtech are colliding a bit here: They want maximum control and data collection, but implementing control in a palatable way (like you describe) would limit their data collection abilities.
My answer to your question: No, I don't like it at all, even if I fully trust the hypervisor. It will reduce the barrier for implementing all kinds of anti-user technologies. If that were possible, it will quickly be required to interact with everything, and your arbitrary guest will soon be pretty useless, just like the "integrity" bullshit on Android. Yeah you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc. That's still a net minus compared to the status quo.
In general, I dislike any methods that try to apply an arbitrary set of criteria to entitle you to a "free" service to prevent "abuse", be it captchas, play integrity, or Altman's worldcoin. That "abuse" is just rational behavior from misaligned incentives, because non-market mechanisms like this are fundamentally flawed and there is always a large incentive to exploit it. They want to have their cake and eat it too, by eating your cake. I don't want to let them have their way.
> The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
Pretty sure we already have enough technology to fully automate many games with robotics. If there is a will, there is a way. As with everything else on the internet, everyone you don't know will be considered untrusted by default. Not the happiest outcome, but I prefer it to losing general purpose computing.
I'm talking about the entire chip. You are unable to implement a new instruction for the CPU for example. Only Intel or AMD can do so. You already don't have full control over the CPU. You only have as much control as the documentation for the computer gives you. The idea of full control is not a real thing and it is not necessary for a computer to be useful or accomplish what you want.
>and your arbitrary guest will soon be pretty useless
If software doesn't want to support insecure guests, the option is between being unable to use it, or being able to use it in a secure guest. Your entire computer will become useless without the secure guest.
>Yeah you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc.
This could be handled by also running another guest that was supported by those app developers that provide the required security requirements compared to your arbitrary one.
>That "abuse" is just rational behavior from misaligned incentives
Often these can't be fixed or would result in a poor user experience for everyone due to a few bad actors. If your answer is to just not build the app in the first place, that is not a satisfying answer. It's a net positive to be able to do things like watch movies for free on YouTube. It's beneficial for all parties. I don't think it is in anyone's best interest to not do such a thing because there isn't a proper market incentive in place stop people from ripping the movie.
>If there is a will, there is a way.
The goal of anticheat is to minimize customer frustration caused due to cheaters. It can still be successful even if it technically does not stop every possible cheat.
>general purpose computing
General purpose computing will always be possible. It just will no longer be the wild west anymore where there was no security and every program could mess with every other program. Within a program's own context it is able still do whatever it wants, you can implement a Turing machine (bar the infinite memory).
They certainly aren't perfect, but they don't seem to be hell-bent on spying on me or shoving crap into my face every waking hour, for the time being.
> insecure guests
"Insecure" for the program against the user. It's such a dystopian idea that I don't know what to respond with.
> required security requirements
I don't believe any external party has the right to require me to use my own property in a certain way. This ends freedom as we know it. The most immediate consequences is we'd be subject to more ads with no way to opt out, but that would just be the beginning.
> stop people from ripping the movie
This is physically impossible anyway. There's always the analog hole, recording screens, etc, and I'm sure AI denoising will close the gap in quality.
> it technically does not stop every possible cheat
The bar gets lower by the day with locally deployable AI. We'd lose all this freedom for nothing at the end of the day. If you don't want cheating, the game needs to be played in a supervised context, just like how students take exams or sports competitions have referees.
And these are my concerns with your ideal "hypervisor" provided by a benevolent party. In this world we live in, the hypervisor is provided by the same people who don't want you to have any control whatsoever, and would probably inject ads/backdoors/telemetry into your "free" guest anyway. After all, they've gotten away with worse.
Yep, there's plenty of prior art on how to implement the necessary attestations. Valve could totally ship their boxes with support for anticheat kernel attestation.
Is it possible to do this in a relatively hardware-agnostic, but reliable manner? Probably not.
What do you mean? Ship a computer with preinstalled Linux that you can't tamper with? Sounds like Android. For ordinary computers, secure boot is fully configurable, so it won't work: I can disable it, I can install my own keys, etc. And for any userspace way to check it, I'll fool you if I own the kernel.
No, just have the anti-cheat trust kernels signed by the major Linux vendors and use secure boot with remote attestation. Remote attestation can't be fooled from kernel space, that's the entire point of the technology.
That way you could use an official kernel from Fedora, Ubuntu, Debian, Arch etc. A custom one wouldn't be supported but that's significantly better than blocking things universally.
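Purely as a conceptual sketch of what the server-side check could look like; the real thing involves parsing the TPM2 quote structure (TPMS_ATTEST), an AIK certificate chain, and a server-chosen nonce, and every name and digest below is hypothetical:

    # Sketch: accept a boot-chain quote only if it is signed by the attestation key
    # and its PCR digest matches a known-good (vendor-signed-kernel) measurement.
    from dataclasses import dataclass
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Measurements of boot chains using official signed distro kernels (placeholder digests).
    KNOWN_GOOD_PCR_DIGESTS = {
        bytes.fromhex("aa" * 32),  # e.g. a Fedora signed kernel
        bytes.fromhex("bb" * 32),  # e.g. an Ubuntu signed kernel
    }

    @dataclass
    class Quote:
        pcr_digest: bytes  # digest over the selected PCRs, as reported by the TPM
        nonce: bytes       # server-chosen nonce to prevent replay
        signature: bytes   # signature over (pcr_digest || nonce) by the attestation key

    def quote_is_acceptable(quote: Quote, aik: rsa.RSAPublicKey, expected_nonce: bytes) -> bool:
        if quote.nonce != expected_nonce:
            return False  # stale or replayed quote
        try:
            aik.verify(quote.signature, quote.pcr_digest + quote.nonce,
                       padding.PKCS1v15(), hashes.SHA256())
        except InvalidSignature:
            return False  # not actually signed by the TPM-held key
        # Only boot chains that measure to a trusted distro kernel pass.
        return quote.pcr_digest in KNOWN_GOOD_PCR_DIGESTS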
You can't implement remote attestation without a full chain of exploits (from the perspective of the user). Remote attestation works on Android because there is dedicated hardware that directly establishes communication with Google's servers and runs independently (as a backchannel). There is no such hardware in PCs. Software-based attestation is easily fooled on previous Android/Linux.
The call asks the TPM to display the signed boot chain; you can't fake that because it wouldn't be cryptographically valid. The TPM is that independent hardware.
How would that be implemented? I'd be curious to know.
I'm not aware that a TPM is capable of hiding a key without the OS being able to access/unseal it at some point. It can display a signed boot chain but what would it be signed with?
If it's not signed with a key out of the reach of the system, you can always implement a fake driver pretty easily to spoof it.
Ah, got it. With enough motivation this is still pretty easily defeated though. The key is in some kind of NVRAM, which can be read with specialized equipment, and once it's out, you can use it to spoof signatures on a different machine and cheat as usual. The TPM implementations of a lot of consumer hardware is also rather questionable.
These attestation methods would probably work well enough if you pin a specific key like for a hardened anti-evil-maid setup in a colo, but I doubt it'd work if it trusts a large number of vendor keys by default.
Once it's out you could but EKs are unique and tied to hardware. Using an EK to sign a boot state on hardware that doesn't match is a flag to an anti-cheat tool, and would only ever work for one person.
It also means that if you do get banned for any reason (obvious cheating) they then ban the EK and you need to go source more hardware.
It's not perfect but it raises the bar significantly for cheaters to the point that they don't bother.
> Using an EK to sign a boot state on hardware that doesn't match is a flag to an anti-cheat tool
The idea is you implement a fake driver to sign whatever message you want and totally faking your hardware list too. As long as they are relatively similar models I doubt there's a good way to tell.
Yeah, I think there are much easier ways to cheat at this point, like robotics/special hardware, so it probably does raise the bar.
Basically the TPM includes a key that's also signed with a manufacturer key. You can't just extract it, and the signature ensures that this key is "trusted". When asked, the TPM will return the boot chain (including the bootloader or UKI hash), signed by its own key, which you can present to the remote party. The whole protocol is more complicated and includes a challenge.
The TPM isn't designed for this use case. You can use it for disk encryption or for identity attestation, but step 1 for ID attestation is asking the TPM to generate a key and then trusting that fingerprint from then on after doing a test sign with a binary blob. The running kernel is just a binary that can be hashed and whitelisted by a userspace application. You don't need a TPM for that.
I wonder if you could use checkpoint and restore in userspace (https://criu.org/Main_Page) so that after the game boots and passes the checks on a valid system you can move it to an "invalid" system (where you have all the mods and all the tools to tamper with it).
I don't really care about games, but i do care about messing up people and companies that do such heinous crimes against humanity (kernel-level anti-cheat).
The war is lost. The most popular game that refuses to use kernel-level anti-cheat is Valve's Counter-Strike 2, so the community implemented it themselves (FaceIT) and requires it for the competitive scene.
Uh, you'd have to compile a kernel that doesn't allow it while claiming it does... and behaves as if it does -- otherwise you'd just fail the check, no?
I feel like this is way overstated; it's not that easy to do, and could conceptually be done on Windows too via hardware simulation/virtual machines. Both would require significant investment in development to pull off.
Right, the very thing that works against AC on Linux also works for it. There are multiple layers (don't forget Wine/Proton) to inject a cheat, but those same layers could also be exploited to detect cheats (especially adding fingerprints over time and issuing massive ban-waves).
And then you have BasicallyHomeless on YouTube who is stimulating nerves and using actuators to "cheat." With the likes of the RP2040, even something like an aim-correcting mouse becomes completely cheap and trivial. There is a sweet-spot for AC and I feel like kernel-level might be a bit too far.
All it takes is cd'ing into /usr/src/linux and running make menuconfig, turning off a few build flags, hitting save, and then running make to recompile. But that's like saying "well if I remove FAT32 support I can't use FAT32". Yeah, it will lock you out, showing you have it disabled. No big deal.
That would require that they actually make the effort to develop Linux support. The current "it just works" reality is that the games developers don't need to support running on Linux.
I am wondering whether a game can be shipped with its own "kernel" and "hypervisor", basically an entire VM. Yes, performance will take a hit, but in my experience with my own VMs it's like 15-20%.
Clearly, when there are enough Linux gamers, another solution to the kernel-level anti-cheat issue will be found. After all, the most played competitive shooter is CS and Valve does not use kernel-level AC.
> After all, the most played competitive shooter is CS and Valve does not use kernel-level AC.
Valve doesn't employ kernel AC but in practice others have taken that into their own hands - the prevalence of cheating on the official CS servers has driven the adoption of third-party matchmaking providers like FACEIT, which layer their own kernel AC on top of the game. The bulk of casual play happens on the former, but serious competitive play mostly happens on the latter.
The best description I've been able to give of the dichotomy of CS is this: there is no way for a person to become good enough to get their signature into the game, without using kernel-level ACs.
The competitive CS leagues do use AC though. The big issue for these games is the free-to-play model does not work without anti-cheat. Having a ~$20 fee to cheat for a while before getting banned significantly reduces the number of cheaters, and that's what CS does with their prime server model.
And for what it's worth, I'm pretty sure Valorant is the most played competitive shooter at the moment.
> I just hope that Nvidia notices that there does appear to be a swing happening and improves their driver situation.
I firmly believe that Nvidia doesn't want the general public to ever have better hardware than what is current as people could just run their own local models and take away from the ridiculous money they're making from data centers.
In step with that, they're now renting their gaming GPUs to players with their GeForce Now package.
Gamers are a rounding error in Nvidia's market now against AI datacenter orders. I won't hold my breath about them revisiting their established drivers for Linux.
When that Steam Deck clone came out and games played better on SteamOS than on Windows on the exact same hardware, it woke a bunch of people up. Microsoft scrambled to bring the startup time and footprint down, but shots had already been fired.
You don't want a vendor you have to publicly shame to get them to do the right thing. And that's MS, if any single sentence has ever described them without using curse words.
I've got the Legion Go S with Steam OS, and that shit is great. It's stable, my games run well, the OS is pretty much entirely in the background, but I can still access it fully if I need to. Love it.
Funnily enough the most annoying things on my system at the moment is RGB and keyboard/mouse customisation.
I haven't found a tool that can access all the extra settings of my Logitech mouse, nor my Logitech speakers.
OpenRGB is amazing but I’m stuck on a version that constantly crashes; this should be fixed in the recent versions but nixpkgs doesn’t seem to have it (last I checked).
On the other hand I did manage to get SteamVR somewhat working with ALVR on the Quest 3, but performance wasn’t great or consistent at all from what I remember (RTX 3070, Wayland KDE).
I don't get the feeling they care. Microsoft is so lost under Satya at this point. Totally blinded by Azure and AI and stock price growth. At some point they're going to realize all the ground they've lost and it's going to be a real problem. They're repeating a lot of the same mistakes that cost them the browser and mobile markets.
Yeah. MS must have been so hurt about losing to the iPhone that they really jumped the gun on AI as if to avoid a similar mistake. It's Satya's major play and I think they are already paying for that decision. Xbox is hollowed out so that AI can be funded, while the PC/console hybrid project is doomed to fail because "Windows everywhere" doesn't work if Windows is crap. Indeed, they might be left with just the cloud business in the end.
"Totally blinded by Azure and AI and stock price growth."
Stock price growth is their core business because that is how large firms operate.
MS used to embrace games etc because the whole point was all PCs should run Windows. Now the plan is to get you onto a subscription to their cloud. The PC bit is largely immaterial in that model. Enterprises get the rather horrible Intune bollocks to play with but the goal is to lock everyone into subs.
Remember when the rumour was that Windows 10 would be the last Windows? I don't think people expected the reason to be that Win11 would be so unbearable it would finally drive users to Linux... but here we are. RIP.
If people were buying new PCs every year like they used to, it'd be worth it. Turns out there isn't as much value in having a "captive market" on a PC unless it's locked down.
The irony is that gaming on linux got better but the instigator was not the OSS community. All of it was funded by closed source software competing with other close source software. The OSS community by itself did not have the conviction to climb over this bulwark.
But when Steam started to develop Proton, WINE was 90% there! Valve only had to provide the remaining 90%.
The strength of Linux and Free software in general is not in that it's completely built by unpaid labor. It's built by a lot of paid, full-time labor. But the results are shared with everyone. The strength of Free software is that it fosters and enforces cooperation of all interested parties, and provides a guarantee that defection is an unprofitable move.
This is one of the reasons you see Linux everywhere, and *BSD, rarely.
> This is one of the reasons you see Linux everywhere, and *BSD, rarely.
I doubt it's a large reason. I'd put more weight on e.g. Linus being a great project lead who happens to work on Linux. And a lot of other historical contingencies.
BSD does a few things right, hence it's used by Netflix (who share back some of their work), userland of macOS (because Apple don't like GPL, I assume), PS4 and PS5 (IDK if anything seeps back upstream from there).
It isn’t about conviction. Gaming takes tremendous resources and they were not there. But if this starts shifting the tides there is a possible future where game developers start building for Linux as a primary target and to run games on Windows or Mac you would use emulation. In fact this seems like a better overall approach given that there are no hidden APIs with Linux.
Money and resources suddenly materializing once someone realizes there's profit in it is pretty much the expected way this goes. OpenTofu happened not because of some OSS force of will but because a group of companies needed it to exist for their business.
This flow is basically the bread and butter for the OSS community and the only way high effort projects get done.
This still has a "sometimes" on it. There are more than a few games that need magic Proton flags to run well -- nothing you can't go look up on ProtonDB -- but lots of games you would want to play with friends might have some nasty anti-cheat on them that just won't let you play at all.
Gaming works fine, with the exception of things like BF6 that require kernel-level anti-cheat.
The one thing I haven’t been able to get working reliably is steam remote play with the Linux machine as host. Most games work fine, others will only capture black screens.
Proton has gotten so good now that I don't even bother checking compatibility before buying games.
Granted, I don't play online games, so that might change things, but for years I used to have to make a concession that "yeah Windows is better for games...", but in the last couple years that simply has not been true. Games seem to run better on Linux than Windows, and I don't have to deal with a bunch of Microsoft advertising bullshit.
Hell, even the Microsoft Xbox One controllers work perfectly fine with xpad and the SteamOS/tenfoot interface recognizes it as an Xbox pad immediately, and this is with the official Microsoft Xbox dongle.
At this point, the only valid excuses to stay on Windows, in my opinion, are online games and Microsoft Office. I don't use Office since I've been on Unixey things so long that I've more or less just gotten used to its options, but I've been wholly unable to convince my parents to change.
I love my parents, but sometimes I want to kick their ass, because they can be a bit stuck in their ways; I am the one who is expected to fix their computer every time Windows decides to brick their computer, and they act like it's weird for me to ask them to install Linux. If I'm the one who has to perform unpaid maintenance on this I don't think it's weird for me to try and get them to use an operating system that has diagnostic tools that actually work.
As far as I can tell, the diagnostic and repair tools in Windows have never worked for any human in history, and they certainly have never worked for me. I don't see why anyone puts up with it when macOS and Linux have had tools that actually work for a very long time.
> At this point, the only valid excuses to stay on Windows, in my opinion
I didn’t see a performance increase moving to Linux for the vast majority of titles tested. Certainly not enough to outweigh the fact that I want EVERY game to work out of the box, and to never have to think if it will or won’t. And not all of my games did, and a not insignificant number needed serious tweaking to get working right.
I troubleshoot Linux issues all day long, I’ve zero interest in ever having to do it in my recreation time.
That’s a good enough reason for me to keep my windows box around.
I use Linux and OSX for everything that isn’t games, but windows functions just fine for me as a dumb console and I don’t seem to suffer any of these extreme and constant issues HN users seem to have with it from either a performance or reliability standpoint.
I'll keep repeating it: the more people vote with their wallet, the more game companies will support Linux - including the anticheat.
EAC has the support for Linux, you just have to enable it as a developer.
I know this, I worked on games that used this. EAC was used on Stadia (which was a Debian box) for The Division, because the server had to detect that EAC was actually running on the client.
I feel like I bring this up all the time here but people don’t believe me for some reason.
This does not mean it supports the full feature set as from EAC on Windows. As an analogy, it's like saying Microsoft Excel supports iPad. It's true, but without VBA support, there's not going to be many serious attempts to port more complicated spreadsheets to iPad.
But they work out of the box, which is my point. You can use a device in between which places the screen into a fixed space in front of you, for example. While it is cool, it is kind of a hassle to have this device in between. I just plug them in directly and they work.
(cue arrogance)
People on HackerNews complaining about Linux Desktop is pretty disappointing. You guys are supposed to be the real enthusiasts... you can make it work.
(cue superiority complex)
I've been using Linux Desktop for over 10 years. It's great for literally everything. Gaming admittedly is like 8/10 for compatibility, but I just use a VM with PCIe passthrough to pass in a gpu and to load up a game for windows or use CAD, etc. Seriously, ez.
Never had issues with NVIDIA GFX with any of the desktop cards. Laptops... sure they glitch out.
Originally Wine, then Proton, now Bazzite make it super easy to game natively.
The only issues I ever had with games were from the kernel-level anti-cheats bundled with them. The anti-cheats just weren't available for Linux, so the games didn't start. Anyone familiar with those knows it's not a Linux thing, it's a publisher/anti-cheat mechanism thing. Just lazy devs really.
(cue opinionated anti-corporate ideology)
I like to keep Microsoft chained up in a VM where it belongs so it can't do its shady crap. Also, with a VM you can do shared folders and clipboard. Super handy actually.
Weirdly enough, MacOS in a VM is a huge pita, and doesn't work well.
I've observed that most "enthusiasts" are really just brand ambassadors. They've been captured by some proprietary software that doesn't run on Linux, and that's the problem of Linux. The day their set of products runs perfectly on Linux is the day Linux will be ready for them.
I think that if Affinity chooses to make it work well on Linux, that would be a game changer for a lot of people. DaVinci Resolve works on Linux for video, so having a proper photo editor/illustrator tool that is not GIMP would open up the option for most people to daily drive it. That's really the missing piece.
I mean, yes. That's how people work: They don't care about the OS for itself, the OS is a means to run the software they want to run, and it'll be ready when it runs that software.
(I'm typing this on my Linux desktop right now... but also have a separate Windows PC for running the games I want to run that don't work on Linux yet. When they work, I'll be thrilled to put Linux on that machine or its successor.)
>but I just use a VM with PCIe passthrough to pass in a gpu and to load up a game for windows
Many games refuse to run in a VM, even if that VM is a Windows one. I bet there is a trick to bypass it, but then you are at risk of being banned or unable to receive support when needed.
That isn't weird. It's by design. macOS is only designed to run on Apple hardware, and a VM, even if the host is Apple hardware, isn't really Apple hardware.
I'm tired of people saying Steam on Linux just works. It doesn't.
Tried running Worms: instant crash, no error message.
Tried running Among Us: instant crash, had to add cryptic arguments to the command line to get it to run.
Tried running Parkitect: crashes after 5 minutes.
These three games are extremely simple, graphically speaking. They don't use any complicated anti-cheat measure. This shouldn't be complicated, yet it is.
Oh and I'm using Arch (BTW), the exact distro SteamOS is based on.
And of course, as always, those for which it works will tell you you're doing-it-wrong™ .
These games are all rated gold or platinum on protondb, indicating that they work perfectly for most people.
Hard to say what might be going wrong for you without more details. I would guess there's something wrong with your video driver. Maybe you have an nvidia card and the OS has installed the nouveau drivers by default? Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things. This is indeed a sore spot for Linux gaming, though to be fair graphics driver problems are not exactly unheard of on Windows either.
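A quick, hedged way to check that guess before digging further (just a diagnostic, nothing distro-specific): the "Kernel driver in use" line tells you whether the card is on nouveau or the proprietary driver.

    lspci -k | grep -EA3 'VGA|3D controller'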
Personally I have a bunch of machines dedicated to gaming in my house (https://lanparty.house) which have proven to be much more stable running Linux than they were with Windows. I think this is because the particular NIC in these machines just has terrible Windows drivers, but decent Linux drivers (and I am netbooting, so network driver stability is pretty critical to the whole system).
> Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things.
Crazy—it used to be that nvidia drivers were by far the least stable parts of an install, and nouveau was a giant leap forward. Good to know their software reputation has improved somewhat
Nouveau has never been good for gaming. Not their fault (they had to reverse engineer everything), but it was only really ever viable for mostly 2D desktops in my experience.
Everyone says this but it is not my experience at all. Every time I try AMD cards I run into weird problems. The Nvidia drivers are a pain to install and tend to break randomly on kernel updates, but once built properly they always just work for me...
I imagine the people saying “it just works” are saying it because it does, at least for them.
SteamOS is based on Arch, but customized and aimed at specific hardware configurations. It’d be interesting to know what hardware you’re using and if any of your components are not well supported.
FWIW, I’ve used Steam on Linux (mostly PopOS until this year, then Bazzite) for years and years without many problems. ISTR having to do something to make Quake III work a few years ago, but it ran fine after and I’ve recently reinstalled it and didn’t have to fuss with anything.
Granted, I don’t run a huge variety of games, but I’ve finished several or played for many hours without crashes, etc.
I use OpenSUSE Tumbleweed, and I've never had trouble running a game that's rated gold or above. I've even gotten an Easy AntiCheat game to work correctly.
I've been gaming on Linux exclusively for about 8 years now and have had very few issues running Windows games. Sometimes the Windows version, run through Proton, runs better than the native port. I don't tend to be playing AAA games right after launch day, though, so my taste in games may be shaping my experience.
I just bought another second-hand Dell workstation (I admit I used to hate those) and can't wait to install SteamOS when it is released to the public. I don't care about AAA gaming, but the integrated card should be able to handle most games from ten years ago.
> And of course, as always, those for which it works will tell you you're doing-it-wrong™ .
This sounds like you are rejecting help because you have made up your mind in frustration already.
Because you are doing it wrong. If you want an OS that just works, you should use Ubuntu or Fedora. Why is SteamOS based on Arch then? Because Valve wants to tweak things in it and tinker with it themselves to get it how they like.
You don't.
So use an OS that requires less from you and that tries to just work out of the box, not one that is notorious for being something you break and tinker with constantly (Arch).
I don't have your other games, but I do have a few Worms games and they worked out of the box for me with GE Proton on NixOS.
I'm not saying "you're doing it wrong", because obviously if you're having trouble then that is, if nothing else, bad UX design, but I actually am kind of curious as to what you're doing different than me. I have an extremely vanilla NixOS setup that boots into GameScope + Tenfoot and I drive everything with a gamepad and it works about as easily as a console does for me.
If anything this is the challenge with PC as a platform being so varied, any random software/hardware/config variation could bring a whole load of quirks.
That probably includes anything that isn't a PC in a time capsule from when the game originally released, i.e. any OS/driver changes since then, and I don't think we've reached the point where we can emulate specific hardware models to plug into a VM. One of the reasons the GeForce/Radeon drivers (e.g. the GeForce "Game Ready" branding) are so big is that they carry a whole catalogue of quirk workarounds for when a game's renderer is coded badly, or to make it a better fit to the hardware, which lets them advertise +15% performance in a new version. Part of the work for Wine/Proton/DXVK is going to be replicating that instead of doing a blunt translation strictly to the standards.
Yeah, I think Linus himself pointed out that the desktop is the hardest platform to support because it's unbelievably diverse and varied.
With regards to Linux I generally just focus on hardware from brands that have historically had good Linux support, but that's just a rule of thumb, certainly not perfect.
Arch is nice if you want to tinker. Based on your reasoning, I wouldn't recommend it.
But if you still want something Arch-based, I would recommend EndeavourOS, and for an even simpler distro, Bazzite.
You are definitely doing it wrong; I rarely have issues, and when I do I just switch compatibility tools. I play multiple indie games and Marvel Rivals, and I played lots of Among Us on my machine in 2020. Running Pop!_OS.
Yeah, same here. I sometimes google "wine WoW issues" and every time there are recent threads, so I don't even try. Linux has a long way to go to become a gaming platform.
The games don't fail to run because they are so "graphically powerful"; they fail to run because you chose to set up your system without the necessary runtime.
There are people who make stripped-down versions of windows. Is it fair to say that because these releases exist that windows isn't "just works" either?
One thing I'm curious about: how is the laptop dual-GPU situation these days?
I opted to install Linux in a VM under Hyper-V on Windows to avoid hassles with the dual GPUs in my ThinkPad P52, but this comes with several other hassles I'd like to avoid. (Like no GPU access in Linux at all...)
I switched my desktop from macOS (10+ years) to Ubuntu 25 last year and I'm not going back. The latest release includes a Gnome update which fixed some remaining annoyances with high res monitors.
I'd say it pretty much "just works" except less popular apps are a bit more work to install. On occasion you have to compile apps from source, but it's usually relatively straightforward and on the upside you get the latest version :)
For anyone who is a developer professionally I'd say the pros outweigh the cons at this point for your work machine.
I switched in 1999. I've never really had any problems in all that time.
Although it was to BSDi then, and then FreeBSD and then OpenBSD for 5 years or so. I can't remember why I switched to Debian but I've been there ever since.
> The latest release includes a Gnome update which fixed some remaining annoyances with high res monitors.
Interesting, I've had to switch off from Gnome after the new release changed the choices for HiDPI fractional scaling. Now, for my display, they only support "perfect vision" and "legally blind" scaling options.
By default Gnome doesn’t let you choose any fractional scaling in the UI because it has some remaining TODOs on that front. So from the UI you choose 100% or 200%. But the code is there and it works if you just open a terminal and type a command to enable this “experimental” feature.
Now whether or not this feature should have remained experimental is a different debate. I personally find that similar to the fact that Gmail has labeled itself beta for many years.
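For reference, the toggle in question is a single gsettings key on current GNOME/Mutter (run as your own user in a Wayland session):

    # Expose fractional scaling options in GNOME's display settings.
    gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"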
I've got the feature turned on. But Gnome 49 only supports fractional scaling ratios that divide your display into a whole, integer number of pixels. And they only calculate candidate ratios by dividing your resolution up to a denominator of 4.
So on my Framework 13, I no longer have the 150% option. I can pick 133%, double, or triple. 160% would be great, but that requires a denominator of 5, which Gnome doesn't evaluate. And you can't define your own values in monitors.xml anymore.
But what about laptops? I don't use desktop machines anymore (the last time was in 2012). Apple laptops are top notch. I use Ubuntu in a VM (headless) for software development, though.
Best you can do is build a high end desktop at home and access it remotely with any laptop you desire. The laptop performance then becomes mostly irrelevant (even the OS is less relevant) and by using modern game streaming protocols you can actually get great image quality, low latency and 60+ fps. Though, optimizing it for low bandwidth is still a chore.
Have that desktop be reachable with SSH for all your CLI and sys admin needs, use sunshine/moonlight for the remote streaming and tailscale for securing and making sunshine globally available.
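Roughly, the moving parts look like this; a sketch only, assuming Sunshine and Tailscale are already installed on the desktop (unit and host names vary by distro packaging and tailnet):

    # On the desktop: join the tailnet and start the Sunshine streaming host.
    sudo tailscale up
    systemctl --user enable --now sunshine
    # On the laptop: install Moonlight, then pair and stream against the
    # desktop's Tailscale hostname or 100.x address from the Moonlight UI.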
Bandwidth is not really a problem if you live in a decent city. The problems are latency and data usage. An hour of streaming consumes gigabytes of data, which is a big problem if you're on a cellular network.
Latency is another problem: a recent LTT video showed that even 5-10ms of added latency can negatively impact your gaming performance, even if you don't notice it. You begin to notice it at around 20ms.
I don't have an x86 laptop at the moment so sticking with Macbook for now. My assumption is Mac laptops still are far superior given M-series chips and OS that are tuned for battery efficiency. Would love to find out this is no longer the case.
I've had Linux running on a variety of laptops since the noughties. I've had no more issues than with Windows. ndiswrapper was a bit shit but did work back in the day.
I haven't, because I buy hardware that's designed to work with Linux. But if you buy hardware that doesn't have Linux drivers, it just won't work. That might mean Wifi not working, it might mean a fingerprint reader not working, etc.
I have the HP Zbook Ultra G1a. AMD 395+, 129GB RAM, 4TB 2280 SSD. Works great with Ubuntu 24.04 and the OEM kernel. Plays Steam games, runs OpenCL AI models. Only nit is it is very picky on what USB PD chargers it will actually charge on at all. UGreen has a 140W that works.
I did some investigation into this the other day. The short answer seems to be that if you like MacBooks, you aren't willing to accept a downgrade along any axis, and you really want to use Linux, your best bet today is an M2 machine. But you'll still be sacrificing a few hours of battery life, Touch ID support (likely unfixable), and a handful of hardware support edge cases. Apple made M3s and M4s harder to support, so Linux is still playing catch-up on getting those usable.
Beyond that, Lunar Lake chips are evidently really really good. The Dell XPS line in particular shows a lot of promise for becoming a strict upgrade or sidegrade to the M2 line within a few years, assuming the haptic touchpad works as well as claimed. In the meantime, I'm sure the XPS is still great if you can live with some compromises, and it even has official Linux support.
What I mean is: on a normal laptop, when you scroll with two fingers on the trackpad, the distance you scroll is nearly a continuous function of how much you move your fingers; that is, if you only move your fingers a tiny bit, you will only scroll a few pixels or just one.
Most VM software (at least all of it that I've tried) doesn't properly emulate this. Instead, after you've moved your fingers some distance, it's translated to one discrete "tick" of a mouse scroll wheel, which causes the document to scroll a few lines.
The VM software I use is UTM, which is a frontend to QEMU or Apple Virtualization framework depending on which setting you pick when setting up the VM.
> Linux is still playing catch-up on getting those usable
This is an understatement. It is completely impossible to even attempt to install Linux at all on an M3 or M4, and AFAIK there have been no public reports of any progress or anyone working on it. (Maybe there are people working on it, I don’t know).
In his talk a few days ago, one of the main Asahi developers (Sven) shared that there is someone working on M3 support. There are screenshots of an M3 machine running Linux and playing DOOM at around 31:34 here: https://media.ccc.de/v/39c3-asahi-linux-porting-linux-to-app...
Sounds like the GPU architecture changed significantly with M3. With M4 and M5, the technique for efficiently reverse-engineering drivers using a hypervisor no longer works.
Not working with Linux is a function of Apple, not Linux. There is a crew who have wasted the last half decade trying to make Asahi Linux, a distro to run on ARM MacBooks. The result, after all that time, was an almost reasonably working OS on old hardware; then Apple released the M4 and crippled the whole effort. There's been a lot of drama around the core team, who have tried to cast blame, but it's clear they are frustrated by the fact that the OEM would rather Asahi didn't exist.
I can't personally consider a laptop which can't run linux "top notch." But I gave up on macbooks around 10 years ago. You can call me biased.
I just put Asahi on an M2 Air and it works so incredibly well that I was thinking this might finally be the year linux takes the desktop .. I wasn't aware of the drama w/Apple but I imagine M2 hardware will become valuable and sought after over M3+ just for the ability to run Asahi
Again, I've had two 4k monitors on Linux for about ten years, and it has worked well the whole time. Back then I used "gnome tweak" to increase the size of widgets etc. Nowadays its built into mate, cinnamon, etc.
Did you start using Linux on the Mac hardware or on PC hardware? I have a late era Intel Macbook and was considering switching it to Ubuntu or Debian since it is getting kinda slow.
Not the OP, but I have a 2015 Macbook Pro and a desktop PC both running Linux. I love Fedora, so that's on the desktop, but I followed online recommendations to put Mint on the Macbook and it seems to run very well. However, I did need to install mbpfan (https://github.com/linux-on-mac/mbpfan) to get more sane power options and this package (https://github.com/patjak/facetimehd) to get the camera working. It runs better than Mac OS, but you'll need to really tweak some power settings to get it to the efficiency of the older Mac versions.
I switched to a new x86 machine. Running Linux on Mac just made things unnecessarily complicated and hurt performance. Im still open to using docker on Mac to run Linux containers but once you want a GUI life was simpler when I switched off.
Linux is the best. Been using it since Slackware, RedHat first came out, now use Ubuntu or any distro which makes it easy to interact with its Desktop, e.g. GNOME :)
Long-time Linux-on-the-desktop user here. I don't feel Linux has become significantly better recently. It's more that Windows reached a new low that is just below the threshold for many. Also, Apple, what are you doing?
I'd say it's both... In particular 6.16 seems to be a defining point in terms of stability and performance at least for me. My RX 9070XT is finally running with no issues since 6.16 that I've noticed in any of the admittedly few games I play.
Mesa, the kernel drivers and Proton have all seen a lot of growth this past year combined with a bunch of garbage decisions MS has doubled down on... not to mention, enough Linux users in tech combined with Valve/Steam's efforts have made it visible enough that even normies are considering giving Linux a try.
I'm not against Wayland, but I think Wayland is currently not good for the Linux ecosystem. I've had lots of friends try Linux, and they've had issues with Discord global keyboard shortcuts not working, and window positions not restoring at application start, and lots of other small issues, which add up in the end. But once they switched to X11, they've all been very happy.
Yup. I fully understand that X11 is a shitshow under the hood, but it works and Wayland frequently does not work. Screen recording, window positions, various multi-monitor and calibration issues, ...
On the laptop I use to write blog posts, which never ever gets plugged into a second screen? Sure, Wayland's great. On a computer that I expect normal people to be able to use without dumb problems? Hell no!
Comparing X11 and Wayland isn’t even correct because for a functional desktop you need Xwayland anyway. X11 never went away, we piled more code on top and now we have an eternal transitionary period and two ways of doing things.
I think Wayland is good for more technical users. Going from i3 to sway or bspwm to river feels like essentially nothing has changed. On the other hand, Gnome X11 to Wayland might be a bigger shock.
Unfortunately, Wayland inherently can't be like Pipewire, which instantly solved basically 90% of audio issues on Linux through its compatibility with Pulseaudio, while having (in my experience) zero drawbacks. If someone could make the equivalent of Pipewire for X11, that'd be nice. Probably far-fetched though.
It can absolutely be like that. Global keyboard shortcuts not working is a deliberate design choice in Wayland (as is non-foreground apps not having access to the clipboard).
"window positions not restoring at application start"
Well you see, you are actually just silly for wanting this or asking for this, because it's actually just a security flaw...or something. I will not elaborate further.
--geometry is an exploit that will end in your financial ruin. Spend your weekend figuring out which tiling manager and dbus commands will come close to approximating a replacement before giving up and realizing you can manually move windows for the rest of your life. Two plus two is five.
hyprland is a fun spectacle, but takes insane effort to make remotely livable. Also any apparent shortcut (dotfiles) will do nasty damage to your install. Anyone hypr-curious should sandbox in an install they don't mind wiping.
I’m really curious about your experience, what distro you used hyprland on, what dotfiles did damage to your install etc.
I just installed hyprland yesterday and outside of having to switch back to i3 once to install what they had set for a terminal in their default config(kitty), I haven’t had to leave again.
Curious: do enterprises using Windows suffer through all the system-level ads and nagware? Or do they get a version that lets their employees actually focus on work instead of learning the many reasons they should consider switching back to Edge?
It’s all turned on by default even in Windows 11 Enterprise. You can turn everything off via AD Group Policy or your MDM but you have to go through the labyrinth of Windows policies and find them all. Thankfully you only have to do it once and then push it to all of your devices.
No nagware but, at least on the machines of my colleagues, an even worse enemy: Microsoft Defender with all the checkboxes ticked. Grinds the machine to an absolute halt for any development work - sometimes the responsible security department has mercy and gives exceptions for certain folders/processes, sometimes not.
My work machine is grossly slow due to all the various security software.
Loading Teams can take minutes. I'm often late to meetings waiting for the damn thing to load.
Feels like early 90s computing and that Moore's Law was an excuse for bad coding practices and pushing newer hardware so that "shit you don't care about but is 'part of the system'" can do more monitoring and have more control of 'your' computer.
You _can_ curate the Enterprise edition a lot more with Group Policy/Intune and remove all that stuff, but my experience has been that most corporate IT departments don't care or don't know how to do it. And MS will just randomly enable new things without asking, the same as on home editions, so you have to keep an eye on it and go disable them.
Recently switched to Linux Mint from Windows and it has not only been good. It has been cathartic. I enjoy computers again! I am self-hosting some services, what an absolute joy.
Switched in, ooh I dunno, '98 or '99. Quality is about where it was then, relatively speaking. Sure, things have improved, mainly systemd, and we got ACPI and later power-management stuff for laptops.
Prior to that, Windows was better on laptops because it had the proprietary drivers and working ACPI. But it was pretty poor in terms of reliability, and its main problem was that the included software was incredibly bare-bones, combined with how awful the experience of finding and installing software was (especially if you didn't have an unlimited credit card to pay for "big professional solutions").
Every time the year of the Linux desktop arrives, I'm baffled, since not much has changed on this end.
This is a strange statement for me, because I'd say that since '99 almost everything has changed. Maybe your definition of quality is a bit different than mine.
I tried to use Linux back in high school. I had a Pentium 4 computer which was pretty fast for its time. However, I had a dial-up Windows softmodem. You remember the driver situation. I had to boot into Windows to check my email.
Also, I was basically a child and had no idea what I was doing (I still don't, but that's beside the point). Things have definitely gotten better.
I'm sorry, but no. I ran Slackware 96, Red Hat 4.2, Mandrake 5.0, a bunch of Ubuntus from 12.04 onward, and Fedora now. It is absolutely, qualitatively different now than it was at the turn of the century.
In the Red Hat 4.2 days, it was something that I was able to use because I was a giant nerd, but I'd never ever ever have recommended it to a normal person. By Ubuntu 12.04, 15 years later, it was good enough that I'd recommend it to someone who didn't do any gaming and didn't need to use any of the desktop apps that were still then semi-common. In 2026, it's fine for just about anyone unless you are playing particular (albeit very popular) games.
I made the move about a month ago to Bazzite on my desktop with an Nvidia graphics card. I still have my Windows drive for when I need it, but that's pretty rare. Bazzite isn't perfect, but we've reached the point where the rough edges are less painful than the self-sabotage Microsoft has been inflicting on their users in recent versions of Windows.
I tried bazzite but ended up on cachyos. The whole layered / immutable thing got a bit annoying. I'd rather just run snapshots and manage my packages more traditionally
I love the layered thing except for the rough edges. Unfortunately the rough edges for me are that Linux containerization and permissions are completely idiotic.
In Fedora Atomic it should be foolishly easy to set up a system account, with access to specific USB devices via group, and attach a volume that can easily be written to by a non-root user inside of the container.
I think we've reached a point where Windows is about as rough as Linux. But the problem is still that people are familiar with Windows and have learned how to deal with the roughness; not so on Linux. And so long as Windows owns the business and education sectors, it will always have the benefit of that familiarity.
I quit gaming a year ago and no longer have a consumer OS installed on any machine. I can't imagine ever willingly going back after getting used to being able to set my machine up any way I want, and know it will work exactly as I've specified, and won't ever spy on me or monetize my data, and actually has an ecosystem for extending it in basically any way I can imagine, with no bloatware, an app ecosystem with no bundled spyware or adware, etc.
Linux desktop is amazing. Coming from Debian, I installed Windows and had to quickly purge it from my hardware! Super bloated, slow, constantly phoned some CC center, automatically connected to OneDrive, …
Debian is a breath of fresh air in comparison. Totally quiet and snappy.
Debian (stable) is great but I wouldn't use it for a gaming PC on modern hardware. The drivers included are just too old. Bazzite or Arch (DIY option) seem better options.
Debian Stable gamer here, with modern hardware, having a great time.
> The drivers included are just too old.
This can usually be fixed by enabling Debian Backports. In some cases, it doesn't even need fixing, because userland drivers like Mesa can be included in the runtimes provided by Steam, Flatpak, etc.
Once set up, Debian is a very low-maintenance system that respects my time, and I love it for that.
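For example, on Debian 12 ("bookworm") something like this pulls a newer kernel and Mesa from backports; a sketch only, so adjust the codename and components for your release:

    echo 'deb http://deb.debian.org/debian bookworm-backports main contrib non-free-firmware' | \
      sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt update
    # Packages come from backports only when requested explicitly with -t.
    sudo apt install -t bookworm-backports linux-image-amd64 mesa-vulkan-drivers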
HDR still doesn't really work on Linux w/ nVidia GPUs.
1. 10bpp color depth is not supported on RGB monitors, which are the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (See: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...).
2. Common browsers like Chrome and Firefox have no real support for HDR video playback on nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can be displayed with an nVidia GPU.
Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.
The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.
I have Interstellar on 4K UltraHD Blu-ray that features HDR on the cover, Sony 4K Blu-ray player (UBP-X700) and a LG G4 OLED television. I also have an AVR (Denon AVR-S760H 7.2 Ch) connecting both the Blu-ray and a PC running Linux with a RTX 3060 12GB graphic card to the television. I've been meaning to compare HDR on Linux with the Blu-ray. I guess now better than never. I'll reply back to my post after I am done.
Try it with the different monitors you have. The current nVidia Linux drivers only have BGR output for 10bpp, which works on TVs and OLEDs but not on most LCD monitors.
My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means it's limited to 8bpp even with HDR enabled on Wayland. There's another commentator below who uses a Dell monitor and manages to get BGR input working and full HDR in nVidia/Linux.
Test setup:
- Television HDR mode: FILMMAKER
- OLED brightness: 100%
- Energy Saving Mode: off
- Connected to the AVR with an HDMI cable that says 8K
- PC: Manjaro Linux with an RTX 3060 12GB
- Graphics card driver: Nvidia 580.119.02
- KDE Plasma version: 6.5.4
- KDE Frameworks version: 6.21.0
- Qt version: 6.10.1
- Kernel version: 6.12.63-1-MANJARO
- Graphics platform: Wayland

Display configuration:
- High Dynamic Range: "Enable HDR" is checked
- There is a button for brightness calibration that I used for adjustment
- Color accuracy: Prefer color accuracy
- sRGB color intensity: this seems to do nothing (even after apply); I've set it to 0%
- Brightness: 100%

The TV is reporting an HDR signal.

The AVR is reporting:
- Resolution: 4KA VRR
- HDR: HDR10
- Color Space: RGB / BT.2020
- Pixel Depth: 10 bits
- FRL Rate: 24Gbps
I compared Interstellar at 19 seconds into the YouTube video, played three different ways on Linux, against 2:07:26 on the Blu-ray.
For Firefox 146.0.1, by default there is no HDR option on YouTube; the 4K video clearly doesn't have HDR. I enabled HDR in Firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled.
Colors look completely washed out.
For Chromium 143.0.7499.169, HDR is enabled by default. This looks like HDR.
I downloaded the HDR video from YouTube and played it using mpv v0.40.0-dirty with the settings --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright, like the Chromium playback. This was the best playback of the three on Linux.
On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting:
- Resolution: 4k24
- HDR: Dolby Vision
- Color Space: RGB
- Pixel Depth: 8 bits
- FRL Rate: no info
...I looked into this and apparently Dolby Vision uses RGB tunneling for its high-bit-depth (12-bit) YCbCr 4:2:2 data.
The Blu-ray looks like it has the same brightness range but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s).
I would say the colors overall look better on the Blu-ray.
I might be able to calibrate it better if the sRGB color setting worked in the display configuration. Also I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB color setting is fixed.
*Edit: Sorry Hacker News has completely changed the format of my text.
Funny how it went from "just get an Nvidia card for Linux" and "oh my god, what did I do to deserve fglrx?" to "just get an AMD card" and "it's Nvidia, what did you expect?"
They're also selling $3000 nVidia AI workstations that exclusively use Linux. But what if you want to watch an HDR video on one? No. What if you want to use Google Meet on Chrome/Wayland? It's broken.
I don't think this is true. I can go into my display settings in KDE Plasma and enable HDR and configure the brightness. I have an Nvidia Blackwell card.
You can enable, yes. But (assuming you're on an LCD display and not an OLED), you're likely still on XRGB8888 - i.e. 8-bit per channel. Check `drm_info`.
Do it once on "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.
The brightness you see on Plasma or Mutter is indeed related to the HDR support in the driver. But - it's not really useful for the most common HDR tasks at the moment.
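A hedged way to verify this, assuming the drm_info package is installed (the Format: field is the same one another commenter quotes below): check what the primary plane is actually scanning out.

    # XRGB8888 on the primary plane means 8-bit output, regardless of the HDR toggle.
    drm_info | grep -n 'Format:'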
Your Display Configuration
Both monitors are outputting 10-bit color using the ABGR2101010 pixel format.
| Monitor | Connector | Format | Color Depth | HDR | Colorspace |
|------------------------|-----------|-------------|-------------|--------------|------------|
| Dell U2725QE (XXXXXXX) | HDMI-A-1 | ABGR2101010 | 10-bit | Enabled (PQ) | BT2020_RGB |
| Dell U2725QE (XXXXXXX) | HDMI-A-2 | ABGR2101010 | 10-bit | Disabled | Default |
* Changed the serial numbers to XXXXXXX
I am on Wayland and outputting via HDMI 2.1 if that helps.
EDIT: Claude explained how it determined this with drm_info, and manually verified it:
> Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.
EDIT: Also note that I am slowbanned on this site, so may not be able to respond for a bit.
EDIT: You should try connecting with HDMI 2.1 (you will need a 8k HDMI cable or it will fall back to older standards instead of FRL).
EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only 1 of my monitors and I can see a big difference in the flames between them on this scene: https://www.youtube.com/watch?v=WjJWvAhNq34
I don't have a Dell U2725QE, but on InnoCN 27M2V and Cooler Master GP27U there's no ABGR2101010 support. These monitors would only work with ARGB2101010 or XRGB2101010 which nVidia drivers do not provide.
HDR playback in chrome on KDE works as expected from what I can tell. For GNOME 49.2 it does not, it doesn't get the luminance that it should at this time. 49.3 may fix this.
I don’t think your problem is RGB instead of BGR. That’s just the compositor’s work area and your monitor never sees it (it includes an alpha channel). Have you tried KDE Plasma? It sounds like KWin uses 10-bit planes by default when available. Maybe Ubuntu’s compositor (Mutter?) doesn’t support 30 bit color or must be configured? Or maybe you need the nvidia driver >= 580.94.11 for VK_EXT_hdr_metadata (https://www.phoronix.com/news/NVIDIA-580.94.11-Linux-Driver)
It's not obvious how to interpret the output. I pasted it into chatgpt and it thinks I am using "Format: ABGR2101010" for both monitors (only 1 has HDR on) so I don't trust it.
After a few months of testing the waters, I just moved my gaming PC over to full-time Linux this weekend. Proton has really been revolutionary, as I haven't yet encountered something in my Steam library that won't work.
It is good, and for 99+% of use cases for 90+% of users (who mostly use nothing but the browser), they will hardly even notice a difference, besides the lack of obnoxious, intrusive MS behavior.
However, despite really, really wanting to switch (and having it installed on my laptop), I keep finding things that don't quite work right, which are preventing me from switching some of my machines. On my living room PC, which is what my TV is connected to, the DVR software that runs my TV tuner card doesn't quite work right (despite having a native Linux installer), and I couldn't get channels to come through as clearly or as easily. I spent a couple of hours troubleshooting and gave up.
My work PC needs to have the Dropbox app (which has a Linux installer), but it also needs the "online-only" functionality so that I can see and browse the entire (very large) Dropbox directory without needing to have it all stored locally. This feature has been requested for the Linux version of the app for years, and Dropbox appears unlikely to add it anytime soon.
Both of these are pretty niche issues that I don't expect to affect the vast majority of users. (The Dropbox one in particular shouldn't be an issue at all if my org didn't insist on using Dropbox in a way it is very much not intended to be used, and for which better solutions exist, but I gave up on that fight a long time ago.) And like I said, I've had Linux on my laptop for a couple of years so far without any issue, and I love it.
I am curious how many "edge cases" like mine exist out there though. Maybe there exists some such edge case for a lot of people even while almost no one has the same edge case issue.
A FUSE mount will provide Dropbox in a more integrated way than Windows does (e.g. usable from the terminal), and a cursory Google revealed some projects for Dropbox that do the JIT download you are after. They are old, but I wager they still work just fine (an inactive project can just mean that it's complete).
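One concrete, hedged option is an rclone VFS mount, which gives roughly the "online-only" behaviour by fetching files on demand; this assumes you've already created a Dropbox remote named dropbox: with rclone config:

    # Mount Dropbox with on-demand download and a bounded local cache.
    rclone mount dropbox: ~/Dropbox \
      --vfs-cache-mode full --vfs-cache-max-size 10G --daemon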
I just switched to Linux. It's a great gig, and I'm actively encouraging everyone I know still infected with the malware known as Windows 11 to switch.
But some of the drawbacks really aren't edge cases. Apparently there is still no way for me to have access to most creative apps (e.g. Adobe, Affinity) with GPU acceleration. It's irritating that so few Linux install processes are turnkey the way they are for Windows/Mac, with errors and caveats that cost less-than-expert users hours of experimenting and mucking with documentation.
I could go on, but it really feels like a bad time to be a casual PC user these days, because Windows is an inhospitable swamp, and Linux still has some sharp edges.
I use OneDrive and Google Drive heavily and there just are not good clients for Linux for those that I have found. Especially with the ability to not sync files but still "look" like they are there in the filesystem. That is my main stopper now.
Whilst initially reluctant, I made a one-off payment to Insync (https://www.insynchq.com) for my Google Drive account many years ago, and it has worked flawlessly.
There are plenty. I run only Linux at home but CAD software for hobbies (Fusion 360), most games that want kernel level anti cheat, some embedded DRM-enabled media, all sort of just fail. Other things, like GPU tuning or messing with your displays/drivers are harder than they should be. My Bluetooth earbuds just don't work with my Linux machines.
I don't think he did get it running. It's one of my main blockers as well. Last time I tried, I got as far as it starting up and logging in to their identity server via the browser, but the redirect back to the application didn't work. Such a silly thing to prevent it from working. Why does a CAD program need online auth, anyway? (I know the reason, but it's an annoying one.)
I agree, I'd cut off dual booting and go full Linux when the hardware and software I use supports it. One of which being a PCIe Elgato capture card, another being an audio mixer with no driver support and the alternatives are very hacky and too complicated for me.
I permanently switched from Windows to Linux about five years ago. I had the same issue as you with Dropbox, so I switched to using the Maestral client for Dropbox instead which has support for selective sync. Works like a charm for me.
Just recently started using the desktop machine (under my desk, as opposed to my laptop which sits on my desktop) and put NixOS on it, and found myself pleasantly surprised. There's certainly still some parts of NixOS that require some expertise and getting your head around its package model, but overall I was surprised at how idiotproof it was to install and use. I mostly play games on it with Steam, which also Just Works.
NixOS is really a profound experience, once you embrace it. I used Arch for ~3 years and ended up reinstalling it maybe 15 times on my desktop alone. Switched to NixOS and I've used the same installation for 3 years, synced with my laptop and server, switching from x11 to Wayland to KDE to GNOME then back again with no problem.
It doesn't feel real sometimes. My dotfiles are modularized, backed up in Github and versioned with UEFI rollback when I update. I might be using this for the rest of my life, now.
I also have the same Arch install from 2014 on my main hardware. Each replacement computer is nothing more than taking the old drive out, placing it into a USB enclosure, booting a live USB, setting up the partitions on the new drive, and _rsync_ing the content from the old drive to the new, finalizing with registering the UEFI boot loader.
One just needs to make sure to use the proper _rsync_ command options to preserve hard links, or files will be duplicated.
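For the record, the kind of invocation meant here (a sketch; the paths are examples, and -H is the flag that preserves hard links):

    # Archive mode plus hard links, ACLs and xattrs; --numeric-ids keeps
    # ownership correct when copying from a live-USB environment.
    rsync -aHAXv --numeric-ids /mnt/old-root/ /mnt/new-root/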
I personally remember being inspired by Erase your Darlings and Paranoid NixOS Setup back in the day, less for the hardening measures and more because of how great the Nix syntax looked. Huge, monumental ass-pain setups could be scripted away in one or two lines like it was nothing. You could create wildly optimized configurations for your specific use-case, and then divide them into modules so they're portable.
It's not advisable to switch to one of these paranoid configurations outright, but they're a great introduction to the flexibility provided by the NixOS configuration system. I'd also recommend Xe's documentation of Nix Flakes, which can be used on any UNIX-like system including macOS: https://xeiaso.net/blog/nix-flakes-1-2022-02-21/
For what it's worth: I no longer suggest the use of NixOS for any purpose. I only have one NixOS system in my house because it's my NAS and I am a coward.
I've been on the Linux desktop for ages, but it's not quite stable enough that I can recommend it to anyone. Space Marine 2 was the first game in quite a while that didn't just work out of the box, but...
E.g. three weeks ago nvidia pushed bad drivers which broke my desktop after a reboot. I had to switch to a virtual console (Ctrl-Alt-F3, etc.) because I never got into GNOME at all, and roll back to an earlier version. Automatic rollback of bad drivers would have saved this.
It might depend on the distro, but were you running a 10-series card or earlier? They dropped Pascal and earlier GPUs with the v590 driver, and I know Arch migrated what the nvidia package installed in such a way that it could leave someone without an appropriate driver unless they manually moved to a different source.
Then again Arch is one of those distros that has the attitude that you need to be a little engaged/responsible for ongoing maintenance of your system, which is why I'm against blind "just use (distro)" recommendations unless it's very basic and low assumptions about the user.
I've had mixed experiences with AMD. Back in the day - a bit after Linus told Nvidia to fuck off - I tried to get my Radeon HD 5850 (I think?) working on Ubuntu. It was one of those things I spent the whole weekend (OS reinstalls really add up) trying to make work, to no avail. Relative to that nonsense, the equivalent proprietary Nvidia driver just worked after being installed.
A couple of months ago I bought a second hand RX 7800 XT, and prepared myself for a painful experience, but I think it just worked. Like I got frustrated trying to find out how to download and install the driver, when I think it just came with Linux Mint already.
I've been using a full AMD build with Arch on it for years now. Never had graphics-related issues after an update. My biggest gripe is with the HDMI Forum and how we can't have proper support in the open-source drivers.
A long time ago when I was in university, I was a volunteer in the Ubuntu group. In addition to evangelizing Linux/OSS, we were trying to convince our university to switch to open-source software for at least some engineering education, with only a little bit of success.
After a particularly busy OSS event, a non-programmer friend of mine asked me: why is it that the Linux people seem to be so needy for everyone to make the same choices they make? Trying to answer that question changed my perspective on the entire community. And here we are; after all these years the same question seems to still apply.
Why are we so needy for ALL users and use cases to be Linux-based and Linux-centric once we make that choice ourselves? What is it about Linux? The BSD people don't seem to suffer from this, and I've never heard anyone advocate for migration to macOS in spite of it being superior for specific use cases (like music production).
IMO if you're a creator, operating systems are tools; use the tool that fits the task.
It's bad for society for the desktop OS market to be a proprietary monopoly. It basically allows Microsoft to extract rent from the public.
I do understand the evangelism being obnoxious. I don’t advocate for people to switch if they have key use cases that ONLY windows or OS X can meet. Certainly not good to be pushy. But otherwise, people are really getting a better experience by switching to Linux.
When you (try to) use libre software, the problems you run into tend not to be related to insufficient engineering, but more societal and economic, where they would be less likely to appear if there were more people in your cohort.
Examples:
- An important document is sent to me in a proprietary format
- A streaming service uses a DRM service owned by a tech giant that refuses to let it work with open source projects
- A video game developer thinks making games work on Linux isn't worth getting rid of rootkit anticheat
The downside is Windows users would have to live in a world without subscription-based office suites, locked down media, and letting the CCP into your ring 0.
> why is it that the Linux people seem to be so needy for everyone to make the same choices they make?
This is the sort of question an apolitical person would ask a liberal (I am aware liberalism has been tainted in recent times): why is it you people are so needy and constantly preaching about democracy?
I moved to Linux this month for good once I realized I no longer needed Microsoft services (Excel, for example, "runs on Mac" but is missing important features). I chose Red Hat because it's what I've been using for over a decade at work, and it feels like home. The only thing I miss is CapCut, as that workflow was pretty ironed out. Getting the hang of Kdenlive.
Been so happy with my switch to Linux about 8 months ago. The nvidia gremlins that stopped me in prior years are all smoothed out.
One big plus with Linux, it's more amenable to AI assistance - just copy & paste shell commands, rather than follow GUI step-by-steps. And Linux has been in the world long enough to be deeply in the LLM training corpuses.
I’d argue the majority of casual online PC discourse is driven by gaming. By the numbers LTT is the largest PC/IT/consumer computer YouTube channel and the majority of their content is focused on gaming.
I'm glad that I am not the only one saying this. I made the switch 20+ years ago for my day to day use, and I have rarely experienced any problems with it.
- Firefox seems to be able to freeze both itself and, sometimes, the whole system. Usually while typing text into a large text box.
- Recently, printing didn't work for two days. Some pushed update installed a version of the CUPS daemon which reported a syntax error on the cupsd.conf file. A few days later, the problem went away, after much discussion on forums about workarounds.
- Can't use more than half of memory before the OOM killer kicks in. The default rule of the OOM killer daemon is that if a process has more than half of memory for a minute, kill it. Rust builds get killed. Firefox gets killed. This is a huge pain on the 8GB machine. Yes, I could edit some config file and stop this, but that tends to interfere with config file updates from Ubuntu and from the GUI tools.
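For anyone hitting the same thing: the daemon in question on Ubuntu is systemd-oomd, and two hedged options are to relax its limits with a drop-in (which survives package updates better than editing the shipped config) or to disable it and fall back to the kernel OOM killer. Values here are examples, not recommendations:

    # Option 1: relax systemd-oomd via a drop-in instead of editing oomd.conf.
    sudo mkdir -p /etc/systemd/oomd.conf.d
    printf '[OOM]\nDefaultMemoryPressureDurationSec=2min\n' | \
      sudo tee /etc/systemd/oomd.conf.d/override.conf
    sudo systemctl restart systemd-oomd

    # Option 2: turn the userspace killer off entirely.
    sudo systemctl disable --now systemd-oomd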
These seem annoying, but I'd argue that these problems are in some ways less significant on Linux than on Windows. If some function of Windows is broken or unsatisfactory, there is not necessarily a way to fix it.
But you can adjust your own system. It'd be unhelpful of me to suggest to an unhappy Windows user that they should switch to another operating system, as that demands a drastic change of environment. On the other hand, you're already familiar with Linux, so the switching cost to a different Linux distribution is significantly lower. Thus I can fairly say that "Ubuntu getting worse" is less of a problem than "Windows getting worse." You have many convenient options. A Windows user has fewer.
What are you talking about? Firefox hasn't been single-process for more than 10 years. At most, it uses 7% for the main process, and I have thousands of tabs open. I can't speak to the other two, but I've had processes use 60% of system memory without problems (everything else gets slow due to swapping, but that's expected).
I've been really enjoying my experience using CachyOS on my (formerly Windows) gaming PC. I chose to use Limine and btrfs so now if it gets borked by a bad package install/uninstall I can roll back pretty easily. My next step is to replace my Nvidia GPU with an AMD one so I can stop worrying about that aspect in the future.
I have been using Linux in different flavours for the past 10 years. It has become more reliable over that time. The last 5 years have had noticeably few issues across distros.
Linux desktops have felt flaky for me for a few years now. I’m trying to figure out how much of that is bad choices vs real problems.
Ubuntu’s default desktop felt unstable in a macOS VM. Dual-booting on a couple of HP laptops slowed to a crawl after installing a few desktop apps, apparently because they pulled in background services. What surprised me was how quickly the system became unpleasant to use without any obvious “you just broke X” moment.
My current guess: not Linux in general, but heavy defaults (GNOME, Snap, systemd timers), desktop apps dragging in daemons, and OEM firmware / power-management quirks that don’t play well with Linux. Server Linux holds up because everything stays explicit. Desktop distros hide complexity and don’t give much visibility when things start to rot.
Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
The only real obnoxious slow-down daemons I'm familiar with are the "system indexing" things (GNOME Tracker, KDE Baloo) -- highly recommend disabling them.
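In case it saves someone a search, the usual incantations look like this; treat it as a sketch, since tool names shift a bit between desktop versions:

    # KDE Baloo: stop indexing and drop the existing index (balooctl6 on Plasma 6).
    balooctl disable && balooctl purge
    # GNOME Tracker 3: mask the filesystem miner for your user.
    systemctl --user mask --now tracker-miner-fs-3.service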
I've been using Kubuntu for years with good results. I prefer KDE to Gnome, which Kubuntu takes care of, and I normally add in the flatpak repositories so I don't need snap. That has generally worked well for me in the last 5 years.
For certain timeperiods I have needed to switch to Fedora, or the Fedora KDE spin, to get access to more recent software if I'm using newer hardware. That has generally also been pretty stable but the constant stream of updates and short OS life are not really what I'm looking for in a desktop experience.
There are three issues that linux still has, which are across the board:
- Lack of commercial mechanical engineering software support (CAD & CAE software)
- Inability to reliably suspend or sleep for laptops
- Worse battery life on laptops
If you are using a desktop and don't care about CAD or CAE software I think it's probably a better experience overall than windows. Laptops are still more for advanced users imho but if you go with something that has good linux support from the factory (Dell XPS 13, Framework, etc.) it will be mostly frictionless. It just sucks on that one day where you install an update, close the laptop lid, put it in your backpack, and find it absolutely cooking and near 0% when you take it out.
I also have never found something that gave me the battery life I wanted with linux. I used two XPS 13's and they were the closest but still were only like 75% of what I would like. My current Framework 16 is like 50% of what I would like. That is with always going for a 1080p display but using a VPN which doesn't help battery life.
We live in a world with the internet and distributed version control, so essentially every piece of software in the world has a tradeoff where the people maintaining it might push an update that breaks something at any time, but also those updates often do good things too, like add functionality, make stuff more efficient, fix bugs, or probably most crucially, patch out security vulnerabilities.
My experience with FOSS has mostly been that mature projects with any reasonably-sized userbase tend to break things in updates less often than proprietary software does, whether it's an OS or just some SaaS product. YMMV. However, I think the most potent way to keep problems like this from ever mattering is a combination of doing my updates manually (or at least on an opt-in basis) and being willing to go back a version if something breaks. Usually this isn't necessary for more than a week or so for well-maintained software, even in the worst case. I use Arch with downgrade (which lets you go back and choose an old version of any given package) and need to actually use downgrade maybe once a year on average, less in the last 5 years.
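For anyone unfamiliar with it, the workflow is roughly this (downgrade is an AUR helper; it reads the local package cache and the Arch Linux Archive and prompts you to pick a version):

    # Roll the NVIDIA driver back to an earlier version; downgrade offers to add
    # the chosen packages to IgnorePkg so the next -Syu won't re-upgrade them.
    sudo downgrade nvidia nvidia-utils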
> Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
No, not really. A Linux desktop with a DE will always be slower and more brittle than a headless machine due to the sheer number of packages/components, but something like Arch + Plasma Shell (without the whole KDE ecosystem) should be very stable and snappy. The headaches caused by immutable distros and Flatpaks are not worth it IMO, but YMMV.
With debian and KDE (both personal preference), but no snap or flatpak, it works wonderfully. Power/sleep-management has become better than a default windows install. All hardware, including the fingerprint sensor, just works.
Not really, no. What did you install that slowed things down?
> If yes, what actually works long-term?
Plain ordinary Ubuntu 24.04 LTS, running on an ancient Thinkpad T430 with a whopping 8GB of RAM and an SSD (which is failing, but that's not Linux's fault, it's been on its way out for about a year and I should probably stop compiling Haiku nightlies on it).
Can you give an example of which desktop apps are "dragging in daemons"?
I've run Void Linux + Xmonad for many years without any such issues. I also recently installed CachyOS for my kid to game on (KDE Plasma) and it works super well.
I'm slowly de-Microsofting my computing. I've traded OneDrive for Syncthing. I ditched one PC for a Mac. I have the technical skills to run Linux effectively, but the biggest obstacle for my Linux adoption is distro fatigue. Run Ubuntu? Debian? Fedora? PopOS? Kubuntu? Arch? The article introduced yet another one to consider--Bazzite.
The Linux world is amazing for its experimentation and collaboration. But the fragmentation makes it hard for even technical people like me who just want to get work done to embrace it for the desktop.
Ubuntu LTS is probably the right choice. But it's just one more thing I have to go research.
As a beginner, just pick Ubuntu and get on with your life imo. Switching distros isn't that big of a lift later on and pretty much everything you learn carries over from one to the other. It's much more worthwhile to just pick _something_ and learn some basics and become comfortable with the OS imo.
Pick a popular distro, and during installation, put your /home directory on its own partition. This way, you won't have much to reconfigure if you ever have a reason to switch distros. (You might not ever have a reason; they're all pretty capable.)
I think fragmentation is the wrong way to look at it; they're all basically compatible at the end of the day. It's more like an endless list of people who want to min-max.
Any reasonably popular distro will have enough other users that you can find resources for fixing hitches. The deciding factor that made me go with EndeavourOS was that their website had cool pictures of space on it. If you don't already care then the criteria don't need to be any deeper than that.
Once you use it enough to develop opinions, the huge list of options will thin itself out.
Ubuntu stopped caring about the desktop experience when they switched to GNOME. Now they have annoying snaps. They are a business, and they are going to continue enshittifying it.
I haven't tried Bazzite because I'm not into gaming but Linux Mint is working very well for a lot of people coming from Windows. It just works and has great defaults. Windows users seem to pick it up pretty easily.
Also, Linux Mint upgrades very well. I've had a lot of success upgrading to new versions without needing to reinstall everything. Ubuntu and other distros I've tried often have failed during upgrading and I had to reinstall.
Every year at around this time there is a lot of linux related content in tech media.
It's a slow moving evergreen topic perfect for a scheduled release while the author is on holiday. This is just filler content that could have been written at any point in the last 10 years with minor changes.
I've been working on the Linux desktop for 20 years, and I've been using it on the desktop since 1999, so I lived through the infamous "Year of the Linux Desktop" era.
I've not seen anything like the current level of momentum, ever, nor this level of mainstream exposure. Gaming has changed the equation, and 2026 will be wild.
Not just gaming. This year, both Windows and Mac OS had absolutely terrible years. The Mac effed up its UI with liquid glass, to the point where Alan Dye fled to Meta. Microsoft pushed LLMs and ads into everything, screwing up what was otherwise a decent release.
On the other hand, on the Linux side, we had the release of COSMIC, which is an extremely user-friendly desktop. KDE, Gnome, and others are all at a point where they feel polished and stable.
To be honest, I always figured we'd make it in the long run. We're a thrifty bunch, we aim to set up sustainable organizations, we're more enshittification-resistant by nature. As long as we're reliable and stick around for long enough.
I don't think the prevalence of these articles this time of year is because the authors go on holiday, but instead is because the new year is the perfect time to ponder: "Will this be the year of the Linux desktop?"
I guess everyone’s in a “fuck it I’m ready to try some new stuff” mood too so this content is perfectly suited for new years. Would never have noticed this without your comment.
Except every year you didn't have people like Pewdiepie and DHH pushing Linux, as well as channels like GamersNexus doing Linux benchmarks, at the same time as Windows and Mac making very dumb mistakes. So this time it does feel different, even if it might not be in the end.
I use a Linux PC every day but I wouldn't recommend it to normal people. They're not going to feel any renewed sense of ownership from it, just annoyance at having to think about technical gibberish when they just want to get on with using the computer.
Yeah, because getting ads and pushed to use more Microsoft products all day long isn't an annoyance when they just want to get on with using the computer.
I’ve been around the block with Linux distributions since 2020. I personally think that Bazzite is the way to go for most people coming from Windows, or people experienced with Linux that want something as close to “set and forget” as you can.
One thing that can be annoying is how quickly things have moved in the Linux gaming space over the past 5 years. I have been a part of conversations with coworkers who talk about how Linux gaming was in 2019 or 2020. I feel like anyone familiar with Linux will know the feeling of how quickly things can improve while documentation and public information cannot keep up.
If Microsoft could get their heads out of their rears, they could potentially get back to a better OS for gaming. The hybrid kernel Dave Cutler designed is in many ways still better than the Linux kernel. It's the userland that is the issue with Windows 11. Just by enabling true NVMe support you close the gap between Linux and Windows, performance-wise.
I have been switching between Linux and Windows for a while now, and I think 2026 is not the year of Linux, at least not yet.
Linux still suffers from the same fragmentation issue: oh, you want to play games, you should use distro X; oh, you want average web browsing and office work, you should use distro Y; for programming, use Z. Of course all of them can do what the others can do, but the community has decided that's the way it is.
Yesterday I read a Reddit thread about a user sharing his issue with Pop!_OS, and most (if not all) comments said he was using the wrong distro. He was using the latest release (not the nightly build), which is a reasonable thing to do as a new user.
Not sure if Linux Mint has changed this, but I remember having to add the "non-free" repo to use the official Nvidia driver. Not a big deal for people who know what they are doing, but still, that's unnecessary friction.
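If memory serves, on plain Debian it's roughly a matter of adding the extra components to your sources and installing the packaged driver (the release name here is just an example):

    deb http://deb.debian.org/debian bookworm main contrib non-free non-free-firmware

    sudo apt update && sudo apt install nvidia-driver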
I just got a laptop for Christmas (first thing I've bought for myself in a good while) with 64GB of DDR5 RAM, a video card inside of it, AMD Ryzen 7 CPU, AMD Radeon 6550M. 144hz screen.
Not the best, but works for me.
I put CachyOS on it. Using Steam, you just run the game's installer after adding it as a (non-Steam) game to your library -- you select which Proton you want (cachyos-proton) from a dropdown under Properties in the Steam library. That's it.
It's lightweight, Arch-based (I ditched Manjaro), runs KDE and games perfectly, Cursor IDE runs great, VMs run great.
First thing I did when I got it from FedEx was remove Windows and put Linux on it. I thought 'maybe I'll just bite the bullet and sign up for a Microsoft cloud account to be able to access ...my desktop' and a quarter of the way through its install I held the power button and popped a flash drive in. Just say no to Windows and you'll all be happy, trust me.
The only effort it required was for me to say 'f this' on using Lutris and just use Steam as the wrapper.
2026 is definitely the year for Linux. Every year is. Valve heavily invested in Arch and Proton, and is using Linux on their devices. And honestly: Windows is spyware, and after their vibe-coded, janky 25H2 update that broke a ton of things, and Windows 10 going EOL, I hope more people get to enjoy throwing Ventoy on a USB stick with a bunch of Linux ISOs copied over to it, and boot and play with what they love.
So I disagree, 2026 is the year for Linux, and Linux is love.
The success measurements are quite strange. How am I supposed to think Linux is finally good when 96.8% of users do not care to adopt it. I can't think of anything else with that high of a rejection rate. The vast majority do not consider it good enough to use over Windows.
I have always wanted to use linux as my main OS. I tried with Ubuntu twice the past and always ran into really painful hurdles or missing features. This year I tried again with Mint and it absolutely stuck the landing. I have completely switched my desktop and laptop (and plex server) to mint. I have never even booted back into windows. I have not had any big issues and have been able to make it better than my windows desktop ever was.
I've been sceptical of the 'Linux desktop' for a long time, but I recently started using Bazzite on my gaming PC and I'm super impressed. In just a few years since I last daily drove a Linux distro it's come such a long way. KDE Plasma is fast and beautiful.
So far all the games I want to play run really well, with no noticeable performance difference. If anything, they feel faster, but it could be placebo because the DE is more responsive.
There is a strange, but pleasant feeling when you hear someone claiming “they’re early to Linux” and think it’s going to be something big. (Happened recently.)
We've made the switch and it's been great. On top of that, my partner, who is not a computer person, picked up Linux Mint within a couple of weeks to the level at which she can use Windows.
Linux has been good for years. The only thing that's changed is that Valve put a bunch of effort into Proton so now Linux has enough game titles for that to no longer be an excuse to not switch.
I've been using Linux full-time (no other OSes at all) for nearly 20 years. Went through all my university education using only Linux. It's problem free if you use it like a grandma would (don't mess with the base system) and even if you mess with it, most things are easily reversible.
That being said, I have noticed that the newfound interest in Linux seems to be a result of big tech being SO abusive towards its customers that even normies are suddenly into computing "freedom".
What is really blocking the move for me is Zscaler and Zoom (they may exist on Linux, not sure how well integrated they are), but especially Outlook (the client). The OWA version is subpar and without it I cannot function in a work environment.
> without it I cannot function in a work environment.
This is more about what you choose as your operating environment, not what your work imposes as your working environment.
Most places of work, mine included, run Microsoft services that lock them into the ecosystem incredibly tightly.
As per the article title, "if you want to feel like you actually own your PC", this is about your PC, not the one provided to you by your workplace (since it's likely owned by them).
One thing I'm worried about in my work environment is Microsoft enforcing the web versions of Office and deprecating the stand alone desktop applications. The web versions are a massive step down in terms of functionality and ease of use. Your mention of OWA makes me feel as if that is what Outlook will be sacrificed for at some point in the future anyway.
I had a similar issue, but I ended up installing Debian and running Windows 10 as a virtual machine with VirtualBox. The webcam can be accessed as if it were installed on the guest OS, and I haven't had a problem with Zoom or Teams. Just sharing in case it helps.
Linux has been my main and sole desktop since around 2006. I needed Windows for TurboTax a few hours a year in the past, but that's it. I didn't do PC games though, just regular desktop stuff including developing code.
Hahaha. Try sharing a couple of old printers and scanners connected to a Linux box on your home network. At best, when it's working, you get lowest-common-denominator functionality. Want to run some VMs? Works great until you update your distro and the VM host's kernel modules aren't compatible anymore. Oh, want to use a later version of some package like Docker? Did I use apt or snap or Flatpak???
Yes, you can get this stuff working, but if you enjoy doing other things in life, have a job and don't live alone, it is SSSOOOOO much easier to get a Mac mini. Or even Windows 11 if that's your thing.
Sounds ultra-specific to your experience. VMs, package management and networking are all things that macOS and Windows stumble with for regular usage. I've used all three OSes professionally, and Linux requires the least configuration to get work done.
I love this. I spent my holidays hearing non-technical family members complain about their ever deteriorating Windows experiences, issues that make me righteously angry at Microsoft.
IMO the next important unblocker for Linux adoption is the Adobe suite. In a post-mobile world one can use a tablet or phone for almost any media consumption. But production is still in the realm of the desktop UX, and photo/video/creative work is the most common form of output. An Adobe CC Linux option would enable that set of "power users". And regardless of their actual percentage of desktop users, just about every YouTuber or streamer talking about technology is by definition a content creator, so opening Linux up to them would have a big effect on adoption.
And yes I've tried most of the Linux alternatives, like GIMP, Inkscape, DaVinci, RawTherapee, etc. They're mostly /fine/ but it's one of the weaker software categories in FOSS-alternatives IMO. It also adds an unnecessary learning curve. Gamers would laugh if they were told that Linux gaming was great, they just have to learn and play an entirely different set of games.
Photoshop (for example) largely works in Wine, although it's not stable enough for production usage. The problem is the CC itself and the installer, which is unimaginably bloated and glued to the Internet Exp... I mean Edge Web View and many other Windows-only things.
Yes. The reason the year of the Linux desktop has yet to arrive is because most people don't understand this joke. Linux is powerful because it is made for power users (although certain distros are changing this)
I switched in 2020. I run Fedora and Arch. I don’t miss MacOS at all. The last Windows I used was 8, so my opinion is out of date, but yeah… I don’t miss Windows, either.
cachyos is a good os that is also performant. arch though so there are quirks around the rolling update model but you always have the newestish packages and if you update regularly there seems to be less headache.
I tried a number of distros and settled on Omarchy because it has a coherent design and nice aesthetics, but it has some weird quirks about messing with my dotfiles on updates. It's so new I suspect this will be ironed out soon.
Honestly I loved it a lot more pre-2022, before Ubuntu added a super aggressive OOM killer that only operates on the level of an entire systemd run unit. Meaning that if you are running computation in, say, a shell and one of your subprocesses running computation takes too much memory, it takes out the entire shell and terminal window, leaving no trace of what happened, including all the terminal logs.
And if you are running Chrome, and something starts taking a lot of memory, say goodbye to the entire app without any niceties.
(Yes, this is a mere pet peeve, but it has been causing me so much pain over the past year, and it's such an inferior way to deal with memory limits compared to what came before it. I don't know why anybody would have taken OOM logic meant for systemd services and applied it to user-launched processes.)
I have to wonder if Ubuntu's prescriptive stance on things like this is becoming increasingly outdated in an age where there's actually a decent out-of-the-box experience for a lot more stuff on Linux. I've long since moved on from using it personally for my devices, but I'm fairly certain my tolerance for spending effort tinkering to get things working like I want is a lot higher than even most Linux users', so it's hard for me to gauge whether the window has moved significantly in that regard for the average Linux user.
It's not just Ubuntu; Arch is just as bad. The primary problem is systemd, which provided an adequate oomd for daemons, but then all the distributions seem to be using it for interactively launched processes.
If anybody can help me out with a better solution with a modern distribution, that's about 75% of the reason I'm posting. But it's been a major pain and all the GitHub issues I have encountered on it show a big resistance to having better behavior like is the default for MacOS, Windows, or older Linux.
It's funny how you say the way it used to be was better when people always complained about the OOM killer waiting until the system had entirely ground to a halt before acting, to the point some preferred to run with 0 swap so the system would just immediately go down instead.
Regardless, I believe EarlyOOM is pretty configurable, if you care to check it out.
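On Debian/Ubuntu its knobs live in /etc/default/earlyoom, something like the following (flags from memory, check earlyoom --help); it starts killing the worst offender once available memory drops below the threshold, while sparing the processes you tell it to avoid:

    EARLYOOM_ARGS="-m 5 -s 10 --avoid '^(sshd|Xorg|gnome-shell)$' --prefer '^(chrome|java)$'"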
I find it interesting how many people have Ubuntu in mind when it comes to a Linux desktop when it hasn't been a great experience ever since they switched to Gnome. They don't really care about the desktop anymore. They are now a corporation that is enshittifying their product with things like snaps.
If you want a distro that really cares about the desktop experience today, try Linux Mint. Windows users seem to adapt to it quite quickly and easily. It's familiar and has really good defaults that match what people expect.
This is really annoying me as well. I use a program for work that can occasionally use a lot of RAM, while saving or interpolating for example. On my little MacBook Air with just 8GB of RAM everything works fine; it just swaps a whole lot more for a short period. On my desktop with 16GB of RAM and Ubuntu, the OOM killer just kills it. My workaround is the swapspace package, which adds swap files under high load; works so far.
It sounds like your primary issue is that you have a severe RAM deficiency for what you're trying to use your machine for. Any OOM killer, be it the kernel's per-process one or systemd-oomd's per-service one, only exists to try to recover from an out-of-memory scenario where the alternative is to kernel panic (in the case of the kernel's oom killer) or for the system to completely lock up (in the case of systemd-oomd).
My primary issue is that a system that did an OK job at dealing with low memory situations has been replaced with a completely inadequate system.
If your solution is "don't ever run out of memory" my solution is "I won't ever use your OS unless forced to."
Every other OS handles this better, and my work literally requires pushing the bounds of memory on the box, whether it's 64GB or 1TB of RAM. Killing an entire cgroup is never an acceptable solution, except for the long-running servers that systemd is meant to run.
As far as I know, Windows just grinds to a halt entirely, system processes start crashing, or you get a BSOD, and mobile OSes kill the app without any trace. I never had an OOM situation on Macs so I don't know about macOS.
Windows is unstable even if you have more than enough memory but your swap is disabled, due to how its virtual memory works. It generally behaves much worse than others under heavy load and when various system resources are nearly exhausted.
There are several advanced and very flexible OOM killers available for Linux, you can use them if it really bothers you (honestly you're the first I've seen complaining about it). Some gaming/realtime distros are using them by default.
Even NT4 handled OOM scenarios better than modern Linux. No, it didn't grind to a halt, it would grind the rust off of the spinning platters. But it would continue to run your applications until the application was finished or you intervened.
The kernel OOM killer has never done an adequate job for me. It tends to hesitate to kill anything until the system has literally been completely 100% unresponsive for over half an hour. That's completely unacceptable. Killing a cgroup before the system becomes unresponsive is a million times more desirable default behaviour for a normal desktop system (which Ubuntu Desktop is).
Of course, if it's absolutely not compatible with your work, you can just disable systemd-oomd. I'm wondering though, what sort of work are you doing where you can't tune stuff to use 95% of your 1TB of memory instead of 105% of it?
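For anyone in that situation, switching it off is just something along these lines (on Ubuntu it also ships as its own package, if you'd rather remove it outright):

    sudo systemctl mask --now systemd-oomd.service
    # or on Ubuntu: sudo apt remove systemd-oomd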
What would that be useful for? A properly implemented software passkey like Keepassxc would be secure against anything short of a local root exploit. A TPM would not really help against that either.
An ex lease Thinkpad T Series with Intel graphics is a good choice for value and compatibility. eg a T490 or T14 era machine.
Using hardware at least 6-12 months old is a good way to get better compatibility.
Generally Linux drivers only start development after the hardware is available and in the hands of devs, while Windows drivers usually get a head start before release. Brand new hardware on a LTS (long term support) distro with an older kernel is usually the worst compatibility combo.
The personal desktop has fallen in relevance enough for that to be possible. The goalposts moved; now Linux needs to have phone, tablet, and laptop with smooth, effortless integration between them all.
I recently switched to using a thumb drive to transfer files to and from my phone/tablet, after becoming demoralized when faced with getting it all set up.
KDE has phone and laptop integrated well enough for me. It's worth giving it a try but the more devices you want integrated the more of a risk it is in case it doesn't quite work right. But I've got enough other devices in the house which I can't put KDE on (work laptop, Windows machine I need for some specific software) that I can recommend https://github.com/9001/copyparty over thumb drives.
I actually intended to set everything up but did not have time and needed to copy some files, so dusted off a thumb drive. I am liking it quite a bit and I think I prefer it to the alternatives.
YOTLD has nothing to do with my needs and wants, and I am perfectly happy with my thumb drive and the weird little ways Linux imposes itself on my life.
Linux is not suitable for the average user. I use Xubuntu on all my old computers, but I am 100% sure a normie would not tolerate the tedium of it. People want shiny icons with animations and a bunch of garbage on their computers to make them feel they are doing something. Linux is too static for that.
If I have an issue with an application or if I want an application, I must use the terminal. I can't imagine a Mac user bothering to learn it. Linux is for people who want to maximize the use of their computer without being spied on and without weird background processes. Linux won't die, but it won't catch Windows or Mac in the next 5 decades. People are too lazy for it. Forget about learning. I bet you $100 that 99% of people on the street have never even seen Linux, or even heard of it. It's not because of marketing; it's because the people who tried it returned to Windows or Mac after deciding that installing a driver or an application was too hard to learn.
I wouldn't recommend Xubuntu for the average user. What you feel is about Xubuntu, not Linux. Normies are doing well adapting to Linux Mint. It's easy for Windows users to get used to within a few days and it has sane defaults that match what users expect. It just works.
This is just not true anymore. The only things that don't work are a few AAA titles that use particular types of anti-cheat systems that rely on Windows kernel drivers (League of Legends is one that comes to mind).
If I remember correctly, after the Crowdstrike BSOD-all-windows-instances update last year Microsoft wanted to make some changes to their kernel driver program and these anti-cheat measures on Windows might need to find a new mechanism soon anyway. That's a long way of saying, it's plausible that even that last barrier might come down sooner rather than later.
Spoken like a true Windows UX aficionado. Who doesn't love multiple system settings apps, a mix of minimal new context menus and overcrowded legacy context menus just one more click away.
Only true if those inconsistencies actually matter to your workflow. Not going to deny that they exist, obviously, but their impact is largely overplayed (and gratuitously downplayed on Windows, in my experience).
Unless one has a rack of older GPU hardware that uses an abandoned, EOL NVIDIA kernel driver that's difficult to install past kernel 6.12.x. Then one faces the harsh reality of Windows users rightfully laughing at a perpetually beta Linux OS, as Win11 still boots with the older drivers while the dkms build randomly implodes at some point.
People dual boot SSD OS for very good reasons, as kernel permutation is not necessarily progress in FOSS. Linux is usable by regular users these days, but "Good" is relative to your use case. YMMV =3
What amazes me is that on Steam they no longer make the distinction (in the standard library view) between Windows and Linux: every game is assumed to launch on Linux, using Proton behind the scenes if needed. There's still a "Linux games" toggle, but now every game appears ungrayed by default.
And it mostly works! At least for my games library. The only game I wasn't able to get to work so far is Space Marine 2, but on ProtonDB people report they got it to work.
As for the rest: I've been an exclusive Linux user on the desktop for ~20 years now, no regrets.
Can I run Solidworks on Linux yet? Excel? Labview? Vivado? Adobe products? Altium Designer? (Matlab is mostly yes) Not everybody is just writing Javascript and PHP.
Can I get a laptop to sleep after closing the lid yet?
Not that long ago the answer to these questions was mostly no (or sort of yes... but very painfully)
> Can I get a laptop to sleep after closing the lid yet?
> on windows all of this just works
Disagree on the sleep one - my work laptop doesn’t go to sleep properly. The only laptop I’ve ever used that behaves as expected with sleep is a macbook.
Not sure why you're insinuating that I dislike apple products. My personal mb air doesn't have this issue and most of my household is on apple.
I'm also seeing results for "macbook pro doesn't go to sleep when lid closed", so other people see this problem too. You can't really claim that other platforms have them beat here if there isn't data to support the claim.
Macs do sleep well, when they manage to sleep. Sometimes macOS takes issue with certain programs, the last stack I used at work had a ~50/50 chance of inhibiting sleep when it was spun up.
All in all, I've given up on sleep entirely and default to suspend/hibernate now.
A buggy program preventing sleep is a bug in that program, not a mark on the overall support and reliability of sleep functionality in macOS.
There are valid reasons why a program might need to block sleep, so it's not like macOS is going to hard-prevent it if a program does this. Most programs should not be doing that though.
Still no big CAD names that I'm aware of (annoyingly), Libre Calc works fine for me as an Excel alternative, I have used Matlab on it but not recently, not sure on the others.
Laptop sleep and suspend can still be finicky unfortunately.
I will say my experience using CAD or other CAE software on windows has gotten progressively worse over the years to the point that FEA is more stable on linux than on windows.
We do really need a Solidworks, Creo or NX on linux though. My hope has been that eventually something like Wine, Proton, or other efforts to bring windows games to linux will result in us getting the ability to run them. They are one of the last things holding me back from fully moving away from windows.
These are all pretty niche products at this point. For the true professionals that need these tools they're stuck but most people can find reasonable alternatives for their hobby or side hustle.
I hear you, and I also value Excel and a few other products, but I hit my personal limit with Windows enshittification early last year and changed my daily driver at home to Linux.
I added a couple of VMs running Windows, Linux, and whatever else I need in Proxmox with xrdp/RDP and Remmina, and it's really the best of both worlds. I travel a good deal, and being able to remotely connect and pick up where I left off, while also not dealing with Windows nagware, has been great.
I would be 100% off Windows if it weren’t for Adobe Suite and Ableton Live not being ported to Linux. I’m guessing both of these companies are avoiding it not for technical reasons but because Linux is a support nightmare given all of the distros and variations of the platform.
What makes Linux a viable desktop for so many people now is the fact that they don’t need to run very much software anymore. It runs Chrome so you’re good.
Tried to switch to Linux plenty of times over the past few decades, this year it finally stuck. I can confidently say I’ll never install Windows again. Everything pretty much just works and any issues I’ve had have been quickly resolved with the help of LLM’s.
I've been giving Linux a go as a daily driver for a few months.
I tried Cinnamon and while it was pleasantly customizable, the single-threadedness of the UI killed it for me. It was too easy to do the wrong thing and lock the UI thread, including with several desktop or tray Spices from the official repo.
I'm switching to KDE. Seems peppier.
The biggest hardware challenge I've faced is my Logitech mouse, which is a huge jump from the old days of fighting with Wi-Fi and sound support. Sound is a bit messy, surfacing a plethora of audio devices that would be hidden under Windows (like digital and analog options for each device), and occasionally digital vs analog compatibility will be flaky in a game or something, but I'll take it.
The biggest hassle imho is still installing non-repo software. So many packages offer a flatpak and a snap and build-from-source instructions where you have to figure out the local package names for each dependency, and they offer one .deb for each different version of Debian and its derivatives, and it's just so tedious to figure out which is the right one.
Sadly the project feels semi-abandoned. No new releases so I had to build it from source. Also the PR board seems to be ignored (one or two of those are mine - I tried to fix the misleading button labels).
I get people are tired of Year of Linux on Desktop, but I feel like last year it actually started happening for real. Mostly due to Arch, which is not what I ever expected.
On one hand we have Steam, which will make thousands of games available on an easy-to-use platform based on Arch.
For developers, we have Omarchy, which makes the experience much more streamlined, pleasant, and productive. I moved both my desktop and laptop to Omarchy and have one Mac laptop. It's a really good experience; not everything is perfect, but when I switch to the Mac after Omarchy, I often discover how not-easy the Mac is to use, how many clicks it takes to do something simple.
I think both Microsoft and Apple need some serious competition. And again, this is coming from Arch, which turned out to be more stable and serious than Ubuntu.
My main joy of Linux is having a tiling window manager and having one machine on which I can both play games and work, which I couldn't make happen with Windows.
I have a Windows 11 PC strictly for gaming. Nearly every time I interact with Windows, it infuriates me with garbage code, Microsoft business BS and anti-privacy. I'd love to switch, but has Linux gaming solved the anti-cheat requirement issue? Do Epic and EA games work on Linux?
I also play a decent amount of Flight Simulator 2024 and losing that is almost a non-starter for switching.
Anti-cheat is not a Linux issue, it's a developer issue.
It seems facially easy to solve: pair players with the type of game they want.
Turn on anti-cheat if you want to join no-cheat sessions.
If you want a cheat game, turn off anti-cheat and you join sessions with other cheating players.
The whole dilemma comes from malignant users who enjoy destroying other users' ability to enjoy the game.
Go nuclear on clients that manage to join anti-cheat sessions with cheats turned on.
It is interesting and fascinating to see the growth of Linux.
As many have pointed out, the biggest factor is obviously the enshittification of Microsoft. Valve has crept up in gaming. And I think understated is how incredibly nice the tiling WMs are. They really do offer an experience which is impossible to replicate on Mac or Windows, both aesthetically and functionally.
Linux, I think, rewards the power user. Microsoft and Apple couldn't give a crap about their power users. Apple has seemed to devolve into "Name That Product Line" fanboy fantasy land and has lost all but the most diehard fans. Microsoft is just outright hostile.
I'm interested to see what direction app development goes in. I think TUIs will continue to rise in popularity. They are snappier and overall a much better experience. In addition, they work over SSH. There is now an entire overclass of power users who are very comfortable moving around in different servers in a shell. I don't think people are going to want to settle for AI SaaS cloudslop after they get a taste of local-first, and when they realize that running a homelab is basically just Linux, I think all bets are off as far as which direction "personal computing" goes. Also standing firmly in the way of total SSH app freedom are iPhone and Android, which keep pushing that almost tangible utopia of amazing software frustratingly far out of reach.
It doesn't seem like there is a clear winner for the noob-friendly distro category. It seems like they're all pretty good. The gaming distros seem really effective. I finally installed Omarchy, having thought "I didn't need it, I can rice my own Arch", etc., and I must say the experience has been wonderful.
I'm pretty comfortable at the "cutting edge" (read, with all my stuff being broken), so my own tastes in OS have moved from Arch to the systemd free Artix or OpenBSD. I don't really see the more traditional "advanced" Linuxes like Slackware or Gentoo pulling much weight. I've heard interesting things about users building declarative Nix environments and I think that's an interesting path. Personally, I hope we see some new, non-Unix operating systems that are more data and database oriented than file oriented. For now, OpenBSD feels very comfortable, it feels like I have a prayer of understanding what's on my system and that I learn things by using it, the latter of which is a feature of Arch. The emphasis on clean and concise code is really quite good, and serves as a good reminder that for all the "memory safe" features of these new languages, it's tough to beat truly great C developers for code quality. If you're going to stick with Unix, you might as well go for the best.
More and more I find myself wanting to integrate "personal computing" into my workflow, whether that's apps made for me and me alone, Emacs lisp, custom vim plugins, or homelab stuff. I look with envy at the smalltalks of the world, like Marvelous Toolkit, the Forths, or the Clojure based Easel. I really crave fluency - the ability for code to just pour out - none of the hesitation or system knowledge gaps which come from Stack Overflow or LLM use. I want mastery. I've also become much more tactical on which code I want to maintain. I really have tried to curb "not invented here" syndrome because eventually you realize you aren't going to be able to maintain it all. Really I just want a fun programming environment where I can read Don Knuth and make wireframe graphics demos!
The article's title - and the original title of the submission - was specific, bold, and contained a call to action. The new title is bland and unspecific (Linux has been "good" for servers for decades now).
Please revert this submission to use the correct title.
It's good until you boot your system and end up with an unrecoverable black screen that messes up your day of work for no good reason. Linux is free if you don't value your time.
You can't really make blanket statements like this about "Linux" in general because it depends on what distro you use. For example, in NixOS to fix this type of problem all you have to do is rollback to a previous configuration that is known to work. I've not used it, but I believe Arch has something similar.
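For example, assuming you can still reach a console or the boot menu, the rollback is roughly:

    sudo nixos-rebuild switch --rollback
    # or just pick an older generation from the GRUB/systemd-boot menu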
Even with imperatively configured distros like Ubuntu, it's generally much easier to recover from a "screen of death" than in Windows because the former is less of a black box than the latter. This means it's easier to work out what the problem is and find a fix for it. With LLMs that's now easier than ever.
And, in the worst case that you have to resort to reinstalling your system, it's far less unpleasant to do that in a Linux distro than in Windows. The modern Windows installer is painful to get through, and then you face hours or days of manually reinstalling and reconfiguring software which you can do with a small handful of commands in Linux to get back to a state that is reasonably similar to what you had before.
I spent years (maybe a decade) without seeing them in the Windows 7 and early 10 era, but in the last few years I have them sometimes. Many seem Nvidia-related, but I also remember some due to a bad update that broke things in some laptops.
I dunno, I spend less time fighting with any of my several linux systems than the macbook I'm required to use for work, even without trying to do anything new with it. I choose to view this charitably and assume most of the time investment people perceive when switching operating systems is familiarity penalties, essentially a switching cost. The longer this remains the case, the less charitably I'm willing to view this.
You can also mitigate a lot of the "familiarity penalties" by planning ahead. For example, by the time I made the decision to switch from Windows around 15 years ago, I'd already been preferring multi-platform FOSS software for many years because I had in mind that I might switch one day. This meant that when it came time to switch, I was able to go through the list of all the software I was using and find that almost all of it was already available in Linux, leaving just a small handful of cases that I was able to easily find replacements for.
The result was that from day 1 of using Linux I never looked back.
Of course, MS seems to enjoy inflicting familiarity penalties on its established user base every couple of years anyway. After having your skills negated in this way enough times, the jump to Linux might not look so bad.
Not in my experience. I've run both Windows and Linux for the last decade and Windows is the only OS that I ever have problems with updates wasting my time and breaking things. I've been running image-based Linux for the last two years and the worst case is rebooting to rollback to the last deployment. Before that it was booting a different btrfs snapshot.
Fun aside: I had a hardware failure a few years ago on my old workstation where the first few sectors of every disk got erased. I had Linux up and running in 10 minutes. I just had to recreate the efi partition and regenerate a UKI after mounting my OS from a live USB. Didn't even miss a meeting I had 15 minutes later. I spent hours trying to recover my Windows install. I'm rather familiar with the (largely undocumented) Windows boot process but I just couldn't get it to boot after hours of work. I just gave up and reinstalled windows from scratch and recovered from a restic backup.
Windows has recently been a complete shitshow - so even if Linux hasn't gotten any better (it has) it is now likely better than fiddling around with unfucking Windows, and Windows doing things like deleting all your files.
You can put some work into Windows to slim it down some; an unattended-install generator to turn most of the crap off during install, then O&O ShutUp10 goes a long way.
There's an ever growing list of things to do in order to fix Windows, and that list is likely longer than Linux. This whole "your time is free" argument hinges on Windows not having exactly the same issue, or worse.
[0] https://sandstorm.io/news/
Also, amazing house, my friend is enamored of the cat-transit. I used to live not too far from you :)
I've had effectively zero issues avoiding snaps.
..edit.. I installed a dummy package that displaces the nagware about the pro version too so I never get those messages during apt update any more.
Taking a quick definitely incomplete look I see at least:
/etc/apt/preferences.d/mozilla.pref
/etc/apt/preferences.d/nosnap.pref, and removed ubuntu-pro-esm-apps and ubuntu-pro-esm-infra from that same dir. But also there is a mozillateam PPA in sources.list.d, and I don't see any installed package name that looks like it might be that dummy ubuntu-pro-esm thing, so maybe it got removed during a version upgrade and I never noticed, because Ubuntu stopped that nonsense and it isn't needed any more? Or there is some other config somewhere I'm forgetting that is keeping that hole plugged.
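For what it's worth, the widely circulated pins look roughly like the following (I can't promise mine are byte-for-byte identical). The nosnap one:

    Package: snapd
    Pin: release a=*
    Pin-Priority: -10

and the mozillateam one:

    Package: firefox*
    Pin: release o=LP-PPA-mozillateam
    Pin-Priority: 1001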
Anyway, it WAS a little bit of fiddling around one day, but at least it was only a one and done thing so far.
I kind of expected to be off of Ubuntu by now, because once someone starts doing anything like that, it doesn't matter if you can work around it; the real problem is that they want to do things like that at all in the first place. Well, they still want what they want and that problem is never going away. They will just keep trying some other thing and then some other thing. So rather than fight them forever, it's better to find someone else you don't want to fight. I mean, that's why we're on Linux at all in the first place, right? But so far it's been a few version bumps since then and still more or less fine.
One annoying thing is that Linux can't run many different GPU drivers at the same time, so you have to make sure the cards work with the same driver.
Proprietary 3rd-party multi-seat solutions also exist for Windows, but Linux has built-in support and it's free.
https://en.wikipedia.org/wiki/Multiseat_configuration
I am super curious about your setup. I played with multi-seat years ago, but I lost the need. It is super cool tech whose efficiencies I'd love to see embraced in some way.
I've also said it here before but I will just give up on PC gaming wholesale before I go back to Windows. It's crazy how much gaming on Linux has improved in just the past couple years.
No, just because the Steamdeck's distro is built on Arch, and so you can piggyback on what they are doing.
I download the nvidia drivers directly from nvidia. Their installer script is actually pretty decent and then I don't have to worry about whether the distro packages are up-to-date.
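That is, the .run file straight off nvidia.com, roughly like so (the version obviously varies, and you want kernel headers installed and the display manager stopped first):

    sudo sh ./NVIDIA-Linux-x86_64-<version>.run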
Pretty horrible technology, and unfortunately a good majority of the gaming industry by revenue relies on it.
DP1.4 though, so you're still going to need compression.
https://trychen.com/feature/video-bandwidth
This is my case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240hz 10-bit HDR signal @ 30 Gbps.
I finally got the 240hz 4K uncompressed but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult, Microcenter didn’t have them in April and the only one that worked was the one that came with the monitor.
https://www.techpowerup.com/335152/china-develops-hdmi-alter...
(Some games support 120, but it's also used to present a 40hz image in a 120hz container to improve input latency for games that can't hit 60 at high graphics quality.)
In my case I have an htpc running linux and a radeon 6600 connected via hdmi to a 4k @ 120hz capable tv, and honestly, at the sitting distance/tv size and using 2x dpi scaling you just can't tell any chroma sub-sampling is happening. It is of course a ginormous problem when on a desktop setting and even worse if you try using 1x dpi scaling.
What you will lose however is the newer forms of VRR, and it may be unstable with lots of dropouts.
The best Valve could do is offer a special locked down kernel with perhaps some anticheat capabilities and lock down the hardware with attestation. If they offer the sources and do verified builds it might even be accepted by some.
Doubt it would be popular or even successful on non-Valve machines. But I'm not an online gamer and couldn't care less about anticheats.
For competitive gaming, I think attested hardware & software actually is the right way to go. Don’t force kernel-level malware on everyone.
Other games did similarly. Quake 3 Arena added Punkbuster in a patch. Competitive 3rd party Starcraft 1 server ICCUP had an "anti-hack client" as a requirement.
It's a bit like complaining that these days people just want to watch TV, instead of writing and performing their own plays.
I could almost get on board with the idea of invasive kernel anti-cheat software if it actually was effective, but these games still have cheaters. So you get the worst of both worlds--you have to accept the security and portability problems as a condition for playing the game AND there are still cheaters!
The bloggers/journalists calling it malware are doing the conversation a disservice. The problem is really only the risk of bugs or problems with kernel-level anti-cheat, which _could_ be exploited in the worst case, and in the best case cause outages.
The classic example recently is the CrowdStrike-triggered outage of computers worldwide due to kernel-level antivirus/malware scanning. Anti-cheat could potentially have the exact same outcome (though perhaps smaller in scale, as only gamers would have it).
If Windows created a better framework, it is feasible that such errors would be recoverable and fixable without outages.
FPSs can just say "the console is the competitive ranked machine", add mouse + keyboard support and call it a day. But in those games cheaters can really ruin things with aimbots, so maybe it is necessary for the ecosystem, I dunno.
Nobody plays RTSs competitively anymore and low-twitch MMOs need better data hiding for what they send clients so 'cheating' is not relevant.
We are at the point where camera + modded input devices are cheap and easy enough I dunno if anti-cheat matters anymore.
Competition vs other human beings is the entire point of that genre, and the intensity when you’re in the top .1% of the playerbase in Overwatch/Valorant/CSGO is really unmatched.
Case in point from a few years back - Fall Guys. Silly fun, sloppy controls, a laugh. And then you get people literally flying around because they've installed a hack, so other players can't progress as they can't make the top X players in a round.
So to throw it back - it is just a game, it's so sad that a minority think winning is more important than just enjoying things, or think their own enjoyment is more important than everyone else's.
As an old-timer myself, we thought it was despicable when people replaced downloaded skins in QuakeWorld with all-fullbright versions in their local client, so they could get an advantage spotting other players... I suppose that does show us that multiplayer cheating is almost as old as internet gaming.
Also, for more casual play, don't players have rankings so that you play with others about your level? Cheaters would all end up just playing with other cheaters in that case, wouldn't they?
Making a Valve-only Linux solution would take a lot of the joy of this moment away for many. But it would also help Valve significantly. It's very uncomfortable to consider, imo.
I'm far from an authority on this topic but from my understanding both Sony/MS have introduced mkb support, but so far it looks to be an opt-in kind of thing and it's still relatively new.
But even then, when everyone is trying out a new indie game there’s a chance it won’t work on non-Windows. It’s happened to me.
I am very pro-Linux and pro-privacy, and hope that the situation improves so I don’t have to continue to compromise.
At the same time, Vulkan support is also getting pretty widespread, I think notably idTech games prefer Vulkan as the API.
Id Software do prefer Vulkan but they are an outlier.
DX12 worked decently better than OpenGL before, and all the gamedevs had Windows, and it was required for Xbox... but now those things are less and less true.
The PlayStation was always "odd man out" when it came to graphics processing, and we used a lot of shims, but then Stadia came along and was a proper Linux, so we rewrote a huge amount of our renderer to be better behaved for Vulkan.
All subsequent games on that engine have thus had a Vulkan-friendly renderer by default, one that is implemented more cleanly than the DX12 one and works natively pretty much everywhere. So it's the new default.
https://godotengine.org/article/dev-snapshot-godot-4-6-dev-5...
Assuming that cheats work by reading (and modifying) the memory of the game process, you can attach a kprobe to the sys_ptrace system call. Every time any process uses it, your eBPF program triggers. You can then capture the PID and UID of the requester and compare it against a whitelist (e.g. only the game engine can mess with the memory of that process). If the requester is unauthorized, the eBPF program can even override the return value to deny access before the kernel finishes the request.
Of course there are other attack vectors (like spoofing PID/process name), but eBPF covers them also.
All of this is to say that Linux already has sane primitives to allow that, but as long as devs don't prioritize Linux, we won't see this happening.
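To make that concrete, here is a rough, untested libbpf-style sketch of just the ptrace hook; the map name, the x86-64-specific section name, and the userspace loader that would fill the allowlist with the game's PID are all placeholder choices of mine, not anything a real anti-cheat ships:

    // anticheat_ptrace.bpf.c -- sketch only
    // build: clang -O2 -g -target bpf -c anticheat_ptrace.bpf.c -o anticheat_ptrace.bpf.o
    #include <linux/bpf.h>
    #include <linux/ptrace.h>
    #include <bpf/bpf_helpers.h>

    char LICENSE[] SEC("license") = "GPL";

    // PIDs allowed to call ptrace; a userspace loader (not shown) fills this in
    struct {
        __uint(type, BPF_MAP_TYPE_HASH);
        __uint(max_entries, 64);
        __type(key, __u32);
        __type(value, __u8);
    } allowed_pids SEC(".maps");

    SEC("kprobe/__x64_sys_ptrace")
    int deny_unexpected_ptrace(struct pt_regs *ctx)
    {
        __u32 pid = bpf_get_current_pid_tgid() >> 32;

        if (bpf_map_lookup_elem(&allowed_pids, &pid))
            return 0;  // caller is on the allowlist, let the syscall proceed

        // Requires CONFIG_BPF_KPROBE_OVERRIDE and an error-injectable target;
        // otherwise you can only observe and log, not block.
        bpf_override_return(ctx, -1 /* -EPERM */);
        return 0;
    }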
But how does the anti-cheat know that the kernel is not modified such that it disables certain eBPF programs (or misreports cheats, spoofs data, etc.)?
This is the problem with anti-cheat in general (and the same exists with DRM) - the machine is (supposedly) under the user's total control and therefore, unless your anti-cheat is running at the lowest level, outside of the control of the user's tampering, it is not trustworthy. This leads to TPM requirements and other anti-user measures that are dressed as pro-user in windows.
There's no such thing in Linux, which makes it inoperable as one of these anti-cheat platforms imho.
(The following was refined by an LLM because I didn't remember the details of when I was pondering this a while back)
All your anti cheats are eBPF programs hooked to the bpf() syscall itself.
Whenever any process tries to call BPF_PROG_DETACH or BPF_LINK_DETACH, your monitors check if the target is one of the anti cheats in your cluster of anti-cheats.
If an unauthorized process (even Root) tries to detach any of your anti-cheat processes, the eBPF program uses bpf_override_return to send an EPERM (Permission Denied) error back to the cheat.
(End LLM part)
Of course, you can always circumvent this by modifying and compiling the kernel so that those syscalls when targeting a specific PID/process name/UID aren't triggered. But this raises the difficulty of cheating a lot as you can't simply download a script, but you need to install and boot a custom kernel.
So this would solve the random user cheating in an online match. Pro users who have enough motivation can and will cheat anyway, but that is true on Windows too. Finally, at top gaming events there is so much scrutiny (you need to play on stage on vetted PCs) that this is a non-issue.
I believe the goal is to make it so uncomfortable and painful that 99.999% of the users will say fuck it and they won't do it. In this case users need to boot a custom kernel that they download from the internet which might contain key-loggers and other nasty things. It is not just download a script and execute it.
For cheat developers, instead, this implies doing the modifications to allow those sys-calls to fly under the radar while keeping the system bootable and usable. This might not be trivial.
Sure, except that anyone can just compile a Linux kernel that doesn't allow that.
Anti-cheat systems on Windows work because Windows is hard(er) to tamper with.
This isn't complicated.
Even the CrowdStrike Falcon agent has switched to BPF because it lowers the risk that a kernel driver will brick downstream machines, like what happened with Windows that one time. I recently configured a corporate single sign-on to simply not work if the BPF component was disabled.
Anticheat and antivirus are two similar but different games. It's very complicated.
Although even then I'd still have qualms about paying for the creation of something that might pave the path for hardware vendors to work with authoritarian governments to restrict users to approved kernel builds. The potential harms are just not in the same league as whatever problems it might solve for gamers.
- Want to play these adversarial games
- Don't care about compromising control of hypervisor
- Don't simply have a dedicated gaming box
A hypervisor that protects against this already exists for Linux with Android's pKVM. Android properly enforces isolation between all guests.
Desktop Linux distros are way behind in terms of security compared to Android. If desktop Linux users ever want L1 DRM to work to get access to high resolution movies and such they are going to need such a hypervisor. This is not a niche use case.
I would never use a computer I don't have full control over as my main desktop, especially not to satisfy an external party's desire for control. It seems a lot more convenient to just use a separate machine.
Even mainstream consumers are getting tired of DRM crap ruining their games and movies. I doubt a significant number of Linux users would actually want to compromise their ownership of the computer just to watch movies or play games.
I do agree that Linux userland security is lackluster though. Flatpak seems to be a neat advancement, at least in regard to stopping things from basically uploading your filesystem. There are already a lot of kernel interfaces that can do this, like user namespaces. I wish someone would come up with something like QubesOS, but making use of containers instead of VMs and Wayland proxies for better performance.
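As a small example of what Flatpak already lets you do, you can tighten a single app's filesystem access (the app ID here is just a placeholder):

    flatpak override --user --nofilesystem=home com.example.App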
I honestly think you would be content as long as the computer offered the ability to host an arbitrary operating system just like has always been possible. Just because there may be an optional guest running that you can't fully control that doesn't take away from the ability to have an arbitrary guest you can fully customize.
>to satisfy an external party's desire for control.
The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
>It seems a lot more convenient to just use a separate machine.
It really isn't. It's much more convenient to launch a game on the computer you are already using than going to a separate one.
It's a little funny that the two interests of adtech are colliding a bit here: They want maximum control and data collection, but implementing control in a palatable way (like you describe) would limit their data collection abilities.
My answer to your question: No, I don't like it at all, even if I fully trust the hypervisor. It will reduce the barrier for implementing all kinds of anti-user technologies. If that were possible, it will quickly be required to interact with everything, and your arbitrary guest will soon be pretty useless, just like the "integrity" bullshit on Android. Yeah you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc. That's still a net minus compared to the status quo.
In general, I dislike any methods that try to apply an arbitrary set of criteria to entitle you to a "free" service to prevent "abuse", be it captchas, play integrity, or Altman's worldcoin. That "abuse" is just rational behavior from misaligned incentives, because non-market mechanisms like this are fundamentally flawed and there is always a large incentive to exploit it. They want to have their cake and eat it too, by eating your cake. I don't want to let them have their way.
> The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
Pretty sure we already have enough technology to fully automate many games with robotics. If there is a will, there is a way. As with everything else on the internet, everyone you don't know will be considered untrusted by default. Not the happiest outcome, but I prefer it to losing general purpose computing.
I'm talking about the entire chip. You are unable to implement a new instruction for the CPU for example. Only Intel or AMD can do so. You already don't have full control over the CPU. You only have as much control as the documentation for the computer gives you. The idea of full control is not a real thing and it is not necessary for a computer to be useful or accomplish what you want.
>and your arbitrary guest will soon be pretty useless
If software doesn't want to support insecure guests, the option is between being unable to use it, or being able to use it in a secure guest. Your entire computer will become useless without the secure guest.
>Yeah you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc.
This could be handled by also running another guest that was supported by those app developers that provide the required security requirements compared to your arbitrary one.
>That "abuse" is just rational behavior from misaligned incentives
Often these can't be fixed, or fixing them would result in a poor user experience for everyone due to a few bad actors. If your answer is to just not build the app in the first place, that is not a satisfying answer. It's a net positive to be able to do things like watch movies for free on YouTube. It's beneficial for all parties. I don't think it is in anyone's best interest not to do such a thing just because there isn't a proper market incentive in place to stop people from ripping the movie.
>If there is a will, there is a way.
The goal of anticheat is to minimize customer frustration caused due to cheaters. It can still be successful even if it technically does not stop every possible cheat.
>general purpose computing
General purpose computing will always be possible. It just will no longer be the wild west where there was no security and every program could mess with every other program. Within a program's own context it is still able to do whatever it wants; you can implement a Turing machine (bar the infinite memory).
They certainly aren't perfect, but they don't seem to be hell-bent on spying on or shoving crap into my face every waking hour for the time being.
> insecure guests
"Insecure" for the program against the user. It's such a dystopian idea that I don't know what to respond with.
> required security requirements
I don't believe any external party has the right to require me to use my own property in a certain way. This ends freedom as we know it. The most immediate consequences is we'd be subject to more ads with no way to opt out, but that would just be the beginning.
> stop people from ripping the movie
This is physically impossible anyway. There's always the analog hole, recording screens, etc, and I'm sure AI denoising will close the gap in quality.
> it technically does not stop every possible cheat
The bar gets lower by the day with locally deployable AI. We'd lose all this freedom for nothing at the end of the day. If you don't want cheating, the game needs to be played in a supervised context, just like how students take exams or sports competitions have referees.
And these are my concerns with your ideal "hypervisor" provided by a benevolent party. In this world we live in, the hypervisor is provided by the same people who don't want you to have any control whatsoever, and would probably inject ads/backdoors/telemetry into your "free" guest anyway. After all, they've gotten away with worse.
Is it possible to do this in a relatively hardware-agnostic, but reliable manner? Probably not.
That way you could use an official kernel from Fedora, Ubuntu, Debian, Arch etc. A custom one wouldn't be supported but that's significantly better than blocking things universally.
I'm not aware that a TPM is capable of hiding a key without the OS being able to access/unseal it at some point. It can display a signed boot chain but what would it be signed with?
If it's not signed with a key out of the reach of the system, you can always implement a fake driver pretty easily to spoof it.
These attestation methods would probably work well enough if you pin a specific key like for a hardened anti-evil-maid setup in a colo, but I doubt it'd work if it trusts a large number of vendor keys by default.
It also means that if you do get banned for any reason (obvious cheating) they then ban the EK and you need to go source more hardware.
It's not perfect but it raises the bar significantly for cheaters to the point that they don't bother.
The idea is you implement a fake driver to sign whatever message you want, and totally fake your hardware list too. As long as they are relatively similar models I doubt there's a good way to tell.
Yeah, I think there are much easier ways to cheat at this point, like robotics/special hardware, so it probably does raise the bar.
Basically, the TPM includes a key that is itself signed with a manufacturer key. You can't just extract it, and the signature ensures that this key is "trusted". When asked, the TPM will return the boot chain (including the bootloader or UKI hash), signed by its own key, which you can present to the remote party. The whole protocol is more complicated and includes a challenge.
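For the curious, here's a rough sketch of that flow using tpm2-tools; exact flag spellings vary between tool versions, so treat it as illustrative rather than a recipe:

    # Endorsement Key: burned-in identity whose certificate is signed by the manufacturer
    tpm2_createek -c ek.ctx -G rsa -u ek.pub
    # Attestation Key created under it
    tpm2_createak -C ek.ctx -c ak.ctx -u ak.pub -n ak.name
    # Quote the boot-measurement PCRs, bound to a challenge nonce from the remote party
    tpm2_quote -c ak.ctx -l sha256:0,1,2,3,7 -q abcdef12 -m quote.msg -s quote.sig -o pcrs.bin
    # The remote party verifies the signature and compares the PCR values to what it expects
    tpm2_checkquote -u ak.pub -m quote.msg -s quote.sig -f pcrs.bin -q abcdef12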
I don't really care about games, but i do care about messing up people and companies that do such heinous crimes against humanity (kernel-level anti-cheat).
I feel like this is way overstated; it's not that easy to do, and could conceptually be done on Windows too via hardware simulation/virtual machines. Both would require significant investment in development to pull off.
And then you have BasicallyHomeless on YouTube who is stimulating nerves and using actuators to "cheat." With the likes of the RP2040, even something like an aim-correcting mouse becomes completely cheap and trivial. There is a sweet-spot for AC and I feel like kernel-level might be a bit too far.
Modern games already employ a bunch of VM-like techniques for tamper protection.
This has effectively killed PC game piracy.
Valve doesn't employ kernel AC but in practice others have taken that into their own hands - the prevalence of cheating on the official CS servers has driven the adoption of third-party matchmaking providers like FACEIT, which layer their own kernel AC on top of the game. The bulk of casual play happens on the former, but serious competitive play mostly happens on the latter.
And for what it's worth, I'm pretty sure Valorant is the most played competitive shooter at the moment.
I firmly believe that Nvidia doesn't want the general public to ever have better hardware than what is current as people could just run their own local models and take away from the ridiculous money they're making from data centers.
In step with that, they're now renting their gaming GPUs to players with their GeForce Now package.
Gamers are a rounding error in Nvidia's revenue now against AI datacenter orders. I won't hold my breath about them revisiting their established drivers for Linux.
You don’t want a vendor you have to publicly shame to get them to do the right thing. And that’s MS, if any single sentence has ever described them without using curse words.
https://github.com/JacKeTUs/linux-steering-wheels
Hopefully vr headset support will get better
I haven’t found a tool that can access all the extra settings of my Logitech mouse, nor my Logitech speakers.
OpenRGB is amazing but I’m stuck on a version that constantly crashes; this should be fixed in the recent versions but nixpkgs doesn’t seem to have it (last I checked).
On the other hand I did manage to get SteamVR somewhat working with ALVR on the Quest 3, but performance wasn’t great or consistent at all from what I remember (RTX 3070, Wayland KDE).
That resulted in Windows 8.
More recently they've freaked out about ads, app stores, and SaSS revenue, which has resulted in lots of dark patterns in the OS.
Stock price growth is their core business because that is how large firms operate.
MS used to embrace games etc because the whole point was all PCs should run Windows. Now the plan is to get you onto a subscription to their cloud. The PC bit is largely immaterial in that model. Enterprises get the rather horrible Intune bollocks to play with but the goal is to lock everyone into subs.
I thought all of them more or less have operated under Ponzinomics ever since Jack Welch showed that that worked in the short term.
The strength of Linux and Free software in general is not in that it's completely built by unpaid labor. It's built by a lot of paid, full-time labor. But the results are shared with everyone. The strength of Free software is that it fosters and enforces cooperation of all interested parties, and provides a guarantee that defection is an unprofitable move.
This is one of the reasons you see Linux everywhere, and *BSD, rarely.
I doubt it's a large reason. I'd put more weight on eg Linus being a great project lead and he happens to work on Linux. And a lot of other historical contingencies.
This flow is basically the bread and butter for the OSS community and the only way high effort projects get done.
This is a far better user experience for Battlefield players than in Windows.
Have you ever actually attempted to play that half-assed buggy piece of shit?
The one thing I haven’t been able to get working reliably is steam remote play with the Linux machine as host. Most games work fine, others will only capture black screens.
Granted, I don't play online games, so that might change things, but for years I used to have to make a concession that "yeah Windows is better for games...", but in the last couple years that simply has not been true. Games seem to run better on Linux than Windows, and I don't have to deal with a bunch of Microsoft advertising bullshit.
Hell, even the Microsoft Xbox One controllers work perfectly fine with xpad and the SteamOS/tenfoot interface recognizes it as an Xbox pad immediately, and this is with the official Microsoft Xbox dongle.
At this point, the only valid excuses to stay on Windows, in my opinion, are online games and Microsoft Office. I don't use Office since I've been on Unixey things so long that I've more or less just gotten used to its options, but I've been wholly unable to convince my parents to change.
I love my parents, but sometimes I want to kick their ass, because they can be a bit stuck in their ways; I am the one who is expected to fix their computer every time Windows decides to brick their computer, and they act like it's weird for me to ask them to install Linux. If I'm the one who has to perform unpaid maintenance on this I don't think it's weird for me to try and get them to use an operating system that has diagnostic tools that actually work.
As far as I can tell, the diagnostic and repair tools in Windows have never worked for any human in history, and they certainly have never worked for me. I don't see why anyone puts up with it when macOS and Linux have had tools that actually work for a very long time.
I didn’t see a performance increase moving to Linux for the vast majority of titles tested. Certainly not enough to outweigh the fact that I want EVERY game to work out of the box, and to never have to think if it will or won’t. And not all of my games did, and a not insignificant number needed serious tweaking to get working right.
I troubleshoot Linux issues all day long, I’ve zero interest in ever having to do it in my recreation time.
That’s a good enough reason for me to keep my windows box around.
I use Linux and OSX for everything that isn’t games, but windows functions just fine for me as a dumb console and I don’t seem to suffer any of these extreme and constant issues HN users seem to have with it from either a performance or reliability standpoint.
EAC has the support for Linux, you just have to enable it as a developer.
I know this, I worked on games that used this. EAC was used on Stadia (which was a Debian box) for The Division, because the server had to detect that EAC was actually running on the client.
I feel like I bring this up all the time here but people don’t believe me for some reason.
This does not mean it supports the full feature set as from EAC on Windows. As an analogy, it's like saying Microsoft Excel supports iPad. It's true, but without VBA support, there's not going to be many serious attempts to port more complicated spreadsheets to iPad.
(cue superiority complex) I've been using Linux Desktop for over 10 years. It's great for literally everything. Gaming admittedly is like 8/10 for compatibility, but I just use a VM with PCIe passthrough to pass in a gpu and to load up a game for windows or use CAD, etc. Seriously, ez.
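For anyone curious what the passthrough part looks like, a minimal sketch of handing a GPU to vfio-pci (the PCI IDs are placeholders -- find yours with lspci -nn -- and you still need the IOMMU enabled plus the actual VM config on top of this):

    # Claim the GPU and its audio function for vfio-pci instead of the host driver
    echo 'options vfio-pci ids=10de:1111,10de:2222' | sudo tee /etc/modprobe.d/vfio.conf
    # Load the module early so it wins the binding race against nouveau/nvidia/amdgpu
    echo 'vfio-pci' | sudo tee /etc/modules-load.d/vfio-pci.conf
    sudo update-initramfs -u   # or 'dracut -f' on Fedora-style distros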
Never had issues with NVIDIA GFX with any of the desktop cards. Laptops... sure they glitch out.
Originally Wine, then Proton, now Bazzite make it super easy to game natively. The only issues I ever had with games were from the kernel-level anti-cheats bundled with them. The anti-cheats just weren't available for Linux, so the games didn't start. Anyone familiar with those knows it's not a Linux thing, it's a publisher/anti-cheat mechanism thing. Just lazy devs really.
(cue opinionated anti-corporate ideology) I like to keep Microsoft chained up in a VM where it belongs, so it can't do its shady crap. Also, with a VM you can do shared folders and clipboard. Super handy actually.
Weirdly enough, MacOS in a VM is a huge pita, and doesn't work well.
Well said, and in the tech community that's predominantly Apple. We need to change this.
(I'm typing this on my Linux desktop right now... but also have a separate Windows PC for running the games I want to run that don't work on Linux yet. When they work, I'll be thrilled to put Linux on that machine or its successor.)
Many games refuse to run in a VM, even if that VM is a Windows one. I bet there is a trick to bypass it, but then you are at risk of being banned or can't receive support when needed.
That isn't weird. It's by design. MacOS is only designed to run on Apple hardware, and a VM, even if the host is Apple hardware isn't really Apple hardware.
Tried running Worms: instant crash, no error message.
Tried running Among Us: instant crash, had to add cryptic arguments to the command line to get it to run.
Tried running Parkitect: crashes after 5 minutes.
These three games are extremely simple, graphically speaking. They don't use any complicated anti-cheat measure. This shouldn't be complicated, yet it is.
Oh and I'm using Arch (BTW), the exact distro SteamOS is based on.
And of course, as always, those for which it works will tell you you're doing-it-wrong™ .
Hard to say what might be going wrong for you without more details. I would guess there's something wrong with your video driver. Maybe you have an nvidia card and the OS has installed the nouveau drivers by default? Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things. This is indeed a sore spot for Linux gaming, though to be fair graphics driver problems are not exactly unheard of on Windows either.
Personally I have a bunch of machines dedicated to gaming in my house (https://lanparty.house) which have proven to be much more stable running Linux than they were with Windows. I think this is because the particular NIC in these machines just has terrible Windows drivers, but decent Linux drivers (and I am netbooting, so network driver stability is pretty critical to the whole system).
Woah, that is extremely cool. Very nice work, sir.
Crazy—it used to be that nvidia drivers were by far the least stable parts of an install, and nouveau was a giant leap forward. Good to know their software reputation has improved somewhat
Whereas, AMD just works and is thus standard recommendation.
SteamOS is based on Arch, but customized and aimed at specific hardware configurations. It’d be interesting to know what hardware you’re using and if any of your components are not well supported.
FWIW, I’ve used Steam on Linux (mostly PopOS until this year, then Bazzite) for years and years without many problems. ISTR having to do something to make Quake III work a few years ago, but it ran fine after and I’ve recently reinstalled it and didn’t have to fuss with anything.
Granted, I don’t run a huge variety of games, but I’ve finished several or played for many hours without crashes, etc.
I've been gaming on linux exclusively for about 8 years now and have had very few issues running windows games. Sometimes the windows version, run through proton, runs better than the native port. I don't tend to be playing AAA games right after launch day, though. So it could be taste is affecting my experience.
This sounds like you are rejecting help because you have made up your mind in frustration already.
Because you are doing it wrong. If you want an OS that just works, you should use Ubuntu or Fedora. Why is SteamOS based on Arch then? Because Valve wants to tweak things in it and tinker with it themselves to get it how they like.
You don't.
So use an OS that requires less from you and that tries to just work out of the box, not one that is notorious for being something you break and tinker with constantly (Arch).
I'm not saying "you're doing it wrong", because obviously if you're having trouble then that is, if nothing else, bad UX design, but I actually am kind of curious as to what you're doing different than me. I have an extremely vanilla NixOS setup that boots into GameScope + Tenfoot and I drive everything with a gamepad and it works about as easily as a console does for me.
That probably includes anything that isn't a PC in a time-capsule from when the game originally released, so any OS/driver changes since then, and I don't think we've reached the point where we can emulate specific hardware models to plug into a VM. One of the reasons the geforce/radeon drivers (eg, the geforce "game ready" branding) are so big is that they carry a whole catalogue of quirk workarounds for when the game renderer is coded badly or to make it a better fit to hardware and lets them advertise +15% performance in a new version. Part of the work for wine/proton/dxvk is going to be replicating that instead of a blunt translation strictly to the standards.
With regards to Linux I generally just focus on hardware from brands that have historically had good Linux support, but that's just a rule of thumb, certainly not perfect.
Its still open to customizing but out of the box is very damn usable and flexible.
All three games works perfectly well on both Steam OS and on my kid's PC running CachyOS without any intervention.
There are people who make stripped-down versions of windows. Is it fair to say that because these releases exist that windows isn't "just works" either?
I opted to install Linux in a VM under Hyper-V on Windows to avoid hassles with the dual GPUs in my ThinkPad P52, but this comes with several other hassles I'd like to avoid. (Like no GPU access in Linux at all...)
I'd say it pretty much "just works" except less popular apps are a bit more work to install. On occasion you have to compile apps from source, but it's usually relatively straightforward and on the upside you get the latest version :)
For anyone who is a developer professionally I'd say the pros outweigh the cons at this point for your work machine.
Although it was to BSDi then, and then FreeBSD and then OpenBSD for 5 years or so. I can't remember why I switched to Debian but I've been there ever since.
I'm sat here now playing Oxygen Not Included.
Interesting, I've had to switch off from Gnome after the new release changed the choices for HiDPI fractional scaling. Now, for my display, they only support "perfect vision" and "legally blind" scaling options.
Now whether or not this feature should have remained experimental is a different debate. I personally find that similar to the fact that Gmail has labeled itself beta for many years.
So on my Framework 13, I no longer have the 150% option. I can pick 133%, double, or triple. 160% would be great, but that requires a denominator of 5, which Gnome doesn't evaluate. And you can't define your own values in monitors.xml anymore.
org/gnome/mutter/experimental-features; scale-monitor-framebuffer, xwayland-native-scaling
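For reference, those can be set from a terminal too (recent GNOME assumed; the key and values are the ones listed above):

    gsettings set org.gnome.mutter experimental-features \
      "['scale-monitor-framebuffer', 'xwayland-native-scaling']"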
Have that desktop be reachable with SSH for all your CLI and sys admin needs, use sunshine/moonlight for the remote streaming and tailscale for securing and making sunshine globally available.
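A minimal sketch of that setup, assuming Sunshine and Tailscale are already installed on the host (the SSH service name differs between distros):

    sudo systemctl enable --now sshd   # 'ssh' instead of 'sshd' on Debian/Ubuntu
    sudo tailscale up                  # join the tailnet so the host is reachable anywhere
    sunshine                           # start the streaming host; pair from Moonlight on the client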
Latency is another problem: a recent LTT video showed that even as little as 5-10ms of added latency can negatively impact your gaming performance, even if you don't notice it. You begin to notice at around 20ms.
I've had Linux running on a variety of laptops since the noughties. I've had no more issues than with Windows. ndiswrapper was a bit shit but did work back in the day.
What issues have you had?
Updated Mesa to the latest and the kernel too.
Beyond that, Lunar Lake chips are evidently really really good. The Dell XPS line in particular shows a lot of promise for becoming a strict upgrade or sidegrade to the M2 line within a few years, assuming the haptic touchpad works as well as claimed. In the meantime, I'm sure the XPS is still great if you can live with some compromises, and it even has official Linux support.
I don’t exactly understand this setup. What’s the vm tech?
Most VM software (at least all of it that I've tried) doesn't properly emulate this. Instead, after you've moved your fingers some distance, it's translated to one discrete "tick" of a mouse scroll wheel, which causes the document to scroll a few lines.
The VM software I use is UTM, which is a frontend to QEMU or Apple Virtualization framework depending on which setting you pick when setting up the VM.
This is an understatement. It is completely impossible to even attempt to install Linux at all on an M3 or M4, and AFAIK there have been no public reports of any progress or anyone working on it. (Maybe there are people working on it, I don’t know).
Sounds like the GPU architecture changed significantly with M3. With M4 and M5, the technique for efficiently reverse-engineering drivers using a hypervisor no longer works.
Thanks, I guess I stand corrected.
> There are screenshots of an M3 machine running Linux and playing DOOM at around 31:34 here
That is encouraging! Still, there is no way for a normal user to try to install it, unless something changed very recently.
Not working with Linux is a function of Apple, not Linux. There is a crew who have wasted the last half decade trying to make Asahi Linux, a distro to run on ARM macbooks. The result is after all that time, getting an almost reasonably working OS on old hardware, Apple released the M4 and crippled the whole effort. There's been a lot of drama around the core team who have tried to cast blame, but it's clear they are frustrated by the fact that the OEM would rather Asahi didn't exist.
I can't personally consider a laptop which can't run linux "top notch." But I gave up on macbooks around 10 years ago. You can call me biased.
Amazing that high dpi still doesn’t work. I tried to run linux on 4k in around 2016-2017 and the experience was so bad I gave up.
Mesa, the kernel drivers and Proton have all seen a lot of growth this past year combined with a bunch of garbage decisions MS has doubled down on... not to mention, enough Linux users in tech combined with Valve/Steam's efforts have made it visible enough that even normies are considering giving Linux a try.
On my laptop I use to write blog posts, that never ever gets plugged into a second screen? Sure, Wayland's great. On a computer that I expect normal people to be able to use without dumb problems? Hell no!
Unfortunately, Wayland inherently can't be like Pipewire, which instantly solved basically 90% of audio issues on Linux through its compatibility with Pulseaudio, while having (in my experience) zero drawbacks. If someone could make the equivalent of Pipewire for X11, that'd be nice. Probably far-fetched though.
Well you see, you are actually just silly for wanting this or asking for this, because it's actually just a security flaw...or something. I will not elaborate further.
I just installed hyprland yesterday and outside of having to switch back to i3 once to install what they had set for a terminal in their default config(kitty), I haven’t had to leave again.
(My customer demographic is seniors & casual users).
Loading Teams can take minutes. I'm often late to meetings waiting for the damn thing to load.
Feels like early-90s computing, and like Moore's Law was an excuse for bad coding practices and for pushing newer hardware so that "shit you don't care about but is 'part of the system'" can do more monitoring and have more control of 'your' computer.
It’s super annoying!
Prior to that, Windows was better on laptops due to having the proprietary drivers or working ACPI. But it was pretty poor quality in terms of reliability, and the main problem was the included software being incredibly bare bones, combined with the experience of finding and installing software being so awful (especially if you've not got an unlimited credit card to pay for "big professional solutions").
Every time the year of the Linux desktop arrives, I'm baffled, since not much has changed on this end.
Also, I was basically a child and had no idea what I was doing (I still don't but that's besides the point). Things have definitely gotten better.
In the Red Hat 4.2 days, it was something that I was able to use because I was a giant nerd, but I'd never ever ever have recommended it to a normal person. By Ubuntu 12.04, 15 years later, it was good enough that I'd recommend it to someone who didn't do any gaming and didn't need to use any of the desktop apps that were still then semi-common. In 2026, it's fine for just about anyone unless you are playing particular (albeit very popular) games.
Bazzite is rough in the way that all distributions are, but I imagine Windows 11 is rougher.
In Fedora Atomic it should be foolishly easy to set up a system account, with access to specific USB devices via group, and attach a volume that can easily be written to by a non-root user inside of the container.
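Something like the following, sketched with hypothetical names and paths (standard useradd/podman flags; swap 'dialout' for whatever group your USB device actually needs):

    # Dedicated service account with access to serial-class USB devices
    sudo useradd --system --create-home svc-usb
    sudo usermod -aG dialout svc-usb
    sudo loginctl enable-linger svc-usb   # let it run containers without an active login
    # As that account: pass the device and a writable volume into the container
    podman run --rm -it \
      --device /dev/ttyUSB0 \
      -v ~/data:/data:Z \
      registry.fedoraproject.org/fedora:latest bash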
Debian is a breath of fresh air in comparison. Totally quiet and snappy.
> The drivers included are just too old.
This can usually be fixed by enabling Debian Backports. In some cases, it doesn't even need fixing, because userland drivers like Mesa can be included in the runtimes provided by Steam, Flatpak, etc.
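A minimal sketch of enabling backports on current stable (Debian 12 "bookworm" shown; adjust the codename for your release):

    echo 'deb http://deb.debian.org/debian bookworm-backports main contrib non-free-firmware' \
      | sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt update
    # Backported packages must be requested explicitly
    sudo apt install -t bookworm-backports linux-image-amd64 mesa-vulkan-drivers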
Once set up, Debian is a very low-maintenance system that respects my time, and I love it for that.
1. 10bpp color depth is not supported on RGB monitors, which are the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (See: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...).
2. Common browsers like Chrome and Firefox have no real support for HDR video playback on nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can be displayed with an nVidia GPU.
Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.
The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.
My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means it's limited to 8bpp even with HDR enabled on Wayland. There's another commentator below who uses a Dell monitor and manages to get BGR input working and full HDR in nVidia/Linux.
AVR is reporting...
I compared Interstellar (19s into the YouTube video) in three different ways on Linux, and at 2:07:26 on Blu-ray. For Firefox 146.0.1, by default there is no HDR option on YouTube; the 4K video clearly doesn't have HDR. I enabled HDR in Firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled. Colors look completely washed out.
For Chromium 143.0.7499.169, HDR is enabled by default. This looks like HDR.
I downloaded the HDR video from Youtube and played it using MPV v0.40.0-dirty with settings --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright like the Chromium playback. This was the best playback of the three on Linux.
On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting...
...I looked into this and apparently Dolby Vision uses RGB tunneling for its high-bit-depth (12-bit) YCbCr 4:2:2 data. The Blu-ray looks like it has the same brightness range, but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s). I would say the colors overall look better on the Blu-ray.
I might be able to calibrate it better if the sRGB color setting worked in the display configuration. Also I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB color setting is fixed.
*Edit: Sorry Hacker News has completely changed the format of my text.
Also, go to YouTube and play this video: https://www.youtube.com/watch?v=onVhbeY7nLM
Do it once on "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.
The brightness you see on Plasma or Mutter is indeed related to the HDR support in the driver. But - it's not really useful for the most common HDR tasks at the moment.
I am on Wayland and outputting via HDMI 2.1 if that helps.
EDIT: Claude explained how it determined this with drm_info, and manually verified it:
> Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.
EDIT: Also note that I am slowbanned on this site, so may not be able to respond for a bit.
EDIT: You should try connecting with HDMI 2.1 (you will need a 8k HDMI cable or it will fall back to older standards instead of FRL).
EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only one of my monitors, and I can see a big difference in the flames between them on this scene: https://www.youtube.com/watch?v=WjJWvAhNq34
Here's what I'm getting on both monitors, with HDR enabled on Gnome 49: https://imgur.com/a/SCyyZWt
Maybe you're lucky with the Dell. But as I understand, HDR playback on Chrome is still broken.
I'm actually surprised that YouTube HDR works on your side - perhaps it's tied to the ABGR2101010 output mode being available.
That's still pretty crappy. Monitors don't advertise whether they accept BGR input signals as opposed to RGB.
The GPU and monitor combination has full 10-bit HDR in Windows. But in Linux it's stuck at 8bpp due to nVidia driver not having 10-bit RGB output.
EDIT: See my sibling comment.
Here's what I'm getting on an RTX 4090 / InnoCN 27M2V and Cooler Master Tempest GP27U.
https://imgur.com/a/SCyyZWt
https://news.ycombinator.com/newsguidelines.html
However, despite really, really wanting to switch (and having it installed on my laptop), I keep finding things that don't quite work right and that prevent me from switching some of my machines. On my living room PC, which my TV is connected to, the DVR software that runs my TV tuner card doesn't quite work right (despite having a native Linux installer), and I couldn't get channels to come through as clearly and as easily. I spent a couple of hours troubleshooting and gave up.
My work PC needs to have the Dropbox app (which has a Linux installer), but it also needs the "online-only" functionality so that I can see and browse the entire (very large) Dropbox directory without needing to have it all stored locally. This feature has been requested on the Linux version of the app for years, and Dropbox appears unlikely to add it anytime soon.
Both of these are pretty niche issues that I don't expect to affect the vast majority of users (and the dropbox one in particular shouldn't be an issue at all if my org didn't insist on using dropbox in a way that it is very much not intended to be used, and for which better solutions exist, but I have given up on that fight a long time ago), and like I said, I've had linux on my laptop for a couple of years so far without any issue, and I love it.
I am curious how many "edge cases" like mine exist out there though. Maybe there exists some such edge case for a lot of people even while almost no one has the same edge case issue.
But some of the drawbacks really aren't edge cases. Apparently there is still no way for me to have access to most creative apps (e.g. Adobe, Affinity) with GPU acceleration. It's irritating that so few Linux install processes are turnkey the way they are for Windows/Mac, with errors and caveats that cost less-than-expert users hours of experimenting and mucking with documentation.
I could go on, but it really feels like a bad time to be a casual PC user these days, because Windows is an inhospitable swamp, and Linux still has some sharp edges.
It doesn't feel real sometimes. My dotfiles are modularized, backed up in Github and versioned with UEFI rollback when I update. I might be using this for the rest of my life, now.
One just needs to make sure to use the proper _rsync_ command options to preserve hard links, or files will be duplicated.
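For example, something along these lines -- the -H flag is the important one:

    # -a: archive mode, -H: preserve hard links, -x: don't cross filesystem boundaries
    rsync -aHx --info=progress2 /source/ /backup/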
It's not advisable to switch to one of these paranoid configurations outright, but they're a great introduction to the flexibility provided by the NixOS configuration system. I'd also recommend Xe's documentation of Nix Flakes, which can be used on any UNIX-like system including macOS: https://xeiaso.net/blog/nix-flakes-1-2022-02-21/
https://grahamc.com/blog/erase-your-darlings/
https://xeiaso.net/blog/paranoid-nixos-2021-07-18/
E.g. three weeks ago nvidia pushed bad drivers which broke my desktop after a reboot. I had to switch to another virtual console (Ctrl-Alt-F3 etc.), since I never got into GNOME at all, and roll back to an earlier version. Automatic rollback of bad drivers would have saved this.
Are Radeon drivers less shit?
Then again Arch is one of those distros that has the attitude that you need to be a little engaged/responsible for ongoing maintenance of your system, which is why I'm against blind "just use (distro)" recommendations unless it's very basic and low assumptions about the user.
[0] https://old.reddit.com/r/archlinux/comments/1prm8rl/archanno...
A couple of months ago I bought a second hand RX 7800 XT, and prepared myself for a painful experience, but I think it just worked. Like I got frustrated trying to find out how to download and install the driver, when I think it just came with Linux Mint already.
After a particularly busy OSS event, a non-programmer friend of mine asked me why the Linux people seem to be so needy for everyone to make the same choices they make. Trying to answer that question changed my perspective on the entire community. And here we are; after all these years the same question seems to still apply.
Why are we so needy for ALL users and use cases to be Linux-based and Linux-centric once we make that choice ourselves? What is it about Linux? The BSD people don't seem to suffer from this, and I've never heard anyone advocate for migration to OSX in spite of it being superior for specific use cases (like music production).
IMO if you're a creator, operating systems are tools; use the tool that fits the task.
I do understand the evangelism being obnoxious. I don’t advocate for people to switch if they have key use cases that ONLY windows or OS X can meet. Certainly not good to be pushy. But otherwise, people are really getting a better experience by switching to Linux.
Examples:
- An important document is sent to me in a proprietary format
- A streaming service uses a DRM service owned by a tech giant that refuses to let it work with open source projects
- A video game developer thinks making games work on Linux isn't worth getting rid of rootkit anticheat
The downside is Windows users would have to live in a world without subscription-based office suites, locked down media, and letting the CCP into your ring 0.
This is the sort of question an apolitical person would ask a liberal (I am aware liberalism has been tainted in recent times): why is it you people are so needy and constantly preaching about democracy?
One big plus with Linux, it's more amenable to AI assistance - just copy & paste shell commands, rather than follow GUI step-by-steps. And Linux has been in the world long enough to be deeply in the LLM training corpuses.
Linux/x86 still is poor for battery life compared to Apple.
That’s my impression anyway.
Ubuntu seems to be slowly getting worse.
- Firefox seems to be able to freeze both itself and, sometimes, the whole system. Usually while typing text into a large text box.
- Recently, printing didn't work for two days. Some pushed update installed a version of the CUPS daemon which reported a syntax error on the cupsd.conf file. A few days later, the problem went away, after much discussion on forums about workarounds.
- Can't use more than half of memory before the OOM killer kicks in. The default rule of the OOM killer daemon is that if a process has more than half of memory for a minute, kill it. Rust builds get killed. Firefox gets killed. This is a huge pain on the 8GB machine. Yes, I could edit some config file and stop this, but that tends to interfere with config file updates from Ubuntu and from the GUI tools.
None of these problems existed a year ago.
But you can adjust your own system. It'd be unhelpful of me to suggest to an unhappy Windows user that they should switch to another operating system, as that demands a drastic change of environment. On the other hand, you're already familiar with Linux, so the switching cost to a different Linux distribution is significantly lower. Thus I can fairly say that "Ubuntu getting worse" is less of a problem than "Windows getting worse." You have many convenient options. A Windows user has fewer.
Ubuntu’s default desktop felt unstable in a macOS VM. Dual-booting on a couple of HP laptops slowed to a crawl after installing a few desktop apps, apparently because they pulled in background services. What surprised me was how quickly the system became unpleasant to use without any obvious “you just broke X” moment.
My current guess: not Linux in general, but heavy defaults (GNOME, Snap, systemd timers), desktop apps dragging in daemons, and OEM firmware / power-management quirks that don’t play well with Linux. Server Linux holds up because everything stays explicit. Desktop distros hide complexity and don’t give much visibility when things start to rot.
Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
For certain time periods I have needed to switch to Fedora, or the Fedora KDE spin, to get access to more recent software if I'm using newer hardware. That has generally also been pretty stable, but the constant stream of updates and short OS life are not really what I'm looking for in a desktop experience.
There are three issues that linux still has, which are across the board:
- Lack of commercial mechanical engineering software support (CAD & CAE software)
- Inability to reliably suspend or sleep for laptops
- Worse battery life on laptops
If you are using a desktop and don't care about CAD or CAE software I think it's probably a better experience overall than windows. Laptops are still more for advanced users imho but if you go with something that has good linux support from the factory (Dell XPS 13, Framework, etc.) it will be mostly frictionless. It just sucks on that one day where you install an update, close the laptop lid, put it in your backpack, and find it absolutely cooking and near 0% when you take it out.
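One quick thing worth checking when that happens is whether the machine is stuck on s2idle or can do a deeper suspend; a small sketch (standard sysfs paths, and 'deep' only appears if the firmware offers it):

    # The bracketed entry is the suspend mode currently in use
    cat /sys/power/mem_sleep            # e.g. "s2idle [deep]" or just "[s2idle]"
    # If 'deep' is listed, try it for the current boot
    echo deep | sudo tee /sys/power/mem_sleep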
I also have never found something that gave me the battery life I wanted with linux. I used two XPS 13's and they were the closest but still were only like 75% of what I would like. My current Framework 16 is like 50% of what I would like. That is with always going for a 1080p display but using a VPN which doesn't help battery life.
My experience with FOSS has mostly been that mature projects with any reasonable-sized userbase tend to more reliably not break things in updates than is the case for proprietary software, whether it's an OS or just some SaaS product. YMMV. However, I think probably the most potent way to avoid problems like this actually ever mattering is a combination of doing my updates manually (or at least on an opt-in basis) and being willing to go back a version if something breaks. Usually this isn't necessary for more than a week or so for well-maintained software even in the worst case. I use arch with downgrade (Which lets you go back and choose an old version of any given package) and need to actually use downgrade maybe once a year on average, less in the last 5
No, not really. A Linux desktop with a DE will always be slower and more brittle than a headless machine due to the sheer number of packages/components, but something like Arch + Plasma Shell (without the whole KDE ecosystem) should be very stable and snappy. The headaches caused by immutable distros and flatpaks are not worth it IMO, but YMMV.
Not really, no. What did you install that slowed things down?
> If yes, what actually works long-term?
Plain ordinary Ubuntu 24.04 LTS, running on an ancient Thinkpad T430 with a whopping 8GB of RAM and an SSD (which is failing, but that's not Linux's fault, it's been on its way out for about a year and I should probably stop compiling Haiku nightlies on it).
Can you give an example of which desktop apps are "dragging in daemons"?
[1] - https://en.wikipedia.org/wiki/Daemon_(computing)
If you think Gimp is terrible you'll hate something like DaVinci Resolve.
I've run Void Linux + Xmonad for many years without any such issues. I also recently installed CachyOS for my kid to game on (KDE Plasma) and it works super well.
The Linux world is amazing for its experimentation and collaboration. But the fragmentation makes it hard for even technical people like me who just want to get work done to embrace it for the desktop.
Ubuntu LTS is probably the right choice. But it's just one more thing I have to go research.
If using Ubuntu LTS for gaming, you might want to add a newer kernel: https://ubuntu.com/kernel/lifecycle
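For example (assuming 24.04; the HWE metapackage name follows the same pattern on other LTS releases):

    # Opt into the hardware-enablement kernel stack on Ubuntu LTS
    sudo apt install linux-generic-hwe-24.04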
Linux Mint would also be a reasonable pick.
Any reasonably popular distro will have enough other users that you can find resources for fixing hitches. The deciding factor that made me go with EndeavourOS was that their website had cool pictures of space on it. If you don't already care then the criteria don't need to be any deeper than that.
Once you use it enough to develop opinions, the huge list of options will thin itself out.
I haven't tried Bazzite because I'm not into gaming but Linux Mint is working very well for a lot of people coming from Windows. It just works and has great defaults. Windows users seem to pick it up pretty easily.
Also, Linux Mint upgrades very well. I've had a lot of success upgrading to new versions without needing to reinstall everything. Ubuntu and other distros I've tried often have failed during upgrading and I had to reinstall.
It's a slow moving evergreen topic perfect for a scheduled release while the author is on holiday. This is just filler content that could have been written at any point in the last 10 years with minor changes.
I've not seen anything like the current level of momentum, ever, nor this level of mainstream exposure. Gaming has changed the equation, and 2026 will be wild.
On the other hand, on the Linux side, we had the release of COSMIC, which is an extremely user-friendly desktop. KDE, Gnome, and others are all at a point where they feel polished and stable.
The level of momentum feels roughly equivalent to the era of Ubuntu coming around in the mid-2000s. We have been here before.
1. 'office' cloud services - now you just need a browser for majority of docs/sheet/slides tasks
2. gaming - while it was possible back then, it was really hit or miss from game to game. Nowadays the vast majority of games work on Linux out of the box.
The bloat is astounding. This is especially egregious now that RAM costs a fortune.
To be honest, I always figured we'd make it in the long run. We're a thrifty bunch, we aim to set up sustainable organizations, we're more enshittification-resistant by nature. As long as we're reliable and stick around for long enough.
One thing that can be annoying is how quickly things have moved in the Linux gaming space over the past 5 years. I have been a part of conversations with coworkers who talk about how Linux gaming was in 2019 or 2020. I feel like anyone familiar with Linux will know the feeling of how quickly things can improve while documentation and public information cannot keep up.
Linux still suffers from the same fragmentation issue: oh, you want to play games, you should use distro X; oh, you want average web browsing and office work, you should use distro Y; or for programming, use Z. Of course all of them can do what the others can do, but the community has decided that's the way it is.
Yesterday I read a Reddit thread about a user sharing his issue with Pop!_OS, and most (if not all) comments said he was using the wrong distro. He was using the latest release (not the nightly build), which is a reasonable thing to do as a new user.
Not sure if Linux Mint has changed this, but I remember having to add the "non-free" repo to use the official Nvidia driver. Not a big deal for people who know what they are doing, but still, that is unnecessary friction.
Not the best, but works for me.
I put CachyOS on it. Using Steam, just run the game's installer by adding it as a game to your library -- you then select which Proton you want (cachyos-proton) from a dropdown in the Properties in the Steam library. That's it.
it's lightweight, arch (I ditched manjaro), runs KDE and games perfectly, cursor IDE runs great, VMS run great.
first thing I did when I got it from fedex was remove Windows and put Linux on it. I thought 'maybe I'll just bite the bullet and sign up a Microsoft cloud account to be able to access ..my desktop' and 1/4 through its install I held the power button and popped a flash drive in. just say no to windows and you'll all be happy, trust me.
the only effort it required was for me to say f this on using Lutris and just use Steam as the wrapper.
2026 is definitely the year for linux. every year is. valve heavily invested in Arch, proton, and is using Linux on their devices and honestly: Windows is spyware, and after their vibe coded jank 25H2 update that broke a ton of things and Windows 10 being EOL, I hope more people get to enjoy throwing Ventoy on a USB stick with a bunch of linux isos copied over to it and boot and play with what they love.
so I disagree, 2026 is the year for Linux, and Linux is love.
The success measurements are quite strange. How am I supposed to think Linux is finally good when 96.8% of users do not care to adopt it. I can't think of anything else with that high of a rejection rate. The vast majority do not consider it good enough to use over Windows.
So far all the games I want to play run really well, with no noticeable performance difference. If anything, they feel faster, but it could be placebo because the DE is more responsive.
I've been using Linux full-time (no other OSes at all) for nearly 20 years. Went through all my university education using only Linux. It's problem free if you use it like a grandma would (don't mess with the base system) and even if you mess with it, most things are easily reversible.
That being said, I have noticed that the newfound interest in Linux seems to be a result of big tech being SO abusive towards its customers that even normies are suddenly into computing "freedom".
This is more about what you choose as your operating environment, not what your work imposes as your working environment.
Most places of work, mine included, run Microsoft services that lock them into the ecosystem incredibly tightly.
As per the article title, "if you want to feel like you actually own your PC", this is about your PC, not the one provided to you by your workplace (since it's likely owned by them).
One thing I'm worried about in my work environment is Microsoft enforcing the web versions of Office and deprecating the stand alone desktop applications. The web versions are a massive step down in terms of functionality and ease of use. Your mention of OWA makes me feel as if that is what Outlook will be sacrificed for at some point in the future anyway.
Yes, you can get this stuff working, but if you enjoy doing other things in life, have a job, and don't live alone, it is SSSOOOOO much easier to get a Mac mini. Or even Windows 11 if that's your thing.
IMO the next important unblocker for Linux adoption is the Adobe suite. In a post-mobile world one can use a tablet or phone for almost any media consumption. But production is still in the realm of the desktop UX, and photo/video/creative work is the most common form of output. An Adobe CC Linux option would enable that set of "power users". And regardless of their actual percentage of desktop users, just about every YouTuber or streamer talking about technology is by definition a content creator, so opening Linux up to them would have a big effect on adoption.
And yes I've tried most of the Linux alternatives, like GIMP, Inkscape, DaVinci, RawTherapee, etc. They're mostly /fine/ but it's one of the weaker software categories in FOSS-alternatives IMO. It also adds an unnecessary learning curve. Gamers would laugh if they were told that Linux gaming was great, they just have to learn and play an entirely different set of games.
I've used Mint in the past, loved it until I spent a day trying to get scanner drivers to work. Don't know if that's changed now, was 4 years ago
I am using Fedora on machines with new hardware and liking it as well. It has small pluses/minuses vs Mint.
https://universal-blue.org/
And if you are running Chrome, and something starts taking a lot of memory, say goodbye to the entire app without any niceties.
(Yes, this is a mere pet peeve, but it has been causing me so much pain over the past year, and it's such an inferior way to deal with memory limits compared to what came before it, that I don't know why anybody would have taken OOM logic from systemd services and applied it to user-launched processes.)
If anybody can help me out with a better solution with a modern distribution, that's about 75% of the reason I'm posting. But it's been a major pain and all the GitHub issues I have encountered on it show a big resistance to having better behavior like is the default for MacOS, Windows, or older Linux.
Regardless, I believe EarlyOOM is pretty configurable, if you care to check it out.
If you want a distro that really cares about the desktop experience today, try Linux Mint. Windows users seem to adapt to it quite quickly and easily. It's familiar and has really good defaults that match what people expect.
Try doing less at once, or getting more memory.
If your solution is "don't ever run out of memory" my solution is "I won't ever use your OS unless forced to."
Every other OS handles this better, and my work literally requires pushing the bounds of memory on the box, whether it's 64GB or 1TB of RAM. Killing an entire cgroup is never an acceptable solution, except for the long-running servers that systemd is meant to run.
Windows is unstable even if you have more than enough memory but your swap is disabled, due to how its virtual memory works. It generally behaves much worse than others under heavy load and when various system resources are nearly exhausted.
There are several advanced and very flexible OOM killers available for Linux, you can use them if it really bothers you (honestly you're the first I've seen complaining about it). Some gaming/realtime distros are using them by default.
Of course, if it's absolutely not compatible with your work, you can just disable systemd-oomd. I'm wondering though, what sort of work are you doing where you can't tune stuff to use 95% of your 1TB of memory instead of 105% of it?
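For anyone who does want to tune or disable it, a rough sketch (the option names are the documented oomd.conf ones; the values are arbitrary examples):

    # Inspect what systemd-oomd is currently watching
    oomctl
    # Relax the thresholds via a drop-in
    sudo mkdir -p /etc/systemd/oomd.conf.d
    sudo tee /etc/systemd/oomd.conf.d/relaxed.conf <<'EOF'
    [OOM]
    DefaultMemoryPressureLimit=80%
    DefaultMemoryPressureDurationSec=60s
    EOF
    sudo systemctl restart systemd-oomd
    # Or turn it off entirely
    sudo systemctl disable --now systemd-oomd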
A bit hard to do now :(
Using hardware at least 6-12 months old is a good way to get better compatibility.
Generally Linux drivers only start development after the hardware is available and in the hands of devs, while Windows drivers usually get a head start before release. Brand new hardware on a LTS (long term support) distro with an older kernel is usually the worst compatibility combo.
I recently switched to using a thumb drive to transfer files to and from my phone/tablet; I became demoralized when faced with getting it all set up.
No, thank you! The "smooth, effortless [, compulsory, mandated, enforced] integration" between my Apple devices is the very worst thing about them.
If I have an issue with an application, or if I want an application, I must use the terminal. I can't imagine a Mac user bothering to learn it. Linux is for people who want to maximize the use of their computer without being spied on and without weird background processes. Linux won't die, but it won't catch Windows or Mac in the next five decades. People are too lazy for it. Forget about learning. I bet you $100 that 99% of people in the street have never even seen Linux in their lives, nor even heard of it. It is not because of marketing; it is because people who tried it returned to Windows or Mac after deciding it was too hard for them to learn how to install a driver or an application.
If I remember correctly, after the Crowdstrike BSOD-all-windows-instances update last year Microsoft wanted to make some changes to their kernel driver program and these anti-cheat measures on Windows might need to find a new mechanism soon anyway. That's a long way of saying, it's plausible that even that last barrier might come down sooner rather than later.
Some interesting reads on what modern anticheats do:
https://github.com/0avx/0avx.github.io/blob/main/article-3.m...
https://github.com/0avx/0avx.github.io/blob/main/article-5.m...
https://reversing.info/posts/guardedregions/
https://game-research.github.io/ (less in detail and less IDA pseudo)
Not up close due to the vast number of inconsistencies.
This could only be fixed by a user experience built from the ground up by a single company.
I get that you're making a Windows joke, but this describes Linux equally well.
The UX leaves a lot to be desired.
Even modern macs fall short of the UX Apple has traditionally been known for...
MacOS is highly consistent compared to Windows.
Perhaps Linux operating systems like Steam or ChromeOS might finally create a beautiful and consistent UI.
People dual boot SSD OS for very good reasons, as kernel permutation is not necessarily progress in FOSS. Linux is usable by regular users these days, but "Good" is relative to your use case. YMMV =3
And it mostly works! At least for my games library. The only game I wasn't able to get to work so far is Space Marine 2, but on ProtonDB people report they got it to work.
As for the rest: I've been an exclusive Linux user on the desktop for ~20 years now, no regrets.
Can I get a laptop to sleep after closing the lid yet?
Not that long ago the answer to these questions was mostly no (or sort of yes... but very painfully)
On Windows all of this just works.
> on windows all of this just works
Disagree on the sleep one - my work laptop doesn’t go to sleep properly. The only laptop I’ve ever used that behaves as expected with sleep is a macbook.
It’s more than fine for people to dislike Apple products but this is simply not an area where other platforms have them beat.
I'm also seeing results for "macbook pro doesn't go to sleep when lid closed", so other people see this problem too. You can't really claim that other platforms have them beat here if there isn't data to support the claim.
All in all, I've given up on sleep entirely and default to suspend/hibernate now.
There are valid reasons why a program might need to block sleep, so it's not like macOS is going to hard-prevent it if a program does this. Most programs should not be doing that though.
Laptop sleep and suspend can still be finicky unfortunately.
I will say my experience using CAD or other CAE software on windows has gotten progressively worse over the years to the point that FEA is more stable on linux than on windows.
We do really need a Solidworks, Creo or NX on linux though. My hope has been that eventually something like Wine, Proton, or other efforts to bring windows games to linux will result in us getting the ability to run them. They are one of the last things holding me back from fully moving away from windows.
I added a couple VMs running windows, linux, and whatever else I need in proxmox w/ xrdp/rdp and remina, and it's really the best of both worlds. I travel a good deal and being able to remotely connect and pick up where I left off while also not dealing with windows nagware has been great.
It's funny they would choose this phrasing.
This is exactly the way I described my decision to abandon windoze, and switch to linux, over 20 years ago...
I tried Cinnamon and while it was pleasantly customizable, the single-threadedness of the UI killed it for me. It was too easy to do the wrong thing and lock the UI thread, including with several desktop or tray Spices from the official repo.
I'm switching to KDE. Seems peppier.
Biggest hardware challenge I've faced is my Logitech mouse, which is a huge jump from the old days of fighting with Wi-Fi and sound support. Sound is a bit messy with giving a plethora of audio devices that would be hidden under windows (like digital and analog options for each device) and occasionally compatibility for digital vs analog will be flaky from a game or something, but I'll take it.
Biggest hassle imho is still installing non-repo software. So many packages offer a flatpak and a snap and build-from-source instructions where you have to figure out the local package names for each dependency, and they offer one .deb for each different version of Debian and its derivatives, and it's just so tedious to figure out which is the right one.
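When a Flatpak does exist, at least that path is the same on every distro, e.g. (Flathub remote plus a well-known app ID as the example):

    flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
    flatpak install flathub org.mozilla.firefox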
In case it helps:
https://github.com/libratbag/piper
On one hand we have Steam, which will make 1000s of games available on an easy-to-use platform based on Arch.
For developers, we have Omarchy, which makes the experience much more streamlined and very pleasant and productive. I moved both my desktop and laptop to Omarchy and have one Mac laptop; this is a really good experience. Not everything is perfect, but when I switch to the Mac after Omarchy, I often discover how not-so-easy the Mac is to use, how many clicks it takes to do something simple.
I think both Microsoft and Apple need some serious competition. And again, this is coming from Arch, which turned out to be more stable and serious than Ubuntu.
I also play a decent amount of Flight Simulator 2024 and losing that is almost a non-starter for switching.
As many have pointed out, the biggest factor is obviously the enshittification of Microsoft. Valve has crept up in gaming. And I think understated is how incredibly nice the tiling WMs are. They really do offer an experience which is impossible to replicate on Mac or Windows, both aesthetically and functionally.
Linux, I think, rewards the power user. Microsoft and Apple couldn't give a crap about their power users. Apple has seemed to devolve into "Name That Product Line" fanboy fantasy land and has lost all but the most diehard fans. Microsoft is just outright hostile.
I'm interested to see what direction app development goes in. I think TUIs will continue to rise in popularity. They are snappier and overall a much better experience. In addition, they work over SSH. There is now an entire overclass of power users who are very comfortable moving around in different servers in shell. I don't think people are going to want to settle for AI SaaS Cloudslop after they get a taste of local first, and when they realize that running a homelab is basically just Linux, I think all bets are off as far as which direction "personal computing" goes. Also standing firmly in the way of total SSH app freedom are IPhone and Android, which keep pushing that almost tangible utopia of amazing software frustratingly far out of reach.
It doesn't seem like there is a clear winner in the noob-friendly distro category; they're all pretty good, and the gaming distros seem really effective. I finally installed Omarchy, having long thought "I don't need it, I can rice my own Arch", and I must say the experience has been wonderful.
I'm pretty comfortable at the "cutting edge" (read: with all my stuff being broken), so my own tastes in OS have moved from Arch to the systemd-free Artix, or OpenBSD. I don't really see the more traditional "advanced" Linuxes like Slackware or Gentoo pulling much weight. I've heard interesting things about users building declarative Nix environments, and I think that's an interesting path. Personally, I hope we see some new, non-Unix operating systems that are more data- and database-oriented than file-oriented. For now, OpenBSD feels very comfortable: I have a prayer of understanding what's on my system, and I learn things by using it, the latter of which is also a feature of Arch. The emphasis on clean and concise code is really quite good, and serves as a good reminder that for all the "memory safe" features of the new languages, it's tough to beat truly great C developers for code quality. If you're going to stick with Unix, you might as well go for the best.
More and more I find myself wanting to integrate "personal computing" into my workflow, whether that's apps made for me and me alone, Emacs Lisp, custom Vim plugins, or homelab stuff. I look with envy at the Smalltalks of the world, like Marvelous Toolkit, the Forths, or the Clojure-based Easel. I really crave fluency: the ability for code to just pour out, with none of the hesitation or system-knowledge gaps that come from Stack Overflow or LLM use. I want mastery. I've also become much more tactical about which code I want to maintain; I've really tried to curb "not invented here" syndrome, because eventually you realize you aren't going to be able to maintain it all. Really, I just want a fun programming environment where I can read Don Knuth and make wireframe graphics demos!
Instead of distro upgrades, spend 3 minutes disabling the newest AI feature using regedit.
But, as the author rightly notes: It's more about a "feeling." Well then, good luck.
Please revert this submission to use the correct title.
Even with imperatively configured distros like Ubuntu, it's generally much easier to recover from a "screen of death" than on Windows, because the former is less of a black box than the latter. That means it's easier to work out what the problem is and find a fix for it, and with LLMs that's now easier than ever.
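As a concrete example of the "less of a black box" point: the failed boot's logs are right there to read. A rough sketch, assuming systemd with a persistent journal (paths here are assumptions):

    journalctl -b -1 -p 3                      # errors from the previous (failed) boot
    journalctl -D /mnt/var/log/journal -p 3    # same idea from a live USB, pointed at the installed system's journal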
And in the worst case, where you do have to resort to reinstalling your system, it's far less unpleasant on a Linux distro than on Windows. The modern Windows installer is painful to get through, and then you face hours or days of manually reinstalling and reconfiguring software; on Linux, a small handful of commands gets you back to a state reasonably similar to what you had before.
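The "small handful of commands" part, as a rough Debian/Ubuntu-flavoured sketch (the file name is arbitrary):

    # on the old system, or recovered from a backup
    apt-mark showmanual > my-packages.txt
    # on the fresh install
    xargs -a my-packages.txt sudo apt-get install -y
    # then restore dotfiles and data with your usual backup tool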
Incidentally, I can now honestly say I've had more driver issues with Windows than Linux.
The result was that from day 1 of using Linux I never looked back.
Fun aside: I had a hardware failure a few years ago on my old workstation where the first few sectors of every disk got erased. I had Linux up and running in 10 minutes: I just had to recreate the EFI partition and regenerate a UKI after mounting my OS from a live USB. I didn't even miss a meeting I had 15 minutes later. By contrast, I spent hours trying to recover my Windows install. I'm rather familiar with the (largely undocumented) Windows boot process, but I just couldn't get it to boot after hours of work, so I gave up, reinstalled Windows from scratch, and recovered from a restic backup.
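For anyone wanting to picture that kind of recovery, roughly the shape of it (a sketch only, assuming an Arch-style install with mkinitcpio UKI presets and systemd-boot; device names are made up, and if the partition table itself is gone you'd first restore the GPT from its backup copy or recreate the partitions):

    # from the live USB
    mkfs.fat -F 32 /dev/nvme0n1p1     # recreate the wiped ESP
    mount /dev/nvme0n1p2 /mnt         # the surviving root filesystem
    mkdir -p /mnt/efi
    mount /dev/nvme0n1p1 /mnt/efi
    arch-chroot /mnt
    bootctl install                   # reinstall systemd-boot into the fresh ESP
    mkinitcpio -P                     # rebuild the initramfs / UKIs from the configured presets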
Windows has recently been a complete shitshow, so even if Linux hadn't gotten any better (it has), it would now likely beat fiddling around unfucking Windows and having Windows do things like delete all your files.
That's exactly my point.
There's an ever-growing list of things to do in order to fix Windows, and that list is likely longer than Linux's. The whole "your time is free" argument hinges on Windows not having exactly the same issue, or worse.