Somehow, with 12GB of RAM, I can't get my iPhone 17 Pro to keep more than a few Safari tabs open without having them refresh when I come back from an app or two, and it makes me want to throw my phone across the train (where the internet often cuts out!).
A lot of software has been squandering the massive hardware gains that have been made. I hope this changes when it becomes a lot harder to throw hardware at the problem.
I also wonder what this means for smartphone-esque devices like the Switch 2. If this goes on long enough, I won't be surprised if they release a 'lite' model with less RAM/storage and bifurcate their console capabilities, worse than what they did with the 3DS > 2DS.
It's really nuts how much RAM and CPU have been squandered. In 1990, I worked on a networked graphical browser for nuclear plants. Sun workstations had 32 MB of memory. We had a requirement that the infographic screens paint in less than 2 seconds. It was a challenge but doable. The crazy thing is that computers now have 1000x the memory and something like 10,000x the CPU, and it would still be a challenge to paint screens in 2 seconds.
Yes, the web was a mistake; as a distributed document reading platform it's a decent first attempt, but as an application platform it is miserable. I'm working on a colleague's vibe-coded app right now and it's just piles and piles of code to do something fairly simple; long builds and hundreds of dependencies... most of which are because HTML is shitty, doesn't have the GUI controls that people need built in, and all of it has to be worked around as a patch after the fact. Even doing something as simple as a sortable-and-filterable table requires thousands of lines of JS when it should've just been a few extra attributes on an HTML6 <table> by now.
Back in the day with PHP things were much more understandable, it's somehow gotten objectively worse. And now, most desktop apps are their own contained browser. Somehow worse than Windows 98 .hta apps, too; where at least the system browser served a local app up, now we have ten copies of Electron running, bringing my relatively new Macbook to a crawl. Everything sucks and is way less fun than it used to be.
We have many, many examples of GUI toolkits that are extremely fast and lightweight. Isn't it time to throw the browser away, stop abusing HTML to make applications, and design something fit for purpose?
It's not "the web" or HTML, CSS, or JavaScript. That's all instant in vanilla form. Any media at today's quality will of course take time to download but, once cached, is also instant. None of the UX "requires" the crap that makes it slow, certainly not thousands of lines to make a table sortable and filterable. I could do that in IE6 without breaking a sweat. It's way easier, and faster, now. It's just people being lazy about how they do it, apparently now just accepting whatever Claude gave them as "best in show".
Back in the PHP days you had an incentive to care about performance, because it was your servers that were overloaded. With frontend there's no such issue, because it's not your hardware that's under load.
I was trying to upload a 300MB video via the local police's web interface, a very important matter. I had to set my phone screen to stay on for 30 minutes and then leave the web browser open without touching it. Disabling all power saving measures made no difference. This was the only way I could get it to finish uploading. I'm on a Pixel 8 Pro with GrapheneOS. Same thing in both Firefox and Vanadium. I don't think it runs out of RAM; the system is just too trigger happy. The battery still doesn't last all day anyway.
My iPhone 8 stopped working 2 months back (the phone works but the microphone used in phone calls no longer does), so by chance my good friend gave me his Pixel 8 that was only a few months old. It has a pink line down the screen that comes and goes; pressing in one spot can usually make it go away, but he's a business owner and couldn't risk the screen going from a line to not working for a day, as a missed communication could cost him thousands. So he said "here, take it" and got a new one. Seems like this pink line is common and a defect in some screens.
Anyways, I wanted to say I also have a Pixel 8, but with the stock OS, and my battery typically lasts a full day with average usage. My iPhone 8, even with a replacement battery, was lucky to last more than 5 hours. I had to charge that thing multiple times a day.
> A lot of software has been squandering the massive hardware gains that have been made. I hope this changes when it becomes a lot harder to throw hardware at the problem.
Considering how many people are so averse to programming that they use LLMs to generate code for them? Not very likely IMO. I would like to see it happen, but people seem allergic to actually trying to be good at the craft these days.
From everything I’ve seen, LLMs aren’t exactly known for writing extremely optimized code.
Also, what happens to the stability and security of my phone after they let an LLM loose on the entire code base for a weekend?
There are 1.5 billion iPhones out there. It’s not a place to play fast and loose with bleeding edge tech known for hallucinations and poor architecture.
You can also tell it the optimization to implement.
I asked Claude to find all the valid words on a Boggle board given a dictionary and it wrote a simple implementation that basically tried to search for every single word on the board. Telling it to prune the dictionary first by building a bit mask of the letters in each word and on the board and then checking if the word is even possible to have on the board gave something like a 600x speedup with just a simple prompt of what to do.
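The pruning described above is easy to sketch. This is illustrative Python, not the commenter's actual code, and the function names are made up: each word and the board get a 26-bit mask of which letters occur, and a word whose letters aren't a subset of the board's letters can't possibly be found.

```python
def letter_mask(s: str) -> int:
    """26-bit mask with bit i set if the letter chr(ord('a') + i) occurs in s."""
    m = 0
    for ch in s.lower():
        if ch.isalpha():
            m |= 1 << (ord(ch) - ord("a"))
    return m

def prune_dictionary(words, board_letters):
    """Drop words that use any letter absent from the board entirely.
    Survivors still need a real board search; this just shrinks the input."""
    board_mask = letter_mask("".join(board_letters))
    return [w for w in words if letter_mask(w) & ~board_mask == 0]

words = ["cat", "cats", "race", "zebra", "crate"]
print(prune_dictionary(words, list("catsre")))  # "zebra" is pruned
```

The check is deliberately conservative (it ignores letter counts and adjacency), which is exactly why it's cheap: one AND per word instead of a full board search per word.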
That does assume that one has an idea of how to optimize though and what are the bottlenecks.
Can we assume at this point if the problems are well known, the low hanging fruit has already been addressed? The Boggle example seems like a pretty basic optimization that anyone writing a Boggle-solver would do.
iOS is 19 years old, built on top of macOS, which is 24 years old, built on top of NeXTSTEP, which is 36 years old, built on top of BSD, which is 47 years old. We’re very far from greenfield.
They kind of do if you prompt them. I had mine reimplement the Windows calculator (almost fully feature complete) in Rust, running in 2 MB of RAM instead of the 40 MB or whatever the Win 11 version uses, as a POC.
A handwritten C implementation would most likely be better, but there is so much to gain from just slaughtering the abstraction bloat that it doesn't really matter.
I feel like my 3GS was way better about resuming where I left off than any fancy new iPhone I’ve had in the past few years.
Big-name apps like Facebook, YouTube, Apple Music, and Apple Podcasts seem totally disinterested in preserving my place.
YouTube being the worst where I often stack a bunch of videos in queue, pause to do something else for a while and when I return to the app the queue has been purged.
YouTube will literally resume back to exactly where I was, then seemingly noticing that I switched back to it, go ahead and close the video I was watching. With all sorts of animations too, it's not just a case of having showed a cached screenshot. YouTube seems to intentionally forget where in a video I was, often after having been paused in the background for only a minute or two.
See if turning off your ad blocker makes a difference. I've noticed that sometimes YouTube has parts of the site that apparently can look to ad blockers like they are part of an ad (maybe intentionally, to annoy people with ad blockers?).
Likely some kind of complex refresh operation that kicks off when entering the foreground and takes a few seconds to complete before overwriting your state.
YouTube on TVs will often keep closed captioning on when switching accounts, then notice that CC is on and turn it off. Even though every account in the household always has CC turned on.
I feel like that's definitely a choice for Facebook at least - there's no technical reason the app couldn't remember at least the post you were looking at. I think they literally don't care if you were halfway through reading something when you flicked out of the app and go back in - refreshing the page and showing you all new stuff is probably measurably "better for 'engagement'" by whatever silly metrics they use.
Youtube/Google just make these shitty small annoying decisions just to make the iOS experience that little bit more annoying than it has to be.
Case in point — Youtube background play doesn’t pause when Siri makes an announcement, so if you’re listening to something you get two voices over each other.
I gave it the benefit of the doubt and figure it must be some kind of iOS thing, until I was listening to Audible one day and it paused automatically. So it’s just a google thing, not a third-party apps thing.
I have the same issue with the YouTube queue — this is something that could easily be persisted, but they just choose not to.
Too slow to edit. But also, Now Playing just seems to go away after a while. Why isn't this written to some nonvolatile place and just preserved? It feels like it must be on purpose, but I wonder what the purpose is.
I assume the purpose of the Now Playing clearing after a while is the idea that when people start a "new session" with their device it should be "clean". Like, if Now Playing didn't randomly disappear then for most people it would always be on, indicating some paused music or podcast playback. It would also never give a chance for that elusive "start playing" experience that shows up in its place sometimes to recommend that I listen to one of four songs/podcast episodes.
I find myself saving a ton of stuff to my Watch Later list, because I can’t trust the Back button when using YouTube. This issue exists on the phone, web, and AppleTV. YouTube just likes to randomly refresh everything. It’s the most annoying “feature”.
Even system apps like Photos have completely given up on state restore. I'm deep in an album comparing a photo to something on the web? Sorry, Safari needs all that RAM, Photos is kicked out, and Photos can't possibly remember you were inside an album (despite, you know, all the APIs Apple specifically has to manage this [0]). They USED to care about these things and made it seamless enough that you weren't supposed to know the app was killed in the background, but they just don't seem to care anymore.
Now is bad too, but my recollection is that the iPhone 3G-era task killer was EXTREMELY aggressive and required "tricks" to keep your state in the one app you could run.
I feel like this might be intentional to a certain degree, at least on YouTube or Facebook.
If you switched off the app while looking at a certain post or watching a certain video, that's a negative engagement indicator, so the app wants to throw you back into the algorithmic feed to show you something new instead.
Ad blockers don't work anymore, at least not with the version YT serves me. If it thinks that I have an ad blocker active (false positives happen too), it will only show a black rectangle and not even load the comments.
On a tangent, how about those sweet app updates with patch notes reading "bug fixes" every week or so, from the likes of Xiaomi and Anker, weighing in at 600-700MB.
It's all gone to $hit; efficiency is gone, it's just slop on top of more slop.
I had a China phone with amazing specs but it KEPT KILLING EVERYTHING.
Hardware is pretty useless if the software that drives it is useless. I don't know, it probably works better in China; all I know is that I went back to good old Samsung.
It's a pervasive Chinese phone problem. I've used many and they all have "Battery saving" features on by default, which means killing background apps after a while apparently. Battery life is great, but newly installed apps sometimes don't work as they should.
The market demands must be different there. I've disabled "battery optimisation" for all the apps I need to stay open (and some apps even prompt me to disable it!), and I don't have any issues in daily use.
That kind of aggressive process termination should become less common now that Android has introduced the freezer [1] optimization, which puts a background process into a completely unscheduled state.
iOS I think has really aggressive background task killing, and it also drives me insane. I know they do it for battery life but I'm about ready to switch to Android, and would have a long time ago if I that didn't also mean replacing my watch, headphones, etc.
Is it too much to ask for me to manage my own background processes on my phone? I don't want the OS arbitrarily deciding what to pause & kill. If it actually does OOM, give me a dialog like macOS and ask me what to kill. Then again, if a phone is going OOM with 12GB of RAM there's a serious optimization problem going on with mobile apps.
> iOS I think has really aggressive background task killing, and it also drives me insane. I know they do it for battery life but I'm about ready to switch to Android, and would have a long time ago if I that didn't also mean replacing my watch, headphones, etc.
Android does all sorts of wacky stuff with background tasks too... Although I don't feel like my 6 GB Android is low memory, so maybe there's something there, but I also don't run a lot of apps, and I regularly close Firefox tabs. Android apps do mostly seem well prepared for background shenanigans, cause they happen all the time. There's the AOSP/Google Play background app controls, but also most of the OEMs do some stuff, and sometimes it's very hard to get stuff you want to run in the background to stay running.
I dunno about watches, but Airpods work fine with Android, as long as you disconnect them from FindMy cause there's no way to make them not think they're lost (he says authoritatively, hoping to be corrected).
On Android, of course, it depends on the configuration. I am running LineageOS 23 on an older device with 6GB of RAM as well, and it would kill basically anything (making e.g. paying with a credit card a pain when you have to switch to the bank app to confirm a transaction). I had to adjust a few variables for ZRAM control and now it's seamless.
iOS doesn't have aggressive background task killing except under memory pressure. It suspends apps for battery life; it only kills them under memory constraints. If you don't want apps dying and tabs closing, use apps that use less memory. iOS does not have swap, to avoid unnecessary NAND wear (and the performance impact), so it must kill things more aggressively.
So I have Safari open and I can't switch to my email? Both are native apps, and sometimes I lose the state of Safari if I move away for more than 10 seconds. I have to keep switching between the two apps to keep my Safari tab alive. Insanity.
I recently started learning how to do iOS apps for work and the short answer is: you don't.
Apple seemingly wants all apps to be static jpegs that never need to connect to any data local or remote, and never do any processing. If you want to do something in the background so that your user can multitask, too damn bad.
You can run in the background, for a non-deterministic amount of time. If you do that, iOS nags your user to make it stop. If you access radios, iOS nags your user to disable it.
It's honestly insane. I don't know why or how anyone develops for this platform.
Not to mention the fact that you have to spend $5k minimum just to put hello world on the screen. I can't believe that Apple gets away with forcing you to buy a goddamn Mac to compile a program.
I've never felt nagged. Every time I get one of those popups, which isn't too often, I think "neat, good to know."
It's inconvenient that apps can't do long-running operations in the background outside of a few areas, but that's a design feature of the platform. Users of iOS are choosing to give up the ability to run torrent clients or whatever in exchange for knowing that an app isn't going to destroy their battery life in the background.
> If you do that, iOS nags your user to make it stop. If you access radios, iOS nags your user to disable it.
These are features, because we can't trust developers to be smart about how they implement these things. In fact, we can't even trust them not to be malicious about it. User nags keep the developer honest on a device where battery life and all-day availability is arguably of utmost importance.
> you have to spend $5k minimum just to put hello world on the screen.
I really don't understand that at all. Web pages are mostly static; you would think the iPhone would cache websites reasonably well.
I remember an app on Android (I don't recall the name specifically) that would let me download any website for offline browsing. I would use it when I knew I might have no internet, like on a cruise.
Heck, there used to be an iOS client for HN that went defunct after some time, but it would let you cache comments and articles for offline reading.
It's the JS that does it, because so many webpages are terribly optimized, integrating aggressive ad waterfalls, or have persistent SPA frameworks doing continual scope checks.
That being said, there's no reason the Safari context shouldn't be able to suspend the JS and simply resume when the tab is brought back to the foreground. It's already sandboxed; just stop scheduling JS execution for that sandbox.
Web pages that make sense are mostly static. But these days articles need to load each paragraph dynamically, so in order to save 3 KB in case you don't finish the article, you need to download 5 MB of JS to do that, plus a bunch of extra handshakes.
It’s not just mobile Safari; Safari on desktop does the same thing even with lots of memory available. Whatever they’re doing to limit a tab’s resources needs to go. It’s so frustrating.
I am on my $110 Android device from 2022 (4GB RAM), and I have never faced the browsing-related issues that you mentioned.
My phone came with a stock Android 11 ROM with no bloat, so that might've helped too, I guess.
You're just adding a step that doesn't fix the primary issue (you can already manually save any page you want without adding it to your reading list). Someone should be able to go to their translate app, then their photo gallery, and back to Safari without it needing to refresh the context.
That doesn’t save the current dynamic state of the page. It’s at most useful for static content, but even on a Wikipedia page you’ll lose your current state of expanded/collapsed sections and hence your reading position.
Wasn't the 2DS just a 3DS minus the lenticular screen, and especially minus the front-facing camera that did face tracking to improve the quality of the 3D?
My understanding was that market research showed a lot of users were turning off the 3D stuff anyway, so it seemed reasonable to offer a model at lower cost without the associated hardware.
> My understanding was that market research showed a lot of users were turning off the 3D stuff anyway
It was also because young children weren't supposed to use the 3D screen due to fears of it affecting vision development. You could always lock it out via parental controls on the original, but still that was cited as a reason for adding the 2DS to the lineup.
> Fils-Aime said. “And so with the Nintendo 3DS, we were clear to parents that, ‘hey, we recommend that your children be seven and older to utilize this device.’ So clearly that creates an opportunity for five-year-olds, six-year-olds, that first-time handheld gaming consumer."
An iOS or Safari issue, then. I also have 12GB of RAM on my S25+, with 25 open tabs; I quickly did a test and there were none that were unloaded and had to be reloaded.
It happened a lot on my previous phone with only 4GB of RAM, though.
Mine (Android Firefox) does it when I have a YouTube video paused and do something else for a bit. Whenever I stop watching a video, I have to screenshot it so I know the timestamp to try to get back to later :-/
App battery usage is unrestricted, so it's not that.
With DRAM, you have to refresh every cell within a periodic interval. Usually this is handled in hardware. It would be a crazy optimization if unused pages weren't refreshed; there would have to be a decent amount of circuitry to decide that.
I'm not suggesting it exists, but I could plausibly see something where the range to refresh could be changed at runtime. If you could adjust refresh on your 8 GB phone in 1 GB intervals (refresh up to 1/2/4/8 GB, or refresh yes/no per 1 GB region), the OS could make sure to put its own memory at low addresses, compact memory into lower addresses from time to time, and disable refresh on the higher ones. Or, since I think there are APIs for allocating background vs. foreground memory: allocate background memory at low addresses and foreground memory at high addresses, and when the OS wants to sleep, kill the process logically and turn off refresh on that RAM. When it wants to use it again later, it will have to zero the RAM, because who knows what it'll contain.
I don't work at that kind of level, so I dunno if the juice would be worth the squeeze (sleep with DRAM refresh is already very low power on phone scales), but it seems doable.
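The bookkeeping that idea needs is simple enough to sketch. This is purely hypothetical: no shipping DRAM controller exposes per-region refresh control to the OS like this, and all the names are invented.

```python
GB = 1 << 30
REGION = 1 * GB  # hypothetical 1 GB refresh-control granularity

def regions_needing_refresh(live_pages, page_size, total_ram):
    """Map a set of live physical page numbers to per-region flags:
    True  = region holds live data and must keep refreshing,
    False = refresh could (hypothetically) be disabled for that region."""
    flags = [False] * (total_ram // REGION)
    for page in live_pages:
        flags[(page * page_size) // REGION] = True
    return flags

# 8 GB phone, 4 KiB pages, with the OS having compacted all live data
# into the low 2 GB before sleeping:
flags = regions_needing_refresh({0, 100, 500_000}, 4096, 8 * GB)
print(flags)  # only the first two regions still need refresh
```

Compaction is what would make this pay off: live pages scattered across the address space would pin every region's flag to True.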
This is an argument for having less memory on a hardware level. But once the DRAM is there, it uses power, whether or not it stores useful data or useless data.
There's a reason why we say unused RAM is wasted RAM.
Powering down unused physical RAM is absolutely a thing on some systems. For one thing, it's required if you ever want to support physical memory hotplug. The real issue however is that the gain from not doing DRAM refresh is clearly negligible: it's no more than the difference between putting a computer to sleep (ACPI S3), or putting a phone to sleep in airplane mode - and powering it off.
Am I too much of an idealist to hope that AI leads to less buggy software? On the one hand, it should reduce the time of development; on the other hand, I'm worried devs will just let the agents run free w/o proper design specs.
The message with AI from execs is that you have to go fast (rush!). Quality of work drops when you rush. You forget things, don’t dwell on decisions and consequences, just go-fast-and-break-things.
> The message with AI from execs is that you have to go fast (rush!). Quality of work drops when you rush.
Sure, but otherwise, the competition will be first to market, and the exec may lose their bonus. So, the exec keeps their bonus, and when the tech debt collapses, the exec will either have departed long ago or will be let go with a golden parachute, and in the worst case an entire product line goes down the drain, if not the entire company.
The financialization and stonkmarketization of everything is killing our society.
The average LLM writes cleaner, better-factored code than the average engineer at my company. However, I worry about the volume of code leading to system-scale issues. Prior to LLMs, the social contract was that a human needs to understand changes and the system as a whole.
With that contract being eroded, I think the sloppiness of testing, validation, and even architecture in many organizations is going to be exposed.
The social contract where I work is that you’re still expected to understand and be accountable for any code you ship. If you use an LLM to generate the code, it’s yours. If someone is uncomfortable with that, then they are leaning too hard on the LLM and working outside of their skill level.
It might actually turn out like that. A lot of bloat came from efforts to minimize developer time. Instead of truly native apps a lot of stuff these days is some react shaped tower of abstractions with little regard for hardware constraints.
That trend might reverse if porting to a best practice native App becomes trivial.
Considering that AI still can't even reliably get basic programming tasks correct, it doesn't seem very likely that turning it loose will improve software quality.
Considering how many companies that have adopted AI led to disastrous bugs and larger security holes?
I wouldn't call it an idealist position as much as a fools one. Companies don't give a shit about software security or sustainable software as long as they can ship faster and pump stocks higher.
> and it makes me want to throw my phone across the train (Where the internet often cuts out!).
Spotted the German lol
The general problem is that many people don't bother testing their apps outside of their office wifi with low latency, low jitter, low packet loss and high bandwidth. Something like persisting the state when the OOM/battery-save killer comes knocking onto some cloud endpoint? Perfectly fine on wifi... but on a mobile connection that might just be EDGE, cut entirely because the user is just getting a phone call and the carrier does not do VoLTE, or be of an absurd latency? Whoops. Process killer knocks a -9 and that's it, state be gone.
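The robust alternative to the cloud-endpoint save described above is to persist state to local storage first, atomically, and treat cloud sync as best-effort. A minimal sketch of that pattern, with illustrative names rather than any platform's real API:

```python
import json
import os
import tempfile

def save_state_locally(state: dict, path: str) -> None:
    """Atomic local write: a kill -9 mid-save must never leave a
    half-written state file behind."""
    directory = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(state, f)
        os.replace(tmp, path)  # atomic rename on POSIX
    except BaseException:
        os.unlink(tmp)
        raise

def restore_state(path: str) -> dict:
    """Best-effort restore; an absent or corrupt file means empty state."""
    try:
        with open(path) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return {}
```

Cloud sync can then retry whenever connectivity returns; the local copy is the source of truth the moment the process dies, regardless of whether the user is on office wifi or EDGE.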
Side note: Anyone know of a way to prevent the iPhone hotspot from disassociating from a MacBook when the phone loses network connectivity? It's darn annoying; I counted having to reconnect twenty times on a train ride of less than an hour.
That is an Apple problem, and keep in mind that the iPhone doesn't really multitask; the fact that you are having problems with 12GB is not surprising to me.
I have to use a MacBook M4 at work with 24GB; I have an AMD Lenovo Ryzen 7 with 32GB running Linux Mint Cinnamon.
It is infuriating how slow this MacBook is; even shutting it down is slow asf.
macOS is no different from Windows. I cannot wait for COB to get back to my Linux laptop.
24GB is not enough, it will keep swapping, compressing etc. I had such device at work. 32GB is a night and day difference.
That said my workflows are such that I need at least 128GB now...
Removing docking functionality could possibly reduce RAM usage by never enabling 4K screen output. This would be similar to the Switch Lite.
Although, for a $450 device that doesn’t need to make much of a profit on its own, I also don’t think they’re heavy on memory in the first place (12GB). You can buy top quality Chinese Android handhelds with more RAM and better Qualcomm processors than the Switch 2 for about the same price, and those companies are making $0 in software royalties (e.g., AYN Thor Max is $450 with a 16GB/1TB configuration).
> Removing docking functionality could possibly reduce RAM usage by never enabling 4K screen output. This would be similar to the Switch Lite.
Every version of the Switch 1 had 4GB of RAM; they didn't cut that on the Lite. Going back and patching every game to ensure it ran on less RAM than it was originally designed for would have been a nightmare.
> (e.g., AYN Thor Max is $450 with a 16GB/1TB configuration).
AYN just announced that the Thor will get a price increase soon for obvious reasons.
Oh yeah, I accidentally implied the Switch Lite cut down RAM when it didn't.
Of course the Thor Max will have a price increase, but also, obviously 16GB/1TB is a massively bigger bill of materials than the Switch 2’s 12GB/256GB configuration.
And I forgot to mention that Nintendo has far more pricing leverage in terms of their volume.
I honestly think the memory shortage kills the possibility of a Switch 2 Lite.
Nintendo can't realistically take memory budget away from developers after the fact. The 2DS cut the 3D feature from the 3DS, but all games were required to be playable in 2D from day 1, so no existing games broke on the cost-reduced 2DS.
The latest phone reviews have been eyebrow raising.
The just-announced Pixel is the same phone as last year. I know it sounds like the usual complaint, but look at the actual specs: it literally is the same phone, with differences so small that they might have passed as regional variance.
As for the Samsung, the screen can darken when viewed from the side, for privacy. That's pretty much it. The price increased, though.
Coupled with the current iOS situation it seems like things are… rotting. Everything in decline.
> The latest phone reviews have been eyebrow raising.
It's eyebrow raising for me in other ways.
I have a Pixel 9a and it's been quite good with really solid battery life. It's barely 6 months old and I got it new straight from Google.
A few days ago I noticed the battery started to drain much faster than usual. I also noticed at the same time Google is pushing the 10a.
Nothing changed on my end. I barely use the phone in my day to day. In 10 hours today I sent 3 text messages with Whatsapp and lost 60% of my battery in that time frame. Up until a few days ago, 60% would last me 3 days.
I find it weirdly coincidental that the battery life went from amazing to worse than a 5 year old device I had prior to this just as they are releasing new phones. I've powered it down and given it a full discharge / charge too. It's still draining at an alarming rate.
Did it happen to them in the last few days? Did it fix itself?
I wish there were better options for phones. It's absolutely crazy to me that a phone can be perfectly working one day and then it starts getting issues like this out of the blue.
It makes it completely undependable. All I want is a phone I can trust traveling with where I'm not going to wake up the next day and then the phone starts draining 3-4x faster than it normally does.
There hasn't even been a system update for almost 3 weeks, so it wasn't an update that busted things.
I had that conversation yesterday: unexplained battery drain, as of then unresolved.
They mentioned people complaining on Reddit about battery drain since the last update, but I haven’t personally seen the threads so take it with a grain of salt.
The upcoming Apple display, mounted to a wall or a robot arm, is rumored to have an audio interface and a new OS without 3rd-party apps, only "AI".
Jony Ive at OpenAI is rumored to have a smart speaker, pendant, pen, and bone-conducting headset in the launch pipeline. Audio interfaces, no screens.
Meta is selling millions of smart glasses, with Apple and others following.
If the memory market was not distorted, home AI + agents + open models could have a bigger role via AMD Strix Halo. Instead, they will be reserved for those who can afford to spend five figures on 512GB or 1TB unified memory on Mac Studio Ultra devices.
> users [could] interact with Siri and future Apple devices without speaking out loud.. AI systems capable of interpreting facial expressions and subtle muscle movements to understand so-called “silent speech.”
Not sure. Some AI audio pendants are always on. The Apple device is rumored to adapt its interface to the user based on facial recognition. They could choose to start monitoring audio when it thinks a known human wants to interact with the device, https://news.ycombinator.com/item?id=47145201
Apple is developing a tabletop robot as the centerpiece of its artificial intelligence strategy, with plans to launch the device in 2027 .. The robot resembles an iPad mounted on a movable limb that can swivel to follow users around a room .. The company is also exploring other robotics concepts, including a mobile bot with wheels similar to Amazon's Astro, and has discussed humanoid models ..
I'm actually super fine with the hardware stagnating! Work on the yields, cut the prices, simplify and make it more robust, while keeping the spec the same. It gives developers something to focus on, like a console, so the software gets better over the life of the device, not worse.
Perhaps this could give room for physical design changes instead. I'm sick of phones that are just a slab of glass. I remember fun, weird, fashionable designs! Buttons and keyboards, phones felt like an individual choice, not just this boring black mirror. I'd take ten years of stagnation on hardware development in phones in exchange for ten years of exciting form factors with improving software. Let's face it: the spec is high enough for anything we need to be doing, by now. The software is the real problem, and there's room here for massive improvement.
> Coupled with the current iOS situation it seems like things are… rotting. Everything in decline.
Just "commoditizing". Last years microwave ovens were basically the same as 2024's also, and no one cares. You still need them and people still buy them and use them as much as ever, but at a replacement rate and not because of fashion or innovation.
That is a good thing. It means the economy is doing what it's supposed to do and bringing maximal value to consumers so we can spend our resources more efficiently (on other fashion-driven junk in different market segments), making us richer.
It's only bad news if your business is selling "phones" and not innovative products more generally. Which, yeah, is pretty much AAPL's trap. But that's on them, not us. We're winning.
We are and have been for many years now. Check out the "free" phone tier at your mobile vendor of choice. Those are great devices!
They may not match your particular tastes, but people with inflexible taste are always the last-resort market for manufacturers of commoditized products. People still buy from Hermès even though Shein completely dwarfs them in revenue, etc... That's the way it will always be with Apple too.
OSes have been in decline for a long time. This memory price is just a blip, though. These supply and demand shocks happen periodically and always return to normal.
I think even going back a few generations, phones are improving at a much slower pace. You can only jam so many cameras onto a phone frame before users lose interest. A few years back there was a mad dash to add AR features to flagship phones so they could wow us with apps that never materialized. My last few upgrades have been almost imperceptible. Buyers just don't have a good reason to buy new phones every two years.
Overinvestment in AI data centers is having a huge negative impact all over the economy. Other sectors are missing out on investment, limiting their growth and stalling the economy.
Companies have reduced staff prematurely on the promise of productivity improvements that have not occurred and lost customers to terrible customer service and declining product quality.
Many hardware launches are going to be delayed or not meet expectations which really is the tip of the iceberg.
The US/SK memory cartel understandably sold out for a massive short-term windfall, but their long-term decisions to limit supply have created a huge opportunity for China. I wouldn't be surprised if this goes down in the history books as the start of the exit of the US/SK from the industry and the start of Chinese dominance.
The smart phone industry is likely to respond with an increasingly hostile anti-consumer approach as they try and lock customers into the cabins of the sinking ship. I expect cheap and cheerful Chinese budget phones aren't going anywhere.
I am happy for RAM, CPU and storage to stall. I want a more robust and open phone which can take a fall and be updated long after the vendor loses interest. I expect to uninstall most of my apps rather than install new ones as I increasingly disconnect from an ever more distracting and worthless medium. I have cancelled nearly every subscription service in the last 12 months. And I have been deleting a lot of free accounts and apps. It's like doing a big cleanup. Surprisingly rewarding.
HN has felt like more than 50% AI-industry blogspam of little interest to me as a reader for some time. I am setting a budget of ten, no, make it five, more posts here. Then I am out for good. Account deletion and no looking back.
> Companies have reduced staff prematurely on the promise of productivity improvements that have not occurred and lost customers to terrible customer service and declining product quality.
Companies have reduced staff because of the impact of tariffs, because of low consumer confidence and spending, or as a ploy to pump share prices. Then they claim it’s AI, because it sounds a lot better to say that you’re reducing headcount because of AI than it does to admit that you’re cutting costs because of falling revenue.
I agree with you on the AI blogspam. This is a lot like the dot-com era, where a profusion of capital is causing people to develop complete horseshit products nobody needs. When the shine comes off, a lot of companies will fade, but many will stick around, and become the FAANGs of the 2030s.
In some ways it's pretty interesting to watch the entire world mobilize production for AI; some folks like to call this "hyperstition" as the future AGI reaches backwards in time to compel its own creation. Wild, but when trillions of dollars - i.e. millions of people's entire life output of work - are being put into something, it's truly an effort on a scale that no societal project has ever been before. There's no leader, nobody is in control, nobody has the grand vision other than "build the thing and get rich in the process". Amazing times to live in. The best use of our time and resources and coordination? Probably not... as we look around our broken cities, stepping over our poor and hopeless...
The U.S. gov't is now committing a sizeable chunk of GDP to investments and subsidies to AI companies and data centers and has reduced overall investment in wind and solar.
Brutally cold capitalist take. Go walk around your city, friend; remember the tragedy of the commons. There is a lot that needs to be done that isn't being done, because we're soaking up people's life's work on this effort that we don't even know the end goal of. It could result in some awful outcomes for everyone if not guided correctly, and it seems like it's not being guided at all - or worse, it's being guided by the Department of War.
The DRAM shortage and lack of fab capacity have also caused the Playstation 6 to slip to 2029 or so.[1] Game consoles are vulnerable. They need a lot of RAM and have to sell at a moderate price.
The IDC article says that DRAM prices are not expected to come down again. "While memory prices are projected to stabilize by mid-2027, they are unlikely to return to previous level — making the sub-$100 segment (171 million devices) permanently uneconomical." Before, they always came back down in the next RAM glut, when everybody built too much capacity. Why is that not going to happen next time?
You’re asking why a market that has had 3 price fixing lawsuits in less than 2 decades (criminal convictions in 1998, civil in 2006 and 2018) isn’t going to follow market dynamics?
One reason we end up with excess capacity is process improvements; adding new fabs to get more density or performance doesn't make old fabs go away, and so we go through cycles of excess capacity. Demand has been relatively constant.
Here we're facing different forces: unprecedented demand for DRAM that may be durable. But it also looks like the pace of supply growth may slow as process improvements get smaller and the industry stops moving so much in lockstep.
It still matters what happens to the demand function, though. If enough AI startups blow up that there's a lot of secondhand SDRAM in the market, and demand for new SDRAM is impacted, too, that will push things down.
Sort of like what happened with the glut of telecom equipment after the dot-com bust.
Because this shortage isn't natural, it's the result of OpenAI flexing monopsony power to deprive everyone else for its strategic gain. Unlike an organic shortage, there is no compelling reason for otherwise excess capacity to be built, since this artificial shortage can end as arbitrarily as it started.
The datacenters are still going to be built, and their usage won't suddenly fall just because the companies behind some of the products on them suddenly lose value. The demand is not tied to their profits, so I find it unlikely for the shortage to just end.
There were far too many railways, amusement parks, housing developments and other bubble ventures that either were never completed after wasting a lot of money or went bust soon after opening.
No reason the same can't happen now - especially for something as expensive and fairly easily resellable as a datacenter and the hardware inside. Just rip it all out and sell it for parts where they're actually needed.
The data centers have already been financed, they’re not going to stop halfway through because they’ve run out of money. Whether or not they’ll make money on completion is a different story, but that’s 2-3 years away at least. Then you might see RAM prices drop, but not before.
These data center projects are losing hundreds of billions of dollars which they don't have, and some evidence is starting to come out that they're just money-laundering schemes to get money from the government to contractors. I wouldn't bet on them all being built.
> The IDC article says that DRAM prices are not expected to come down again
Sure thing. I'd take a look at IDC & similar firms' forecasting history before worrying too much about what they say.
There is an AI boom right now. There will be a consolidation cycle at some point. When that happens half the players, if not more, will disappear. The huge hardware budgets will go with them.
We also can't be certain that the DRAM makers aren't capitalizing on this opportunity because they can. Remember: all of them are convicted monopolists. As in actual prison time convicted. And fined. And lost civil lawsuits. Multiple times.
I just can't see AI paying enough of a premium on HBM to justify the DRAM spikes. Frankly, I can't see the volume either. Wafer starts on DRAM are dramatically bigger than you are probably imagining. DRAM is in practically everything these days. AI servers are but a drop in the bucket. 10% of the market? Yeah right; if it's 4% I'd be shocked. And you are telling me a shift of 4% of wafers to HBM is driving these prices and shortages?
I humbly suggest if you look at the numbers something smells funny.
Disclaimer: none of us has access to the actual data, a lot of it is inferred by industry players. Some are well connected and usually accurate but that is not evidence. Therefore it is possible this is a genuine market action and nothing nefarious is going on.
HBM is not normal memory. It uses a lot more area per bit and has lower yield too. So a Gb of baseline DRAM and a Gb of HBM are very different measurements, the latter equates to so much more in terms of volume.
13%!!! This should be a code red level event for … the world? I … don’t understand how world leaders are just standing by? Smartphone growth/adoption has been the bedrock of a LOT of economic growth. I would have expected massive Government intervention to avoid this.
Where are the China hawks? The argument for protecting Taiwan was that without their chips the smartphone market would contract, right? That's what's happening now?!
I recently upgraded from the Pixel 7 to the 10. Nothing but regret - the phone isn't worse, but it's not better either, and I had to reinstall everything. Why did I do this?
The cool thing about Pixels is that not only will you have to pay extra for RAM because of AI, but some of the RAM you paid for will also be permanently reserved for local AI features, regardless of whether you use them.
On a Pixel phone you have only Google spyware. On another brand's phone you have all the same Google spyware, plus the spyware from that brand and a permanently locked bootloader.
You can remove third-party spyware/bloat in 15 minutes with Shizuku/Canta and a USB cable, and you won't notice anything changed on the phone. Unfortunately, the Google spyware is so deeply integrated that you can't really do that unless you accept a ton of things not working: not just Google apps but also lots of third-party apps that require Play services.
Yes, if you want full degoogling you need a custom ROM like Graphene on Pixels or Lineage. The main issue these days is that bootloaders are locked. Phone manufacturers mostly refuse to give you control over your own hardware.
Meanwhile Apple iPhone sales were up 23% YoY end of last year. It'll likely be a good year for Apple, with a little more room in margin to make some plays, and a lottt of cash.
> By contrast, Apple and Samsung are better positioned to navigate this crisis. As smaller and low-end-positioned Android vendors struggle with rising costs, Apple and Samsung could not only weather the storm but potentially expand market share as the competitive landscape tightens.
Dropped my iPhone a couple of days ago, so I had to go back to an old phone: a Pixel 3a. It opens Signal and HomeAssistant faster than my 2022 iPhone ever did, so why would I even buy a new phone and go back at this point? The best phones (price/value) have already been built and sold.
Also Python generators for the lulz. They help one to write extremely memory-efficient programs. Perhaps the memory shortage further helps cement Python in the language popularity charts, vis-à-vis languages that tend to load whole data in memory by default, like R.
If we are talking about R, a lot of people who converted from R continued to operate in the same manner, by loading entire datasets into memory with pandas and numpy.
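As an illustration of the generator point above (a minimal sketch with made-up function names, not code from any comment here): streaming an aggregate through a generator keeps memory flat, where the load-everything habit would materialize the whole dataset first.

```python
def read_measurements(lines):
    """Lazily parse numeric values from an iterable of lines.

    Being a generator, this holds only one line in memory at a time,
    no matter how large the input stream is.
    """
    for line in lines:
        line = line.strip()
        if line:
            yield float(line)

def mean(values):
    """Single-pass mean over any iterable, without building a list."""
    total, count = 0.0, 0
    for v in values:
        total += v
        count += 1
    return total / count if count else 0.0

# Simulate a huge file with a generator expression; nothing here
# ever materializes the full dataset in memory.
stream = (f"{i}\n" for i in range(1_000_000))
print(mean(read_measurements(stream)))  # 499999.5
```

The same pipeline with a list comprehension instead of generators would allocate a million floats up front for no benefit.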
> Be honest, who had "Sam Altman kills Apple Computer" on their 2025/6 Bingo card?
Not the person Sam Altman specifically, but AI in general. It was obvious even in 2024 that braindead beancounters were jumping on the hype train, so much so that coal power plants were kept alive to satiate the power hunger [1]. The last time that shit happened, it was the coin craze [2], but unlike cryptocurrencies there was and is an actual product being made...
> Well, actually, there is one other important reason for this article’s existence I'll tack onto the end – a hope that other people start digging into what’s going on at OpenAI. I mean seriously – do we even have a single reliable audit of their financials to back up them outrageously spending this much money…for this? Heck, I’ve even heard from numerous sources that OpenAI is “buying up the manufacturing equipment as well” – and without mountains of concrete proof, and/or more input from additional sources on what that really means…I don’t feel I can touch that hot potato without getting burned…but I hope someone else will…
And I'd say if it ends up being shown that there is even the slightest hint of impropriety going on, put him on trial. Up to and including capital punishment for the entire board and C-level: what OpenAI has already done, even if legal on paper, is IMHO the biggest market manipulation in history, and it's not just one competitor that is suffering but society as a whole.
I don't have an issue with big companies and their super rich investors engaging in petty bitch fights. By all means, hand me some popcorn and soda. But the RAM situation, with everyone not being super rich and flush with cash from AI crazed investors being screwed royally? That is far beyond acceptable.
We need to send a message: you can't mess around with the world economy at that level without feeling serious repercussions. The lives of the billions are not playthings for the select few.
And if it turns out to be outright market manipulation, engaging in deals he doesn't even have the money committed for by others, much less actually have it on his balance sheet? Then it's time for the pitchforks, not even Madoff was this ruthless.
> I wonder whether we’ll see a secondary effect in the resale market.
I'm paying more on eBay for ThinkCentre Tinys and ThinkPads, 12th-gen Intel and newer.
Refurbished spinny drives have been steadily climbing - up 50% since late last year. That's on top of the 20% mystery jump that happened in the last week of 2024.
We already are. Check eBay at the component level, which is showing it quite clearly. Look for secondary/reclaimed/refurbished components to backfill the gaps too.
Also be aware that this stuff whipsaws: if OpenAI actually takes possession of that memory and decides they can't use it and dumps it, we're going to see a crash. Likewise if they back out of the deals with the memory fabs (or fail and default). There's some scary volatility on the horizon.
Wait until we find out that all of tech (ever) has been subsidized by the true-so-far assumption of continued growth, allowing today’s costs to be paid for by tomorrow’s larger market.
It's that everything has become 20% more expensive in the past year, I'm being taxed to death, fighting with companies trying to money grab me, my electric bill is now $800, and I'm now too broke to buy a new phone every 2 years when most of my income gets eaten by the "system".
I'll wait until either SPY does another 50% run or BTC does another 100% run and then I'll buy a new phone. Google, you want me to buy your new phone? Do something to make SPY or BTC go up and then we'll talk. Until then my current phone works, and the new features aren't a must-have.
Yeah, so I'm not buying unnecessary crap this year.
If the "system" wants to drive more consumption, it's on the "system" to put more buying power into my hands. Double my salary, reduce my taxes, make BTC do a big run up, something. Otherwise I'm happy staying put.
I doubt it. Microsoft would much rather sell you a thin client and a Windows 365 subscription, and Nvidia wants you to use GeForce Now instead of buying a GPU.
The shortage is manufactured, I have my doubts it will "end" in a conventional sense. I'm more skeptical and feel like this is yet another consolidation of wealth and a means of taking away compute power from people, which prevents startup competition. This way the hyperscalers are the only ones that can offer any meaningful compute.
slop. also stupid. bad llm. phone values drop 60%+ per year. if you re-parse this without spending all your time on em-dashes you will note that it slows diffusion of next-gen chips, it does not cut off those users.
We have many, many examples of GUI toolkits that are extremely fast and lightweight. Isn't it time to throw the browser away, stop abusing HTML to make applications, and design something fit for purpose?
It's not "the web" or HTML, CSS, or JavaScript. That's all instant in vanilla form. Any media in today's quality will of course take time to download but, once cached, is also instant. None of the UX "requires" the crap that makes it slow, certainly not thousands of lines to make a table sortable and filterable. I could do that in IE6 without breaking a sweat. It's way easier, and faster, now. It's just people being lazy in how they do it, Apparetnly now just accepting whatever claude gave them as "best in show".
https://en.wikipedia.org/wiki/Wirth%27s_law
Anyways, I wanted to say I also have a Pixel 8, but with the stock OS, and my battery typically lasts a full day with average usage. My iPhone 8 previously, even with a replacement battery, was lucky if it lasted more than 5 hours. I had to charge that thing multiple times a day.
Considering how many people are so averse to programming that they use LLMs to generate code for them? Not very likely IMO. I would like to see it happen, but people seem allergic to actually trying to be good at the craft these days.
Imagine you are Apple and can just set an LLM loose on the codebase for a weekend with the task to reduce RAM usage of every component by 50%...
Also, what happens to the stability and security of my phone after they let an LLM loose on the entire code base for a weekend?
There are 1.5 billion iPhones out there. It’s not a place to play fast and loose with bleeding edge tech known for hallucinations and poor architecture.
If you direct it to do a specific task to find memory and cpu optimization points, based on perf metrics, then it’s a completely different world.
I asked Claude to find all the valid words on a Boggle board given a dictionary and it wrote a simple implementation that basically tried to search for every single word on the board. Telling it to prune the dictionary first by building a bit mask of the letters in each word and on the board and then checking if the word is even possible to have on the board gave something like a 600x speedup with just a simple prompt of what to do.
That does assume that one has an idea of how to optimize though and what are the bottlenecks.
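A minimal sketch of the bitmask pruning described above (the names and exact scheme are my guesses, not the commenter's actual code): each word and the board get a 26-bit letter mask, and any word using a letter absent from the board is discarded before the expensive board-walk search ever runs.

```python
def letter_mask(word):
    """26-bit mask with bit i set if chr(ord('a') + i) occurs in word."""
    m = 0
    for ch in word:
        m |= 1 << (ord(ch) - ord('a'))
    return m

def prune_dictionary(words, board_letters):
    """Keep only words whose letters all appear somewhere on the board.

    This is a coarse pre-filter: it ignores letter counts and adjacency,
    so survivors still need the full board search. But a word using any
    letter the board lacks is provably impossible and can be dropped.
    """
    board_mask = letter_mask(board_letters)
    return [w for w in words if letter_mask(w) & ~board_mask == 0]

words = ["cat", "dog", "tact", "quiz"]
board = "catodg"  # letters on a hypothetical board
print(prune_dictionary(words, board))  # ['cat', 'dog', 'tact']
```

On a real dictionary this one-line mask test typically eliminates the vast majority of words before the recursive search, which is where speedups like the quoted 600x come from.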
iOS is 19 years old, built on top of macOS, which is 24 years old, built on top of NeXTSTEP, which is 36 years old, built on top of BSD, which is 47 years old. We’re very far from greenfield.
They are trained on everything, and as a result write code like the Internet average developer.
A handwritten C implementation would most likely be better, but there is so much to gain from just slaughtering the abstraction bloat that it does not really matter.
Big-name apps like Facebook, YouTube, Apple Music, and Apple Podcasts seem totally uninterested in preserving my place.
YouTube being the worst where I often stack a bunch of videos in queue, pause to do something else for a while and when I return to the app the queue has been purged.
Why??
Case in point: YouTube background play doesn't pause when Siri makes an announcement, so if you're listening to something you get two voices over each other.
I gave it the benefit of the doubt and figure it must be some kind of iOS thing, until I was listening to Audible one day and it paused automatically. So it’s just a google thing, not a third-party apps thing.
I have the same issue with the YouTube queue: this is something that could easily be persisted, but they just choose not to.
[0] https://developer.apple.com/documentation/SwiftUI/restoring-...
If you switched off the app while looking at a certain post or watching a certain video, that's a negative engagement indicator, so the app wants to throw you back into the algorithmic feed to show you something new instead.
It's all gone to $hit, efficiency is gone it's just slop on top of more slop.
Hardware is pretty useless if the software that drives it is useless. I don't know, maybe it works better in China; all I know is that I went back to good old Samsung.
The market demands must be different there. I've disabled "battery optimisation" for all the apps I need to stay open (and some apps even prompt me to disable it!), and I don't have any issues in daily use.
[1] https://source.android.com/docs/core/perf/cached-apps-freeze...
That's social engineering to get themselves more background network activity. I wouldn't trust such an app.
Is it too much to ask for me to manage my own background processes on my phone? I don't want the OS arbitrarily deciding what to pause & kill. If it actually does OOM, give me a dialog like macOS and ask me what to kill. Then again, if a phone is going OOM with 12GB of RAM there's a serious optimization problem going on with mobile apps.
Android does all sorts of wacky stuff with background tasks too... Although I don't feel like my 6 GB Android is low memory, so maybe there's something there, but I also don't run a lot of apps, and I regularly close Firefox tabs. Android apps do mostly seem well prepared for background shenanigans, cause they happen all the time. There's the AOSP/Google Play background app controls, but also most of the OEMs do some stuff, and sometimes it's very hard to get stuff you want to run in the background to stay running.
I dunno about watches, but Airpods work fine with Android, as long as you disconnect them from FindMy cause there's no way to make them not think they're lost (he says authoritatively, hoping to be corrected).
Apple seemingly wants all apps to be static jpegs that never need to connect to any data local or remote, and never do any processing. If you want to do something in the background so that your user can multitask, too damn bad.
You can run in the background, for a non-deterministic amount of time. If you do that, iOS nags your user to make it stop. If you access radios, iOS nags your user to disable it.
It's honestly insane. I don't know why or how anyone develops for this platform.
Not to mention the fact that you have to spend $5k minimum just to put hello world on the screen. I can't believe that Apple gets away with forcing you to buy a goddamn Mac to compile a program.
People develop for iOS because iOS users spend more money. End of story.
It's inconvenient that apps can't do long-running operations in the background outside of a few areas, but that's a design feature of the platform. Users of iOS are choosing to give up the ability to run torrent clients or whatever in exchange for knowing that an app isn't going to destroy their battery life in the background.
These are features, because we can't trust developers to be smart about how they implement these things. In fact, we can't even trust them not to be malicious about it. User nags keep the developer honest on a device where battery life and all-day availability are arguably of utmost importance.
> you have to spend $5k minimum just to put hello world on the screen.
Now that's just nonsense.
I remember an app on Android (I don't recall the name specifically) that would let me download any website for offline browsing; I would use it when I knew I might have no internet, like on a cruise.
Heck, there used to be an iOS client for HN that went defunct after some time, but it would let you cache comments and articles for offline reading.
That being said, there's no reason the Safari context shouldn't be able to suspend the JS and simply resume when the context is brought back to the foreground. It's already sandboxed; just stop scheduling JS execution for that sandbox.
Safari suspends backgrounded tabs. I think that's what we're observing here rather than strictly memory pressure.
“Save webpages to read later in Safari on iPhone” https://support.apple.com/guide/iphone/save-pages-to-a-readi...
My understanding was that market research showed a lot of users were turning off the 3D stuff anyway, so it seemed reasonable to offer a model at lower cost without the associated hardware.
It was also because young children weren't supposed to use the 3D screen due to fears of it affecting vision development. You could always lock it out via parental controls on the original, but still that was cited as a reason for adding the 2DS to the lineup.
https://www.ign.com/articles/2013/08/28/nintendo-announces-2...
> Fils-Aime said. “And so with the Nintendo 3DS, we were clear to parents that, ‘hey, we recommend that your children be seven and older to utilize this device.’ So clearly that creates an opportunity for five-year-olds, six-year-olds, that first-time handheld gaming consumer."
It happened a lot on my previous phone with only 4GB of RAM, though.
App battery usage is unrestricted, so it's not that.
There is a strong argument that modern mobile OSes go too far with this.
I don't work at that kind of level, so I dunno if the juice would be worth the squeeze (sleep with DRAM refresh is already very low power on phone scales), but it seems doable.
There's a reason why we say unused RAM is wasted RAM.
Sure, but otherwise, the competition will be first to market, and the exec may lose their bonus. So, the exec keeps their bonus, and when the tech debt collapses, the exec will either have departed long ago or will be let go with a golden parachute, and in the worst case an entire product line goes down the drain, if not the entire company.
The financialization and stonkmarketization of everything is killing our society.
With that contract being eroded, I think the sloppiness of testing, validation, and even architecture in many organizations is going to be exposed.
That trend might reverse if porting to a best practice native App becomes trivial.
I wouldn't call it an idealist position as much as a fools one. Companies don't give a shit about software security or sustainable software as long as they can ship faster and pump stocks higher.
Spotted the German lol
The general problem is that many people don't bother testing their apps outside of their office wifi with low latency, low jitter, low packet loss and high bandwidth. Something like persisting the state when the OOM/battery-save killer comes knocking onto some cloud endpoint? Perfectly fine on wifi... but on a mobile connection that might just be EDGE, cut entirely because the user is just getting a phone call and the carrier does not do VoLTE, or be of an absurd latency? Whoops. Process killer knocks a -9 and that's it, state be gone.
Side note: anyone know of a way to prevent the iPhone hotspot from disassociating from a MacBook when the phone loses network connectivity? It's darn annoying; I counted having to reconnect twenty times on a train ride of less than an hour.
I have to use a MacBook M4 at work with 24GB; I have an AMD Lenovo Ryzen 7 with 32GB running Linux Mint Cinnamon. It is infuriating how slow this MacBook is; even shutting it down is slow asf.
macOS is no different than Windows; I cannot wait for COB to get back to my Linux laptop.
Companies install so much invasive shit in the name of security theater and employee control that there is a lot of waste going on.
Although, for a $450 device that doesn’t need to make much of a profit on its own, I also don’t think they’re heavy on memory in the first place (12GB). You can buy top quality Chinese Android handhelds with more RAM and better Qualcomm processors than the Switch 2 for about the same price, and those companies are making $0 in software royalties (e.g., AYN Thor Max is $450 with a 16GB/1TB configuration).
Every version of the Switch 1 had 4GB of RAM; they didn't cut that on the Lite. Going back and patching every game to ensure it ran on less RAM than it was originally designed for would have been a nightmare.
> (e.g., AYN Thor Max is $450 with a 16GB/1TB configuration).
AYN just announced that the Thor will get a price increase soon for obvious reasons.
https://www.reddit.com/r/SBCGaming/comments/1rf5gxq/to_thor_...
Of course the Thor Max will have a price increase, but also, obviously 16GB/1TB is a massively bigger bill of materials than the Switch 2’s 12GB/256GB configuration.
And I forgot to mention that Nintendo has far more pricing leverage in terms of their volume.
Nintendo can't realistically take memory budget away from developers after the fact. The 2DS cut the 3D feature from the 3DS, but all games were required to be playable in 2D from day 1, so no existing games broke on the cost-reduced 2DS.
The just-announced Pixel is the same phone as last year's. I know it sounds like the usual complaint, but look at the actual specs: it literally is the same phone, with differences so small that they might have passed as regional variance.
As for the Samsung, the screen can darken when viewed from the side, for privacy. That's pretty much it. The price increased, though.
Coupled with the current iOS situation it seems like things are… rotting. Everything in decline.
It's eyebrow raising for me in other ways.
I have a Pixel 9a and it's been quite good with really solid battery life. It's barely 6 months old and I got it new straight from Google.
A few days ago I noticed the battery started to drain much faster than usual. I also noticed at the same time Google is pushing the 10a.
Nothing changed on my end. I barely use the phone in my day to day. In 10 hours today I sent 3 text messages with Whatsapp and lost 60% of my battery in that time frame. Up until a few days ago, 60% would last me 3 days.
I find it weirdly coincidental that the battery life went from amazing to worse than a 5 year old device I had prior to this just as they are releasing new phones. I've powered it down and given it a full discharge / charge too. It's still draining at an alarming rate.
I wish there were better options for phones. It's absolutely crazy to me that a phone can be perfectly working one day and then it starts getting issues like this out of the blue.
It makes it completely undependable. All I want is a phone I can trust when traveling, where I'm not going to wake up the next day to find it draining 3-4x faster than it normally does.
There hasn't even been a system update for almost 3 weeks, so it wasn't an update that busted things.
They mentioned people complaining on Reddit about battery drain since the last update, but I haven’t personally seen the threads so take it with a grain of salt.
Otherwise I'd still be rocking my S9.
I'm also using a Pixel 2 for Android development, and Google Play Billing isn't supported on it.
The hardware is fine but they make it obsolete with software.
I'm guessing they'll soon move to subscription pricing for phones.
It might last until 4G is turned off.
I can't really imagine needing greater bandwidth than I have now, but I still use the phone like it's 2010.
Jony Ive at OpenAI is rumored to have a smart speaker, a pendant, a pen, and a bone-conducting headset in the launch pipeline. Audio interfaces, no screens.
Meta is selling millions of smart glasses, with Apple and others following.
If the memory market was not distorted, home AI + agents + open models could have a bigger role via AMD Strix Halo. Instead, they will be reserved for those who can afford to spend five figures on 512GB or 1TB unified memory on Mac Studio Ultra devices.
> users [could] interact with Siri and future Apple devices without speaking out loud.. AI systems capable of interpreting facial expressions and subtle muscle movements to understand so-called “silent speech.”
https://shokz.com/pages/openrunpro2
So we are talking about a HomePod with a screen, or like one of those Meta "Portal" things?
Hmmm, so they traded always-on audio recording for always-on video recording. Not sure this is an improvement.
Perhaps this could give room for physical design changes instead. I'm sick of phones that are just a slab of glass. I remember fun, weird, fashionable designs! Buttons and keyboards; phones felt like an individual choice, not just this boring black mirror. I'd take ten years of stagnation on hardware development in phones in exchange for ten years of exciting form factors with improving software. Let's face it: the specs are high enough for anything we need to be doing by now. The software is the real problem, and there's room here for massive improvement.
Just "commoditizing". Last year's microwave ovens were basically the same as 2024's, and no one cares. You still need them, and people still buy and use them as much as ever, but at a replacement rate, not because of fashion or innovation.
That is a good thing. It means the economy is doing what it's supposed to do and bringing maximal value to consumers so we can spend our resources more efficiently (on other fashion-driven junk in different market segments), making us richer.
It's only bad news if your business is selling "phones" and not innovative products more generally. Which, yeah, is pretty much AAPL's trap. But that's on them, not us. We're winning.
Richer… so we can buy more stuff I guess?
Anyway, a good microwave can last you 30 years, not even joking.
They may not match your particular tastes, but people with inflexible taste are always the last-resort market for manufacturers of commoditized products. People still buy from Hermès even though Shein completely dwarfs them in revenue, etc... That's the way it will always be with Apple too.
Companies have reduced staff prematurely on the promise of productivity improvements that have not materialized, and have lost customers to terrible customer service and declining product quality.
Many hardware launches are going to be delayed or will not meet expectations, which really is just the tip of the iceberg.
The US/SK memory cartel understandably sold out for a massive short-term windfall, but their long-term decisions to limit supply have created a huge opportunity for China. I wouldn't be surprised if this goes down in the history books as the start of the US/SK exit from the industry and the start of Chinese dominance.
The smart phone industry is likely to respond with an increasingly hostile anti-consumer approach as they try and lock customers into the cabins of the sinking ship. I expect cheap and cheerful Chinese budget phones aren't going anywhere.
I am happy for RAM, CPU, and storage to stall. I want a more robust and open phone which can take a fall and be updated long after the vendor loses interest. I expect to uninstall most of my apps rather than install new ones as I increasingly disconnect from an ever more distracting and worthless medium. I have cancelled nearly every subscription service in the last 12 months, and I have been deleting a lot of free accounts and apps. It's like doing a big cleanup. Surprisingly rewarding.
HN has felt like more than 50% AI industry promoting blog spam of little interest to me as a reader for some time. I am setting a budget of ten, no make it five, more posts here. Then I am out for good. Account deletion and no looking back.
Companies have reduced staff because of the impact of tariffs, because of low consumer confidence and spending, or as a ploy to pump share prices. Then they claim it’s AI, because it sounds a lot better to say that you’re reducing headcount because of AI than it does to admit that you’re cutting costs because of falling revenue.
In some ways it's pretty interesting to watch the entire world mobilize production for AI; some folks like to call this "hyperstition" as the future AGI reaches backwards in time to compel its own creation. Wild, but when trillions of dollars - i.e. millions of people's entire life output of work - are being put into something, it's truly an effort on a scale that no societal project has ever been before. There's no leader, nobody is in control, nobody has the grand vision other than "build the thing and get rich in the process". Amazing times to live in. The best use of our time and resources and coordination? Probably not... as we look around our broken cities, stepping over our poor and hopeless...
Would love to know what sectors you would say are obviously under invested. Sounds like an opportunity.
The IDC article says that DRAM prices are not expected to come down again. "While memory prices are projected to stabilize by mid-2027, they are unlikely to return to previous level — making the sub-$100 segment (171 million devices) permanently uneconomical." Before, they always came back down in the next RAM glut, when everybody built too much capacity. Why is that not going to happen next time?
[1] https://www.heise.de/en/news/Storage-crisis-Playstation-6-co...
Here we're facing different forces: unprecedented demand for DRAM that may be durable. But it also looks like the pace of supply changes may slow as process improvements get smaller and the industry stops moving so much in lockstep.
It still matters what happens to the demand function, though. If enough AI startups blow up that there's a lot of secondhand SDRAM in the market, and demand for new SDRAM is impacted, too, that will push things down.
Sort of like what happened with the glut of telecom equipment after the dot-com bust.
Because this shortage isn't natural, it's the result of OpenAI flexing monopsony power to deprive everyone else for its strategic gain. Unlike an organic shortage, there is no compelling reason for otherwise excess capacity to be built, since this artificial shortage can end as arbitrarily as it started.
No reason the same can't happen now, especially for something as expensive and fairly easily re-sellable as a datacenter and the hardware inside. Just rip it all out and sell it for parts where they are actually needed.
https://www.tomshardware.com/tech-industry/shareholders-sue-...
Sure thing. I'd take a look at IDC & similar firms' forecasting history before worrying too much about what they say.
There is an AI boom right now. There will be a consolidation cycle at some point. When that happens half the players, if not more, will disappear. The huge hardware budgets will go with them.
We also can't be certain that the DRAM makers aren't capitalizing on this opportunity because they can. Remember: all of them are convicted monopolists. As in actual prison time convicted. And fined. And lost civil lawsuits. Multiple times.
I just can't see AI paying enough of a premium on HBM to justify the DRAM spikes. Frankly, I can't see the volume either. Wafer starts on DRAM are dramatically bigger than you are probably imagining. DRAM is in practically everything these days. AI servers are but a drop in the bucket. 10% of the market? Yeah right; if it's 4% I'd be shocked. And you are telling me a shift of 4% of wafers to HBM is driving these prices and shortages?
I humbly suggest if you look at the numbers something smells funny.
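To make the "look at the numbers" point concrete, here's a back-of-envelope sketch. All the numbers (the 4% share, the elasticity value) are hypothetical illustrations taken from the comment above, not industry data, and the constant-elasticity pricing model is a deliberately crude assumption:

```python
# Back-of-envelope: if only a small slice of DRAM wafer starts shifts to HBM,
# how big a price move does a simple supply model imply?

total_wafer_starts = 100.0   # normalize total DRAM wafer starts to 100%
hbm_shift = 4.0              # hypothetical share diverted to HBM (%)
remaining_supply = total_wafer_starts - hbm_shift

# Crude constant-elasticity model: price scales as (old_supply/new_supply)^(1/e),
# where e is an assumed short-run elasticity. Smaller e = more price-sensitive.
elasticity = 0.5
price_multiplier = (total_wafer_starts / remaining_supply) ** (1.0 / elasticity)

print(f"Supply remaining for everyone else: {remaining_supply:.0f}%")
print(f"Implied price multiplier: {price_multiplier:.2f}x")
```

Under these toy assumptions, a 4% supply shift implies a price rise on the order of 8-9%, nowhere near the multi-x spikes observed, which is the commenter's point: either the diverted share is much larger than 4%, demand is far less elastic than assumed, or something other than HBM volume is moving prices.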
Disclaimer: none of us has access to the actual data, a lot of it is inferred by industry players. Some are well connected and usually accurate but that is not evidence. Therefore it is possible this is a genuine market action and nothing nefarious is going on.
Where are the China hawks? The argument for protecting Taiwan was that without their chips the smartphone market would contract, right? That's what's happening now?!
I know I'm not speaking to all the people that need to hear it, but used phones are very affordable, and reduce waste. A used iphone 13 is about $200 in the US: https://swappa.com/listings/apple-iphone-13?sort=price_low
https://www.androidpolice.com/google-pixel-10-3-5-gb-ai-only...
These techniques seem not to be widely known. A Kagi search turned up only information about some singer.
https://github.com/RikkaApps/Shizuku
And canto not canta (search the play store).
My apologies, I got both last letters wrong!
> By contrast, Apple and Samsung are better positioned to navigate this crisis. As smaller and low-end-positioned Android vendors struggle with rising costs, Apple and Samsung could not only weather the storm but potentially expand market share as the competitive landscape tightens.
[1]: https://www.mooreslawisdead.com/post/sam-altman-s-dirty-dram...
Not the person Sam Altman specifically, but AI in general. It was obvious even in 2024 that braindead beancounters were jumping on the hype train, so much so that coal power plants were kept alive to satiate the power hunger [1]. The last time that shit happened, it was the coin craze [2], but unlike cryptocurrencies there was and is an actual product being made...
[1] https://www.theregister.com/2024/10/14/ai_datacenters_coal/
[2] https://www.theguardian.com/technology/2022/feb/18/bitcoin-m...
And I'd say if it ends up being shown there is even the slightest hint of impropriety going on, put him on trial. Up to and including capital punishment for the entire board and C-level. What OpenAI has already done, even if legal on paper, is IMHO the biggest market manipulation in history, and it's not just one competitor that is suffering but society as a whole.
I don't have an issue with big companies and their super rich investors engaging in petty bitch fights. By all means, hand me some popcorn and soda. But the RAM situation, with everyone not being super rich and flush with cash from AI crazed investors being screwed royally? That is far beyond acceptable.
We need to send a message: you can't mess around with the world economy at that level without feeling serious repercussions. The lives of the billions are not playthings for the select few.
And if it turns out to be outright market manipulation, engaging in deals he doesn't even have the money committed for by others, much less actually have it on his balance sheet? Then it's time for the pitchforks, not even Madoff was this ruthless.
I'm paying more on ebay for thinkcentre tiny and thinkpads - 12th gen intel and newer.
Refurbished spinny drives have been steadily climbing - up 50% since late last year. That's on top of the 20% mystery jump that happened in the last week of 2024.
Also be aware that this stuff whipsaws: if OpenAI actually takes possession of that memory, decides they can't use it, and dumps it, we're going to see a crash. Likewise if they back out of the deals with the memory fabs (or fail and default). There's some scary volatility on the horizon.
It's that everything has become 20% more expensive in the past year, I'm being taxed to death, fighting with companies trying to money grab me, my electric bill is now $800, and I'm now too broke to buy a new phone every 2 years when most of my income gets eaten by the "system".
I'll wait until either SPY does another 50% run or BTC does another 100% run and then I'll buy a new phone. Google, you want me to buy your new phone? Do something to make SPY or BTC go up and then we'll talk. Until then my current phone works, and the new features aren't a must-have.
If the "system" wants to drive more consumption, it's on the "system" to put more buying power into my hands. Double my salary, reduce my taxes, make BTC do a big run up, something. Otherwise I'm happy staying put.
After all this churn subsides there is a chance entry level Windows laptops will start at 32GB RAM and maybe 8-12GB VRAM?
Which could end up being about 5-10-15 years of progress packed into 2-3-4.
The shortage is manufactured, I have my doubts it will "end" in a conventional sense. I'm more skeptical and feel like this is yet another consolidation of wealth and a means of taking away compute power from people, which prevents startup competition. This way the hyperscalers are the only ones that can offer any meaningful compute.