I was in a Hard Off (Japanese used electronics store) just a week ago and found dozens of 8GB DDR4 RAM sticks for around 1600 yen each (something like $10 USD). Some were ECC, but others looked like RAM modules from office PCs or something. It was in a rural area, but still, I would have thought they knew about the price hike. I guess not. Anyway, I didn't buy any, so I don't know if they were working.
Some people get confused, but they are owned by different companies.
The founders know each other and have some business relationship, and the company behind Hard Off owns shares in the company behind Book Off.
Yep, it's a chain with a bunch of different brands.
Biggest ones are Book Off (books, comics...), Hard Off (electronics, computers, musical instruments...) and Hobby Off (toys, collectibles, video games...).
They even have a Liquor Off! (not second hand, just discount/overstock)
Sure, except this time also add the US pressuring Korean companies not to sell their old equipment to Chinese manufacturers, which could have helped supply keep up. But no, it's all OpenAI's fault for behaving like the capitalistic swine they are. Both suck, but one has a long-lasting impact; the other is just what capitalists have always done, and will continue to do in the future.
Agreed - it's more likely to lead to poor user experience than anything else. Most end-user facing software/service companies will probably bet on the DRAM price peak being temporary.
Software takes a lot of time to build. Codebases live for decades. There's often an impossibly large cost in starting over with a less wasteful architecture/language/etc. Think going from an Electron/Chromium app to something built using some compiled language and native OS GUI constructs that uses 10x less resources.
The impossibly large cost is the difference between hardware and software returns.
Hardware by nature forces redesigns, whereas with software it's always possible to just keep building on top of old bad designs, and it's so much cheaper and faster to do so. That's why hardware is 10,000x faster than 30 years ago, while even simple word processors are only debatably faster than 30 years ago. Maybe even slower.
Hardware isn't much better actually. There isn't a good way I can show you this, but every x64 CPU contains an entire ARM CPU whose job is to initialize the x64 CPU. And of course it runs two operating systems - TrustZone and Minix.
The ARM core starts up, does crypto, loads the SecureOS and the BIOS, then it starts the x86 CPU - in 16-bit mode! Which then bootstraps itself through 32- and then 64-bit mode.
So in the first couple of seconds of power on, your CPU is at various points ARM, i386, x86, and x86_64.
> The ARM core starts up, does crypto, loads the SecureOS and the BIOS, then it starts the x86 CPU - in 16-bit mode! Which then bootstraps itself through 32- and then 64-bit mode.
Well, what if I want to run a 16-bit OS?
Also, I wonder if the transistor count of a literal entire 8086 processor is so small relative to the total that they just do that.
Compatibility mode doesn't work by having a separate 16-bit core. It's random bits of spaghetti logic to make the core work like a 16-bit core when the 32-bit flag isn't set.
It began life as an "out of band" way to administer servers, so that an ops team could do everything (other than actual hardware changes) remotely that would otherwise need a person standing in front of the server in the datacenter poking commands into a keyboard.
It then grew in responsibilities to also support the "secure boot" aspect of system startup, and beyond some Intel CPU version (I do not remember which), it exists in every Intel CPU produced.
There isn't a separate 8086 core in every x64 core. The whole core has "if/then" spaghetti logic scattered throughout to alter its behaviour based on being in 16-bit mode.
At best, they might have been able to confine the needed logic patches to the instruction-decoding front end.
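To make that "spaghetti logic" idea concrete, here's a toy sketch - purely my own illustration, nothing like real silicon or anyone's actual decoder - of how one shared decode path can serve 16-bit and 32-bit code just by branching on the current mode and an override prefix, instead of routing to a separate 16-bit core:

```typescript
// Toy model only: effective operand size on x86 depends on the current mode
// plus the 0x66 operand-size override prefix. The point is that 16-bit
// support is a branch in shared logic, not a separate unit.
type Mode = "real16" | "protected32" | "long64";

function operandSizeBits(mode: Mode, hasOperandSizePrefix: boolean): 16 | 32 {
  // Default operand size comes from the mode (real mode defaults to 16-bit)...
  const dflt: 16 | 32 = mode === "real16" ? 16 : 32;
  // ...and the prefix toggles it, so the same decoder handles both widths.
  return hasOperandSizePrefix ? (dflt === 16 ? 32 : 16) : dflt;
}

console.log(operandSizeBits("real16", false));      // 16
console.log(operandSizeBits("protected32", true));  // 16
console.log(operandSizeBits("long64", false));      // 32
```

(Ignoring REX.W and plenty of other cases - the point is just that the old mode survives as conditionals scattered through shared logic.)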
I remember being flabbergasted when I worked at the Open Source Development Lab and we got our first Itanium system in, a multi-core, multi-rack NEC system, with its own Windows PC to boot up in order to get to Linux.
Has anyone noticed their employer actually cutting back on employee hardware purchases? Because if new laptops are still being handed over to developers on a 2–4 year cycle, then probably not.
Game developers might have to do something though if high-end GPUs are going to end up being $5000.
At least so far the RTX 5090 seems to be available and at the same price it’s been at for the past six months (around $3000). I’m not sure when you’d see GPUs affected by the RAM price increases.
Higher in some respects (bandwidth), lower in others (latency, even though ordinary DDR5 is already no speed demon there and LPDDR5 is worse). At least from the spec sheet, these kinds of RAM are so different that I don’t really understand how demand for one can cause a shortage of the other, unless they are competing for the same manufacturing lines.
FWIW: GDDR is not higher latency than DDR. It just seems that way because the GDDR interface clock is much higher, so the number of clock cycles is higher too. But in terms of wall-clock time, the latency is very similar.
Which makes sense: the latency is determined by the underlying storage technology and the way that storage is accessed, which is the same for both.
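A quick back-of-the-envelope sketch of that point (the numbers below are made up for illustration, not taken from any datasheet): latency in nanoseconds is just cycles divided by the clock rate, so a bigger CL number at a much higher interface clock can land in the same ballpark of wall-clock time.

```typescript
// Illustrative only: convert a CAS latency given in interface clock cycles
// into wall-clock nanoseconds. Hypothetical numbers, not real part specs.
function casLatencyNs(clCycles: number, clockMHz: number): number {
  return (clCycles / (clockMHz * 1e6)) * 1e9; // cycles / (cycles per second) -> ns
}

// A DDR-style part with a modest clock and a GDDR-style part with a much
// higher clock and a bigger CL can end up with similar latency in ns.
console.log(casLatencyNs(16, 1600).toFixed(1)); // DDR-ish example: 10.0 ns
console.log(casLatencyNs(24, 2500).toFixed(1)); // GDDR-ish example: 9.6 ns
```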
I've declined the refresh I'm overdue for. My 2021 model MBP has 32GB and a 1TB SSD. They're currently giving out the base model Air: 16GB and 256GB. No thanks.
We used to get whatever was most appropriate for our role, now we get whatever is cheapest.
> Consumer PCs and laptops spend most of their time idle
Not when Windows gets its grubby mitts on them. I will frequently hear the fans spin up on my Win10 laptop when it should be doing nothing, only to find the Windows Telemetry process or the Search Indexer using an entire fucking CPU core.
It's like with cars - better-performing drivetrains (et al.) are used to increase the power envelope instead of lowering fuel consumption, since that allegedly leads to more sales.
It really isn't. I have a pocket-sized device that would utterly thrash a supercomputer from a couple of decades ago, and it goes a day or two on a 20Wh battery. Going full blast it'll consume maybe 25-30W, which is less than the idle power consumption of far less powerful devices from not all that long ago.
Incidentally, cars are also a lot more fuel efficient these days than they used to be.
To be fair, we could also just optimize the runtime engines for interpreted languages.
I do enjoy golang, but Rust gives me nightmares. I make my living in higher level languages.
When I started learning to program, JavaScript was just starting to gain popularity outside of the browser. It was the first language I could actually grasp, and I largely thank it for giving me a career.
No more evictions for me!
The only real downside to JavaScript being used as a tool for native apps with stuff like Electron is that it eats RAM. Everything needs to ship a full Chromium binary.
But if we go back to native applications, we don't get things like quality Linux ports. If you had told me 15 years ago that Microsoft would create the most popular IDE on Linux, I'd have assumed you misspoke.
I actually compared WASM to JavaScript for a particular integer-math-heavy task. For a single run, JavaScript beat out WASM because WASM had a lot more setup time. After running both 1000 times, they were almost equal in runtime.
Yes, even though the JavaScript was written using doubles and the WASM was written using 64-bit ints. It just means that it's possible to write optimized JavaScript (mainly by reducing object allocations - reuse objects instead).
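For anyone wondering what "reduce object allocations, reuse objects instead" looks like in practice, here's a minimal sketch of the general technique (my own illustration, not the commenter's actual code): in a hot loop, write into a caller-supplied object instead of allocating a fresh result each call.

```typescript
interface Vec2 { x: number; y: number; }

// Allocating version: creates a new object on every call, which churns the GC
// when called millions of times in a hot loop.
function addAlloc(a: Vec2, b: Vec2): Vec2 {
  return { x: a.x + b.x, y: a.y + b.y };
}

// Reuse version: writes into a caller-owned object, so the hot loop
// allocates nothing per iteration.
function addInto(a: Vec2, b: Vec2, out: Vec2): Vec2 {
  out.x = a.x + b.x;
  out.y = a.y + b.y;
  return out;
}

const step: Vec2 = { x: 1, y: 2 };
const acc: Vec2 = { x: 0, y: 0 };
for (let i = 0; i < 1_000_000; i++) {
  addInto(acc, step, acc); // accumulate in place, zero allocations per iteration
}
console.log(acc); // { x: 1000000, y: 2000000 }
```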
Your mental model of integer vs double performance sounds outdated by decades. I'd suggest reading up on instruction performance on Agner Fog's site; it should be eye-opening.
A benchmark of adding numbers doesn’t tell you how it performs on real world websites and codebases. I wouldn’t be surprised if JavaScript was still very competitive, simply because of how good V8 is, but I don’t think we can conclude anything from your benchmark.
Of course it is always possible to write highly optimised code. But that's not what people actually do, because of time, skill and maintenance constraints. Here's a case study: in 2018 Mozilla ported some code from JS to Rust + WASM and got a 6x speed up [1]. An expert in V8 responded to this with highly optimised JavaScript, saying "Maybe you don't need Rust and WASM to speed up your JS" [2]. Both articles are worth reading! But it is worth remembering that it's a lot quicker and easier to write the code in [1] than [2], and it is easier to maintain as well.
It wasn't some dummy "add numbers" loop, this was doing math (multiply-add) on large 336-bit integers.
Performance sucked when I used native JavaScript BigInts. When I made my own bigint using an array of doubles, and pretended that the doubles were 48-bit integers, performance was much better. Using the arrays meant that all allocation of temporary values completely stopped. I had to write my own multiply-and-add function that would do bigint = bigint * 48-bit number + other bigint + other 48-bit number.
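A rough sketch of that idea (not the original code - and I've shrunk the limbs to 24 bits so every partial product stays exactly representable in a double, whereas the commenter describes 48-bit limbs, which need an extra splitting step for the multiply): the big integer is a plain array of numbers, and the fused multiply-and-add walks it once, carrying as it goes, without allocating any temporaries.

```typescript
const LIMB_BITS = 24;
const LIMB_BASE = 1 << LIMB_BITS; // 2^24

// In-place a = a * m + b + c, where a and b are little-endian limb arrays
// and m, c are plain numbers below 2^24. Every intermediate stays below
// 2^48, well inside the 2^53 exact-integer range of a double, and no
// temporary arrays or objects are allocated.
function mulAddInPlace(a: number[], m: number, b: number[], c: number): void {
  let carry = c;
  const n = Math.max(a.length, b.length);
  for (let i = 0; i < n; i++) {
    const t = (a[i] ?? 0) * m + (b[i] ?? 0) + carry;
    a[i] = t % LIMB_BASE;
    carry = Math.floor(t / LIMB_BASE);
  }
  while (carry > 0) {
    a.push(carry % LIMB_BASE);
    carry = Math.floor(carry / LIMB_BASE);
  }
}

// Example: a = 1, then a = a * 1000 + 0 + 7 => 1007
const a = [1];
mulAddInPlace(a, 1000, [0], 7);
console.log(a); // [1007]
```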
V8 means JavaScript can be fast. However, no amount of optimization can get around inefficient code. There is only so much optimizers can do about too many layers of abstraction, calculations whose results are never needed, and nested loops. Someone needs to step back once in a while and fix bottlenecks to make things fast.
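As a tiny made-up example of the kind of thing no engine can save you from: loop-invariant work re-done inside a nested loop. V8 will make each call fast, but only stepping back and hoisting it fixes the bottleneck.

```typescript
// Before: parses the same config string on every inner iteration.
function totalSlow(rows: number[][], configJson: string): number {
  let total = 0;
  for (const row of rows) {
    for (const v of row) {
      const cfg = JSON.parse(configJson); // loop-invariant work, re-done every time
      total += v * cfg.scale;
    }
  }
  return total;
}

// After: the invariant work is hoisted out; the loops do only what they must.
function totalFast(rows: number[][], configJson: string): number {
  const cfg = JSON.parse(configJson);
  let total = 0;
  for (const row of rows) {
    for (const v of row) {
      total += v * cfg.scale;
    }
  }
  return total;
}

console.log(totalFast([[1, 2], [3]], '{"scale": 2}')); // 12
```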
I'm typing this on my primary PC with a twelve year old CPU.
Make the cutoff 2026. If you really need more cycles than we have today to solve a problem you're doing something wrong! Stop creating waste and forcing us to buy new trash all of the time.
Sadly, I wouldn't be surprised if you get your wish. The consensus among those with power and influence seems to be that there is now too much computing power in the hands of the common people -- or at least too much RAM -- and it's time to bring the fire back up the mountain.
Gonna need a great big citation on that, superchief. I recently had to upgrade a friend's kiddo's PC because Discord simply could not function with a MERE 8GB of RAM.
> I recently had to upgrade a friend's kiddo's PC because Discord simply could not function with a MERE 8GB of RAM.
I have an old laptop with 8GB of RAM and an ancient CPU that I haul around with me when I need something small for basic work. I can run Discord, Visual Studio Code, and Chrome just fine.
Something else was going on with that PC.
Or the kid did an excellent job of socially engineering his parents into an upgrade.
I hung on to my 15 (?) year-old Intel motherboard, CPU, and 16 GB of RAM, mostly because of e-waste guilt. I cannot believe this has value, but here we are.
I also wish I'd built a new gaming rig last summer.
Almost any working PC has some value. I've sold nearly 20 year old Intel Core2Duo systems, old 1TB hard drives and lots of other old components for about 10 - 20 dollars each. My primary gaming PC is from 2011.
PC gaming is a small single-digit percentage of the total volatile memory market. Neither Samsung, Micron, nor Hynix currently has any incentive to increase production to address the shortage and lower prices for that segment of the market. It's just not a money maker for them.
I have an old PC that I built that I've been meaning to take to Hard Off. I wonder if it would be better to sell the parts individually, or as a single unit. For reference, it has a Ryzen 9 3900X, 64 GB DDR4, an RX 5700 XT graphics card, and a 512 GB NVMe drive.
Probably not worth much anymore, and I’ll probably just sell it as-is to save myself the effort of taking it apart. But Hard Off occasionally surprises me with how much they pay for a piece of old gear.
Yeah it’s still totally usable, but I just haven’t booted it up more than a handful of times in the past couple years. I built it for software development on Linux (with a little extra graphics capability for occasional gaming) in the era when Apple was making those terrible “touchbar” keyboards. But have since switched back to a MacBook (and a Switch 2 for gaming), and now it’s just taking up space.
Wow. Your old PC is more powerful than any computer I own. Faster version of my CPU, double the RAM, newer generation GPU. A fraction of the storage, though. I built it in 2020 to replace the desktop that I built in 2008.
That's part of the reason I want to sell it - and hopefully get it into the hands of someone who can give it a second life before it becomes e-waste. I feel a bit guilty every time I think about it sitting and collecting dust.
The specs are still more than enough for any of the development I do. My main issue is just the form factor - a laptop is so much better for my current situation. The power consumption also kind of bugs me - things have improved a LOT in that regard since 2019 - although it's kind of nice for heating my office in winter. It also doesn't help that I built it with a full ATX motherboard and a giant case (Fractal Design Define R6), which is kind of ugly and takes up a ton of space.
They want to buy it from you for peanuts, which is what is missing from the article. Sofmap buys your stuff at 1/10 of its actual original value and then sells it back to people at 5-6x what they paid for it.
I have never found Japan to be the home of gaming PCs anyway. It isn't quite like Seoul in that respect. I have shopped in Akihabara frequently in the last decade and noticed some PC-gaming-exclusive shops pop up and also go, with their range of stock varying.
Akihabara was where runoffs and spillovers from cutting-edge electronics accumulated until the ~2010s. There used to be tons of failed experiments and EOL'd enterprise gear washing up. After that, supplies shifted to Taiwan and then to Shenzhen in China.
As far as gaming is concerned, the "gaming" parts of Akihabara mainly concerned locally produced pastel-toned 2D slideshow pornographic games ("visual novels"), the genre that led to gacha games like FGO. The local populace is horrible at handling 3D first-person content in general, and that never helped.
Non-console gaming in Japan is growing somewhat, but a lot of it is also going into phones, namely Genshin. So where the trend is headed is still pastel-toned soft-porn games without much PvP.
Phone games seem to have been popular for a good decade. I find the extravagant advertisements in stations like Shinjuku always rather impressive, but I never quite got into any of those games like Genshin and Honkai.
Those are social games. They don't make sense for someone not in a player community for a specific title on Twitter/Discord or in physical classrooms/breakrooms.
That is a good point. I guess they are WoW-esque but don't have the in-app chat in the same way, and I am not in that community. Most of my Japanese friends are arcade types, and I would go and play driving games with them like Initial D.
In my experience Akihabara was never great for PC hardware, but other places in Tokyo are. Also, Osaka had some good spots for PC hardware, even in the stereotypical spots like Den Den Town.
Oh, I always preferred Den Den on the few occasions I have visited Osaka. I find it better value and less glossy. That said, I did used to love a used ThinkPad shop around the back of Sofmap in Akiba. Very good value. Captain PC or something, if I remember rightly. It shut in 2021, I think, which was sad.
> Biggest ones are Book Off (books, comics...), Hard Off (electronics, computers, musical instruments...) and Hobby Off (toys, collectibles, video games...).
> They even have a Liquor Off! (not second hand, just discount/overstock)
Also: Mode Off (fashion).
See https://www.hardoff.co.jp/shop/brand/offhouse/
> Also, I wonder if the transistor count of a literal entire 8086 processor is so small relative to the total that they just do that.
According to https://en.wikipedia.org/wiki/Transistor_count#Microprocesso...:
So you could fit 200,000+ 8086s on that not-so-cutting-edge silicon.
This is the first I'm learning about this, and I'm curious why it needs to be the case. It seems so wild that it works this way, but I'm sure there's a logic to it.
What does the ARM CPU do?
> Game developers might have to do something though if high-end GPUs are going to end up being $5000.
RAM manufacturers are switching lines over from DDR to make HBM.
https://www.techpowerup.com/344578/leaks-predict-usd-5000-rt...
Idle power usage is what matters.
The 15 seconds it takes to launch Discord and install updates isn't going to be driving the overall efficiency of your PC.
This way, all the RAM that AI data centers scoop up will be used to lessen demand for RAM that those same data centers created.
net-zero RAM!
Nodejs and Python were used in 2012, why is now any different?
[1] - https://hacks.mozilla.org/2018/01/oxidizing-source-maps-with...
[2] - https://mrale.ph/blog/2018/02/03/maybe-you-dont-need-rust-to...
That's a perfectly fine modern PC. Similar to or faster than what some of my casual gaming friends use.
The RAM alone would probably sell for $350 or more.
Wow RAM prices have gotten absolutely insane.