6 comments

  • xnx 54 minutes ago
    The Megahertz Wars were an exciting time. Going from 75 MHz to 200 MHz meant that everything (CPU-limited) ran 2x as fast (or better, with architectural improvements).

    Nothing since has packed nearly the impact with the exception of going from spinning disks to SSDs.

    • st_goliath 22 minutes ago
      > The Megahertz Wars were an exciting time.

      About a week ago, completely out of the blue, YouTube recommended this old gem to me: https://www.youtube.com/watch?v=z0jQZxH7NgM

      A Pentium 4, overclocked to 5GHz with liquid nitrogen cooling.

      Watching this was such an amazing throwback. I remember clearly the last time I saw it, which was when an excited friend showed it to me on a PC at our school's library, a year or so before YouTube even existed.

      By late 2005, my Pentium 4 Prescott at home ran at some 3.6GHz without overclocking, 4GHz models for the consumer market had already been announced (but were plagued by delays), and surely 10GHz was "just a few more years away".

    • embedding-shape 17 minutes ago
      > Nothing since has packed nearly the impact with the exception of going from spinning disks to SSDs.

      "Bananas" core-counts gave me the same experience. Some years ago I moved to a Ryzen Threadripper and experienced similar "Wow, compiling this project is now 4x faster" or "processing these TBs of data is now 8x faster" moments, but of course it's very specific to workloads where concurrency and parallelism are thought of from the ground up, not a general 2x speed-up in everything.

    • HPsquared 43 minutes ago
      SSDs were such a revolution though, and a really rewarding upgrade. I'd fit SSDs to friends' and family members' computers as an upgrade.
      • micv 19 minutes ago
        Getting my first SSD was absolutely the best computer upgrade I've ever bought. I didn't even realise how annoying load times were because I was so used to them, and coming from C64s and Amigas, even spinning rust seemed fairly quick.

        It took a long time before I felt a need to improve my PC's performance again after that.

        • coffeebeqn 2 minutes ago
          There were quite a few mind blowing upgrades back in the day. The first sound card instead of PC beeper was one of my most memorable moments.

          I remember loading up Doom, plugging my shitty earplugs that had a barely long enough cable and hearing the “real” shotgun sound for the first time. Oo-wee

      • dcminter 8 minutes ago
        Just before I installed an SSD was the last time I owned a computer that felt slow.
      • sigmoid10 29 minutes ago
        I once had a decade old Thinkpad that suddenly became my new work laptop once more thanks to an SSD. It's a true shame they simply don't make them like this anymore.
    • geon 15 minutes ago
      GPUs for 3d graphics were a game changer.

      I can see why you wouldn’t consider it as impactful if you weren’t into gaming at the time.

  • hedora 9 minutes ago
    The Athlon XP was the bigger milestone, as I remember it.

    They were both "seventh generation" according to their marketing, but you could get an entire GHz+ Athlon XP machine for much less than half the $990 tray price from the article.

    I distinctly remember the day work bought a 5 or 6 node cluster for $2000. (A local computer shop gave us a bulk discount and assembled it for them, so sadly, I didn't poke around inside the boxes much.)

    We had a Solaris workstation that retailed for $10K in the same office. Its per-core speed was comparable to one Athlon machine, so the cluster ran circles around it for our workload.

    Intel was completely missing in action at that point, despite being the market leader. They were about to release the Pentium 4, and didn't put anything decent out from then until the Core 2 Duo. (The Pentium 4 had high clock rates but low instructions per cycle, so it didn't really matter. Then AMD beat Intel to market with 64-bit support.)

    I suspect history is in the process of repeating itself. My $550 AMD box happily runs Qwen 3.5 (32B parameters). An Nvidia board that can run that costs more than 4x as much.

  • Sharlin 57 minutes ago
    The i486DX 33MHz was introduced in May 1990. A 30x increase, or about five doublings, in clock speeds over ten years. That's of course not the whole truth; the Athlon could do much more in one cycle than the 486. In any case, in 2010 we clearly did not have 30GHz processors – by then, the era of exponentially rising clock speeds was very decidedly over. I bought an original quadcore i7 in 2009 and used it for the next fifteen years. In that time, roughly one doubling in the number of cores and one doubling in clock speeds occurred.
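    The "about five doublings" figure can be checked directly; a quick sketch using the frequencies from the comment (33 MHz in 1990, 1 GHz in 2000):

```python
from math import log2

# Number of doublings implied by a frequency ratio: n = log2(f_new / f_old).
# Frequencies are the ones mentioned in the comment.
ratio = 1000 / 33            # 33 MHz i486DX (1990) -> 1 GHz Athlon (2000), in MHz
doublings = log2(ratio)

print(round(ratio))          # 30
print(round(doublings, 1))   # 4.9
```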
    • adrian_b 39 minutes ago
      "The era of exponentially rising clock speeds" was already over in 2003, when the 130-nm Pentium 4 reached 3.2GHz.

      All the later CMOS fabrication processes, starting with the 90-nm process (in 2004), have provided only very small improvements in clock frequency, so that now, 23 years after 2003, desktop CPUs still have not doubled that clock frequency.

      In the history of computers, the decade with the highest rate of clock frequency increase was 1993 to 2003, during which the clock frequency increased from 67 MHz in the first Pentium in 1993 up to 3.2 GHz in the last Northwood Pentium 4. So the clock frequency increased almost 50 times during that decade.

      For comparison, in the previous decade, 1983 to 1993, the clock frequency in mass-produced CPUs had increased only around 5 times, i.e. at a rate about 10 times slower than in the next decade.

    • layer8 38 minutes ago
      On the plus side, the 486DX-33 didn’t require active cooling. The second half of the 1990s was when home computing started to become noisy, and the art of trying to build silent PCs began.
  • dd_xplore 1 hour ago
    I remember back in 2006 I used to browse overclocking forums to overclock my Pentium 4. I had tons of fun consuming lots of instructions: I learned the BIOS, changed PLL clocks, memory clocks, etc.
    • rckclmbr 37 minutes ago
      I bought a car radiator, Dremeled out my case, and visited Home Depot for all the tubes and connectors. It's too easy nowadays to add watercooling.
  • mtucker502 1 hour ago
    What progress is being made in overcoming the current thermal limits blocking us from high clock rates (10GHz+)?
    • HarHarVeryFunny 37 minutes ago
      What would be the benefit? You don't need a 10GHz processor to browse the web, or edit a spreadsheet, and in any case things like that are already multi-threaded.

      The current direction of adding more cores makes more sense, since this is really what CPU intensive programs generally need - more parallelism.

    • vessenes 39 minutes ago
      Like any doubling rule, the buck has to stop somewhere. Higher energy usage plus smaller geometry means much more exotic analog physics to worry about in chips. I'm not a silicon engineer by any means, but I'd expect 10GHz cycles will be optical, or very exotically cooled, or not coming at us at all.
      • adrian_b 27 minutes ago
        Reaching 10 GHz for a CPU will never be done in silicon.

        It could be done if either silicon were replaced with another semiconductor, or semiconductors were replaced with something else for making logic gates, e.g. with organic molecules, making it possible to design a logic gate atom by atom.

        For the first variant, i.e. replacing silicon with another semiconductor, research is fairly advanced, but it would increase fabrication cost, so it will be done only once all methods for further improving silicon integrated circuits become ineffective or too expensive, which is unlikely to happen earlier than a decade from now.

    • brennanpeterson 33 minutes ago
      None for normal compute, since energy density is still fundamental. But the interesting option is cryogenic computing, which can have zero switching energy and clock rates in the tens of GHz.

      There are some neat startups to watch in this space.

    • magic_man 59 minutes ago
      The power consumed is CV²f. It makes no sense to keep increasing the frequency when doing so makes power consumption so much worse.
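      The formula above is the standard CMOS dynamic (switching) power, P = C·V²·f. A quick illustrative sketch (the capacitance and voltage values below are assumptions, not measurements of any real chip) of why chasing frequency is even worse than it first looks: higher clocks usually also demand higher voltage, and power scales with V²:

```python
# Dynamic (switching) power of CMOS logic: P = C * V^2 * f.
# C and the voltage/frequency pairs are illustrative assumptions,
# not figures for any real chip.

def dynamic_power(c_farads, v_volts, f_hertz):
    """Switching power in watts: C*V^2 dissipated per transition, f times per second."""
    return c_farads * v_volts**2 * f_hertz

C = 1e-9  # effective switched capacitance (assumed)

base     = dynamic_power(C, 1.0, 3.2e9)  # 3.2 GHz at 1.0 V
faster   = dynamic_power(C, 1.0, 6.4e9)  # double f, same V: 2x the power
faster_v = dynamic_power(C, 1.2, 6.4e9)  # double f, raise V to 1.2: ~2.9x

print(faster / base)             # 2.0
print(round(faster_v / base, 2)) # 2.88
```

      So doubling frequency alone doubles power, and the voltage bump needed to sustain the higher clock pushes it close to triple.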
      • vlovich123 55 minutes ago
        So heat. There are efforts to switch to optics, which don't have the heat problem so much, but it's really hard to build an optical transistor. Plus, anywhere you're interfacing with the electrical world, you're back to the heat problem.

        Maybe reversible computing will help unlock several more orders of magnitude of growth.

  • 1970-01-01 31 minutes ago
    Argh. The headline. The opener. Awful. Where are editors in 2026? There's no way an LLM would write this.

    The GHz barrier wasn't special. What was much more important was the fact that AMD was giving Intel a hard time and there was finally hard competition.

    • adrian_b 14 minutes ago
      In terms of marketing, the "GHz" barrier was special, because surpassing it created a lot of recognition among the general public for the fact that the AMD Athlon CPUs were better than the Intel Pentium III CPUs.

      In reality, of course, what you say is true, and the fact that the Athlon could provide a few extra hundred MHz of clock frequency was not decisive.

      The Athlon had many improvements in microarchitecture in comparison with the Pentium III, which ensured much better performance even at equal clock frequency. For instance, the Athlon was the first x86 CPU that was able to do both a floating-point multiplication and a floating-point addition in a single clock cycle. The Pentium III, like all previous Intel Pentium CPUs, required 2 clock cycles for this pair of operations.

      This much better floating-point performance of the Athlon vs. Intel contrasted with the previous generation, where the AMD K6 had integer performance competitive with Intel's, but its floating-point performance was well below that of the various Intel Pentium models (which hurt its performance in some games).

    • HarHarVeryFunny 17 minutes ago
      There was a time when increased clock speeds, or more generally increased processor throughput, were important. I can remember when computers were slow, even for things like browsing the web (and not just because internet connection speeds were slow), and paying more for a new, faster computer made sense. I think this time period may well have lasted roughly until the "GHz era" or thereabouts, after which even the cheapest, slowest computers were all that anybody really needed, except for gamers, where the solution was a faster graphics card (which eventually led to GPU computing and the current AI revolution!)
      • 1970-01-01 11 minutes ago
        You're conflating a few things here. The Vista era was the biggest requirement hit. That was the time where people really needed a faster PC to continue browsing. Before that, you could get away with XP running on a sub-GHz processor.