Do computers need to get faster?

Some history

If you’ve been paying attention to computer hardware for even just the last 15 years, you remember the days of big performance leaps from generation to generation. If you’re familiar with hardware even older than that, then you’re probably also familiar with the now long-dead Moore’s Law.

Against the ceiling

However, look at the hardware from the last 5 years and raw performance gains in the world of CPUs have been rather modest. GPUs have seen bigger improvements, but at what cost? Despite initially being praised for breaking the curse plaguing x86 chips, Apple’s in-house ARM-based M chips have followed a similar trend: either marginal performance gains barely worth a footnote, or more substantial gains that come with a hike in power usage.

Now of course, this isn’t the whole story: most new CPUs and GPUs have also been shipping with more and more specialised accelerators, processors that can only do a relatively small number of tasks, but do them much faster than a general-purpose processor like a traditional CPU. Those accelerators have primarily been for video decoding and, drum roll, AI!

Number must go up anyway… But why?

So with all of that, it really seems physics isn’t going to stop hardware manufacturers from making the numbers go bigly big. But today I’m feeling heretical, so I must ask: does hardware need to get faster?

I mean, think about it: do YOU even take full advantage of your current hardware? I’ll guess that a lot of you, even with older hardware, answered no. I certainly don’t get everything out of my two-year-old laptop with a Ryzen 5800U, and I’m what some would call a power user: games and compiling code are standard fare for me.

So you can imagine that the average Joe who only uses a computer for taxes and maybe some Netflix at the end of the week could probably get by with even slower hardware without noticing it’s slower (assuming the software’s not bloated as hell, more on that later). Truth is, computer performance exceeded the needs of most people a long time ago. If it can run some office software and play YouTube and Netflix, it’s fast enough for most people. Even gamers probably haven’t noticed much of a difference since Nvidia’s GTX 10-series and AMD’s Vega GPUs, outside of gimmicks like ray tracing and an ever-growing hole in their pocket every time they upgrade.

Now a lot of you might point at your old computer from 2015 and talk about how slow it is, but to that I ask: is it because the hardware is slow, or because the software has gotten slower since you bought it?

“What Andy giveth, Bill taketh away”

Wirth’s Law, also sometimes humorously referred to as Andy and Bill’s Law, states that software gets slower more rapidly than hardware gets faster. This has an interesting side effect: hardware that was perfectly fine just a couple of years ago now feels too slow to be usable. Does this sound familiar?

Personal anecdote time! My first Android phone actually felt quite fast when I first got it: it could play all the games I could find on the Play Store just fine. But give it about 3 years and even instant messaging applications were unbearably slow, to the point the phone was outright unusable.

Next phone, same story: at first it felt incredibly fast, but give it a few years and once again basic tasks were out of the question. Web browsers would run out of memory and crash, and even simple applications such as instant messaging clients would run really slowly.

And one last example: around 2010 I got a netbook. It was, well, quite crap, as all netbooks were, but it was still good enough to play YouTube without skipping frames… yet fast-forward to 2015 and it couldn’t even play 480p YouTube without looking like it was dying.

Was this the hardware getting slower? No, in all three cases it was the software that became unbearably slow. Programs simply ended up using more and more resources, to the point they could no longer run on hardware that didn’t break a sweat running them before. This leads to hardware that is still perfectly capable going in a dumpster (figuratively, but sometimes also literally), as well as to a worse experience for those who want to keep using old hardware.

“Breathe new life into your old PC”

I’m sure you’ve seen that tagline more times than you bothered to count. And if you’ve clicked on it, be it on some tech tips website or a YouTube video, I’m sure you’ve also noticed that the solution oftentimes involved Linux. Coincidence? No, not really.

As it turns out, Wirth’s Law hasn’t affected Linux to anywhere near the same extent as Microsoft’s Windows. And of course, Linux isn’t the only one to escape the curse: most BSDs have fared just as well, if not better, and remain perfectly capable of running smoothly on hardware that would be considered junk in the Windows world.

Now why do I bring this up? Just as a final point to drive home that hardware doesn’t get slower; bad software does.

But why does this happen?

Well, a few reasons. By far the biggest is Jevons paradox: the more efficient, and thus cheaper, a resource becomes, the more of it ends up being consumed. In software terms, this means that as hardware gets faster, developers get less and less concerned about efficient resource usage, instead going to town wasting resources on useless glitter.

Now whether this is a reason to uphold Wirth’s Law, or merely a happy little accident (for corporations), is up for debate. But old hardware “slowing down” ends up incentivising end users to buy newer hardware every few years, even if the hardware they have on hand would still be perfectly capable had software not slowed it down. I’m not going to say this is the reason the trend started, but it’s definitely a reason not to stop it.