Crossing the human/machine Rubicon

Over the last decade or so we’ve passed a tipping point in the relationship between humans and machines without even noticing.

As an engineer and developer of technology, I well remember the “good old days” of the 1980s and 1990s, when it was common to devote a significant amount of human time to optimising software or hardware so that we’d really make the most of it. We worked to minimise processing cycles and storage requirements, either to make our solution possible at all (I worked on early digital audio and video editing systems, which had to push available technologies right to the bleeding edge) or to make it affordable. It was worth skilled individuals burning the midnight oil for weeks on end to microcode cunning solutions, or to pack code into very tight spaces – the popular ZX81 home computer of the 1980s had a grand total of 1 KB of volatile memory, which included the display!

However, Moore’s Law (a specific case of the more general learning curve) “gives” us processing that works twice as fast roughly every 18 months, and storage that becomes twice as cheap roughly every 12 months. The cumulative effect over the decades is truly astonishing, as the miracle of compound interest – “two to the power X” – just keeps on ticking up, year by year, like grains of rice doubling on a chessboard. Resources that would have been literally inconceivable in the 1980s are today mundane. The fact that a 64 GB micro-SD card costs almost nothing and is the size of my thumbnail would have boggled the mind of even a die-hard futurist of that era. Even as someone who has designed chips, I find it pretty amazing that we can deliver so many bits at all, let alone with such fantastic reliability, cost and size. And of course, thanks to cloud computing, I now don’t even need to physically see (or even own) these resources in order to use them, which means that they can be swapped out to take advantage of Moore’s Law without my knowledge.
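
To make the compounding concrete, here is a quick back-of-the-envelope sketch in Python. The 18-month and 12-month doubling periods are the rough figures quoted above; the 40-year span is just an illustrative assumption of mine.

    # Back-of-the-envelope compounding, using the rough doubling periods above.
    # Assumptions: processing doubles every 18 months, storage cost halves
    # every 12 months, over an illustrative 40-year span.
    years = 40

    processing_gain = 2 ** (years * 12 / 18)   # one doubling per 18 months
    storage_gain = 2 ** (years * 12 / 12)      # one cost-halving per 12 months

    print(f"Processing: roughly {processing_gain:,.0f}x faster")
    print(f"Storage:    roughly {storage_gain:,.0f}x cheaper")
    # Around a hundred-million-fold and a trillion-fold respectively, which is
    # how a 1 KB ZX81 and a 64 GB micro-SD card fit inside one career.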

This is transformative because a massive boost in quantity has somehow led to a shift in the quality of our experience. Over the past 15 years we’ve seen a transformation in how software engineering is done. Prior to that we used Waterfall planning, with Gantt charts and so on, and the results were frankly poor: projects would overrun, and by the time they arrived they had diverged from market needs. It wasn’t much fun working in that environment either. So we don’t do things that way any more – now, in the era of CI/CD and Agile, we try to deliver something that works as quickly as possible, and then iterate based on user feedback. This really helps keep the product converged with the (changing) market need.

This in turn has created an increasing requirement to deliver high-quality software rapidly. Partly that can be addressed with these new methodologies. But partly it is also being addressed by a fundamental shift in our approach to development. Increasingly we are choosing our tools (our compilers, our development and deployment frameworks) not on the basis of their efficiency in machine terms (KB, CPU cycles) but in human terms. If a machine has to sweat twice as hard in order to make it twice as easy for a human to deliver great code quickly, then that’s a great trade-off, because by next year the machine’s performance will have reached where it could have been optimised to anyway, and in that time our (finite, non-Moore’s-Law-scalable) human has got on and done other things with their precious time.
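
To put a number on that trade-off, here is a minimal sketch, assuming the same popular 18-month doubling period as above; the slowdown_factor parameter is a hypothetical one of mine, standing in for however much extra work the friendlier toolchain asks of the machine.

    from math import log2

    # Assumption: machine performance doubles every 18 months (the popular
    # form of Moore's Law used above).
    DOUBLING_PERIOD_MONTHS = 18

    def payback_months(slowdown_factor: float) -> float:
        """Months of hardware progress needed to absorb a given slowdown."""
        return DOUBLING_PERIOD_MONTHS * log2(slowdown_factor)

    print(payback_months(2))    # 18.0  - a 2x slowdown is "free" again in 18 months
    print(payback_months(10))   # ~59.8 - even a 10x slowdown is absorbed in ~5 years

The point being that the machine’s penalty is temporary, while the human time freed up is banked immediately.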

We see this in programming languages, where “high-level” languages such as Python, Java, Go, Ruby, JavaScript etc. (which are interpreted or run on managed, garbage-collected runtimes, and are thus inherently less efficient) have now largely replaced C for everyday application development (just as C replaced assembler in the 1980s), because they’re easier to develop with and their run-time checking can ensure quality and security without human effort – even though there is a very significant performance hit. Human productivity trumps machine productivity. Abstraction wins.
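
As a tiny, hedged illustration of that run-time checking (nothing project-specific, just stock language behaviour): an off-by-one access in Python is caught and reported, where the equivalent C would quietly read whatever happens to sit in adjacent memory.

    # Python bounds-checks every index at run time, so a classic C-style
    # off-by-one bug becomes a clean, catchable error rather than a silent
    # memory read (and potential security hole).
    buffer = [0] * 8

    try:
        value = buffer[8]   # one past the end
    except IndexError as err:
        print(f"Caught at run time: {err}")   # "list index out of range"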

And of course machines can scale in other dimensions than just Moore’s Law, too. If many developers use a particular programming language or framework, it is worth someone (a human or even a machine) investing time to work out what the performance bottlenecks are and to solve them (e.g. with compile-on-demand, i.e. just-in-time compilation) – an investment that is then repaid across millions of uses.

People talk about the so-called fourth industrial revolution, and there is significant fear about the “rise of the machines”, with recent media stories focussing on the replacement of humans by Artificial Intelligence. This is analogous to the way in which the muscles of the labourers were replaced by steam power in the first industrial revolution. At one level, this fear is quite justified – evolution doesn’t stand still, and everyone and everything has always been locked in a competition to be more relevant than their peers. Differentiate! Add more value! Avoid becoming a commodity!

But on the other hand, as machines multiply, humans become ever more finite with respect to them, and thus potentially precious. In the world of the Internet of Things we see this too – every year the number of connected devices grows by about 25%, but the number of humans is growing much more slowly, so the number of devices per person is growing at roughly 10x per decade. As the end-users of these machines, our time and attention are therefore becoming more and more precious, per device. And so it is increasingly worth exploring how machine cycles can be burnt in order to reduce human cycles – Google’s pre-emptive search suggestions, offered every time you type a character, are an example of this kind of trade-off in action.
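
That 10x figure follows straight from the growth rates: here is the arithmetic, with the 25% annual device growth taken from above and a roughly 1% annual human population growth added as an assumption of mine.

    # Devices-per-person growth over a decade. The 25% annual device growth
    # is the figure quoted above; ~1% annual population growth is my assumption.
    device_growth = 1.25 ** 10          # ~9.3x more devices
    per_person = (1.25 / 1.01) ** 10    # ~8.4x more devices per person

    print(f"Devices:            ~{device_growth:.1f}x per decade")
    print(f"Devices per person: ~{per_person:.1f}x per decade (order of 10x)")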

So where does it all end? Is mankind destined to become enslaved, or irrelevant, in a machine-dominated future? That’s one possibility, and continued economic disruption seems as certain today as it did 100 years ago. But an outcome that seems more likely to me is that we humans – the masters of adaptation – will continue to mould ourselves around our creations, augmenting ourselves with our machines. We are increasingly outsourcing parts of our brains to the machines, and so even as we continually redefine ourselves as what they are not, we could indeed one day find that the machines have won – but that they are us.