
Nothing is New

On inventions, improvements and stagnation in the software world.

Early 2021

But 1983 - and the whole decade - came and went without the "new thing".
-- Alan C. Kay in The Early History of Smalltalk

I freely admit it: I'm likely to be the first person to triumphantly yell "Exactly!" at any statement claiming that computers were better before, and that everything is pretty much crap today. Frankly, I'm kind of a backwards person: at some point I started enjoying computers more because of their rich and interesting history, rather than the magical air of futuristic high-tech that first drew me to them.

Consequently, I thought I'd agree with a blog post titled "The Great Software Stagnation" - written by Jonathan Edwards, a former research fellow at MIT, no less!

Alas, I did not.

For those too lazy to click on links, Edwards roughly states that before 1996, lots of amazing "software technology" appeared, such as C++, Windows, Unix and WWW. Then the Internet boom happened and "programmers could get rich quick", which effectively stopped "progress" in the field and replaced it with "incremental improvements" that aren't "fundamentally new".

The problem with this reasoning, in my opinion, is twofold: Firstly, 1996 is far too late a date to use as a cutoff. Secondly, almost anything in any field at any time can, upon close examination, be described as derivative of something that appeared earlier. (The wonderfully lifelike, masterfully stylized and cleverly shaded paintings in the Chauvet Cave come to mind as an example from the art world.) This historical backtracking is especially applicable to sufficiently complex technology - but that doesn't necessarily mean something isn't inventive in its own right.

Inventions and improvements

Who invented the automobile, for example? Without doubt it's a truly revolutionary creation, yet it's also mostly a joining together of already existing concepts. Nicolas-Joseph Cugnot constructed his steam-powered "Fardier à vapeur" in 1769, but Ferdinand Verbiest is credited with thinking up the concept a hundred years earlier. Of course, Cugnot couldn't have built his vehicle without the invention and gradual refinement of steam power, experiments with which are recorded as early as biblical times. And besides - isn't a car really just an incremental improvement on the horse and carriage?

It wouldn't be controversial to say that the iPhone is a post-1996 invention with considerable impact on a number of industries. On the software side of things, it introduced a new operating system with a new paradigm for interacting with a computational device. Of course, neither smartphones nor capacitive touchscreens were new in 2007, and Apple wasn't even the first to combine the two. Thus, arguably, the iPhone was simply an incremental improvement on decades-old mobile phones - and yet it was, of course, something completely new. Anyone not classifying it as a "radical breakthrough" is surely rather alone in that assessment.

Truly atomic and easily identifiable inventions are few and far between, and they often date back a very long time. They're also often discoveries as much as they are inventions. Some breakthroughs are old ideas that finally become reality thanks to other inventions. I'm pretty sure self-driving cars have been a childhood dream for many at least since Knight Rider in 1982 - and da Vinci's (non-working) concepts for flying machines from the late 1400s have become iconic.

It's probably been done before

Edwards lists a number of pre-1996 technologies that, in the context of his text, can only be interpreted as what he considers to be major inventions or breakthroughs. On this point, I agree. I'm not writing this to discredit these inventions, or to belittle their inventors - I spend almost every day standing on the shoulders of these giants.

Most of what's mentioned in the list can, however, still be traced back to something similar that predates it. Lisp, for example, is a curiously elegant high-level language considering it was invented in the 1950s. The truth is of course that Lisp, despite featuring a lot of novel concepts, didn't appear in a vacuum. Its elegance derives heavily from mathematical notation and, later, lambda calculus. Other key concepts can be found in an earlier language called IPL. (Garbage collection, however, is as far as I know all Lisp.)

The same can be said about Unix, which builds heavily on concepts in Multics, which in turn builds on earlier timesharing systems. Windows 1.0 was pretty much crap compared to its chronological predecessor, Apple's Lisa OS. The story of the Lisa itself is no secret, either: Steve Jobs went to Xerox PARC in 1979, took a good look at the Alto and Smalltalk, and liked what he saw.

Xerox are often blamed for not understanding what they had and giving away their killer concept to Jobs, but that's not entirely true. They launched the Star in 1981, two years before the Lisa. It was the first commercial machine with a desktop metaphor. This was derived from the Smalltalk GUI, which was inspired by Douglas Engelbart's NLS (as featured in The Mother of All Demos), which was inspired by Sketchpad. Smalltalk as a language builds in part on Simula, which introduced a lot of the concepts of modern object oriented programming. C++ is heavily influenced by Simula and, of course, C.

Actual origins

This is where we can start identifying some kind of chronological tipping point. Simula, Lisp, Sketchpad and Engelbart's NLS - together with Algol, Fortran, Basic, artificial intelligence, multitasking/timesharing, networking, full screen text editors, relational databases and hypertext - all appear in a seemingly golden era between roughly 1955 and 1969.

This was the infancy of the digital general-purpose computer. The ENIAC, being the first (although - of course - drawing heavily on earlier machines and research), was constructed in 1945. A decade later, the use of such machines had spread to comparatively wide circles, which not only tickled the imaginations of some very fine visionary minds but also, in a more hands-on sense, called for simpler and quicker software development practices.

Post-1996 examples

The Internet boom irrefutably happened some time during the middle of the 1990s, so I agree that its effects began to clearly materialize in 1996. It's true that most of the makings of current software technology were already in place by then, but they too build on earlier inventions and improvements.

From his list of "new" things, Edwards omits CSS, XML (both from 1996), XMLHttpRequest (introduced in 2000) and JSON (conceived sometime shortly thereafter). This quartet forms the basis of the modern web and enabled things like single-page applications. Of course, one could quite correctly argue that these are simply incremental improvements upon earlier concepts, but that's also true of the web itself: Engelbart demonstrated clickable links in 1968, laying the foundation for several subsequent hypertext systems.

JavaScript hardly appeared out of thin air, either. It was influenced by a plethora of other languages and the concept of embedded scripting was already widely spread. Microsoft's VBA, for example, appeared two years earlier.

The combination of HTML, JavaScript and the DOM can be seen as a continuation of PostScript, a Turing-complete document formatting and presentation language.

Cashing in

The notion of programming as a get-rich-quick scheme was surely helped by the dotcom bubble, but seemingly unlikely overnight success in computing predates the Internet boom by more than a decade. I suggest that it's linked to the home computer revolution rather than the Eternal September. Bill Gates (no introduction needed), Ken Williams (Sierra On-Line), John Carmack (Doom) and Gary Kildall (CP/M) are all examples of this, but there are many more success stories from this time. The computer game Lemmings, for example, was released in 1991 and has since sold a whopping 20 million copies.

I also think Edwards inadvertently stumbles into the trap of survivorship bias. Just like with early home computer programmers, most web developers didn't - and still don't - get rich. For every success story there's also a massive number of failures - some of them so spectacular they've become legends in their own right.

Trickle down inventions

Claiming that we're still using a 50-year-old OS with 30-year-old text editors to write code in 25-year-old languages can be tempting - but as I hope I've demonstrated above, the roots of almost all of those can be traced even further back, and Edwards' choice of baseline technologies can just as easily be dismissed as mere incremental improvements.

The fact that a contemporary programmer can get work done on a 30- or even 50-year-old system doesn't mean they necessarily want to: there has been significant progress since then, albeit in - yes - increments.

Editor-wise, a programmer of today sent back to the early 1990s would probably miss things like multi-language support and integration with version control and linters. Even simple things like columnar marking, syntax highlighting and multiple document interfaces weren't standard editor features in 1990 - though the concepts were surely invented earlier. Python was released in 1991, but our time-travelling coder is likely to be rather disappointed: list comprehensions, while themselves nothing new, weren't introduced into the language until nine years later, with Python 2.0 in 2000.
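
As a small illustration of what that gap means in practice, here's a minimal Python sketch (my own example, not taken from any particular codebase) contrasting the explicit accumulator loop a programmer would have written in 1991 with the list comprehension that arrived in Python 2.0:

    # Collect the squares of the even numbers below ten - first with
    # the explicit loop available in early Python ...
    squares = []
    for n in range(10):
        if n % 2 == 0:
            squares.append(n * n)

    # ... and then with the list comprehension introduced in Python 2.0.
    squares_2_0 = [n * n for n in range(10) if n % 2 == 0]

    assert squares == squares_2_0 == [0, 4, 16, 36, 64]

Nothing the old loop couldn't do, of course - which is rather the point: the improvement is incremental, but you'd still miss it.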

In fact, most programmers probably didn't even use a multitasking operating system in 1990. Some improvements trickle down more slowly than others, in part because of hardware limitations, in part because of the curious whims of the software market. From a vantage point at MIT, it might seem as if little has changed during the last few decades - but try telling that to the poor sod tasked with porting Lemmings to the ZX Spectrum.

The possibility of being wrong

I think that if anything is "bullshit", it's the dead certain conclusion that we've lost the will to improve and that nothing of significance has emerged since 1996. Brushing away machine learning as "a different type of software" is convenient for the sake of Edwards' argument, but in reality it's driving steadily improving automated programming - which I'd argue is something decidedly related to software development.

Since most technological innovation builds on earlier concepts, it's easy to at least joke about nothing being new: cloud computing is really just fancy timesharing, containerization is really just a fancy chroot, microservices are really just the Unix philosophy, and so on. Then again, I'm neither an expert in computer history nor in the history of ideas. Perhaps it's true that software innovation is stagnant and that nothing of significance has emerged during the last 25 years.

If that's the case, then maybe it's in part due to the regular appearance of people such as myself [1] or Edwards, all too eager to confidently tell excited youngsters that whatever they're fired up about is probably just a rehashing of something old. In fact, even complaining about the lack of software invention was old hat by 1996. Alan C. Kay did so - very eloquently - as early as 1993.



[1] Let's be honest, my whole site is a perfect example of such behavior.