A Hope For Reliving 1994
The repeating history of platform diversity and dreams of reviving the home computer.
Late 2020
Defining an era
Anyone who's experienced a certain era in computing is going to claim their experience was a special one, one that cannot quite be described in words: you had to be there, at that very time and place, to really understand it. I'm certainly no different.
My youthful computing heyday played out in Scandinavia during the middle of the 1990s. It was a magical time and place for anyone interested in computers. It was regrettably also the time when things started going sideways in computing, paving the way for today's dystopian surveillance economy - but I think I've written enough about that already, so let's move on to greener pastures.
Those of us who were there in, say, 1995, got to experience the end of an era - the end of the home computer. Because of that, we also got to experience one of the most amazing and eclectic mixes of hardware and software known to mankind: a spectacular diversity of computers and operating systems I dare say was completely unique and unmatched both before and - so far - afterwards.
What defines an era? What defines its beginning and end? For the purposes of my reasoning here, I'm going to settle for Merriam-Webster's definition: "a period identified by some prominent figure or characteristic feature." More importantly, I argue that the era of the home computer as we know it lasted from 1977, when the Commodore PET was released, until roughly the year 2000, when (with very few exceptions) the computers people used at home were indistinguishable from the ones they used at work. (Sorry, Apple fans - the Apple I required too much assembly to qualify as a home computer.)
Granted, the home computer era started when most people didn't use computers at work at all, but if they did, it was most likely some kind of mainframe accessed through a terminal. Then, slowly but surely, came the age of office PCs: the IBM PC, the XT and their clones. For those who wanted a computer at home, the PC was certainly an option, but rarely the most reasonable one. It had questionable graphics capabilities, abysmal sound and, most of all, a price well past the pain point of most home consumers. In short, it simply didn't appeal to gamers and hobbyists.
Admittedly, neither the PET nor the original Apple II was very exciting in the sound, graphics or even price departments, but by the time IBM launched the XT in 1983, their competition had a head start in the form of the Apple IIe and the Commodore 64. And talk about era-defining machines: the latter was manufactured right up until Commodore's demise in 1994, and Apple produced IIe compatibility expansions well into 1995. Rightly so - they were curiously capable machines, their lifespans extended further by inventive and dedicated programmers whose ingenuity overcame seemingly any limitation. The C64, for example, only supported 40 columns of text in hardware. Nick Rossi, the author of the communications package Novaterm, didn't let that stop him from implementing his own 80-column mode and a perfectly capable VT102 emulator.
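To make the feat a little more concrete: the usual way to get 80 columns out of a C64 in software - and, as far as I know, the way Novaterm's 80-column mode worked - was to skip the hardware text mode entirely and draw 4-pixel-wide glyphs into the 320x200 hi-res bitmap. Here is a minimal sketch of the idea in C; the linear framebuffer, the placeholder font and the function names are my own simplifications (the real C64 bitmap is interleaved in 8x8 cells, and the original was of course hand-written 6502 assembly).

    #include <stdint.h>

    #define SCREEN_W  320
    #define SCREEN_H  200
    #define GLYPH_W   4   /* 4-pixel-wide glyphs: 320 / 4 = 80 columns */
    #define GLYPH_H   8   /* 8 scanlines per row:  200 / 8 = 25 rows   */

    /* 1 bit per pixel, 320 * 200 / 8 = 8000 bytes (the size of a C64 hi-res bitmap). */
    static uint8_t bitmap[SCREEN_W / 8 * SCREEN_H];

    /* Placeholder 4x8 font: low nibble of each byte is one scanline, bit 3 = leftmost pixel. */
    static const uint8_t font4x8[128][GLYPH_H] = { {0} };  /* fill with real glyph data */

    static void put_char(int col, int row, char c)
    {
        for (int line = 0; line < GLYPH_H; line++) {
            uint8_t nibble = font4x8[(uint8_t)c & 0x7f][line] & 0x0f;
            int y = row * GLYPH_H + line;
            uint8_t *byte = &bitmap[y * (SCREEN_W / 8) + col / 2];

            /* Two 4-pixel glyphs share each bitmap byte: even columns take
               the high nibble, odd columns the low nibble. */
            if (col & 1)
                *byte = (uint8_t)((*byte & 0xf0) | nibble);
            else
                *byte = (uint8_t)((*byte & 0x0f) | (nibble << 4));
        }
    }

    static void put_string(int col, int row, const char *s)
    {
        while (*s && col < SCREEN_W / GLYPH_W)
            put_char(col++, row, *s++);
    }

The price, of course, was that every character had to be drawn by the CPU instead of the video chip - a trade-off only worth making because 80 columns is what a VT102 expects.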
Hardware investments
But what do early-1980s 8-bit micros and legacy compatibility have to do with hardware and software diversity in the mid-90s and a possible renaissance for the home computer? To understand this, we need to look at hardware and software as an investment on a personal level.
A decent home computer in 1990 would have been, for example, an Amiga 500, an Atari STe, maybe a 286 PC, or any of the other options available. To most people, though, a much simpler model was more than enough. A second-hand C64, for example, would still be an excellent choice for any high school gamer. For mere word processing, an even simpler machine sufficed - such as Amstrad's PCW, a monochrome green-screen oddity bundled with LocoScript and a dot-matrix printer.
The PCW is a perfect example of personal investment. Although it was relatively cheap compared to other computers, it was still costly: £399 was a lot more money in 1985 than it is now. If you were a writer of any kind - a journalist, a novelist or just someone managing a small business - there really was no reason to upgrade for many years to come unless the machine broke beyond repair. Compatibility issues were pretty much nonexistent, because it was perfectly fine to hand in copy on paper, and no upgrades were needed to cope with networking, because there weren't any essential networks to connect to. Banking was still done at the bank, shopping at the shop and television was watched on the TV.
After a couple of years with their machine, a PCW user was very familiar with it and with LocoScript. Switching to another platform for some vague performance improvement was a hard sell, because it meant learning to operate a new machine, a new OS and new word processing software. To anyone but enthusiasts, a computer was mainly a single-purpose tool.
The principle of personal investment still holds true: Putting a sufficient amount of time, money and cognitive effort into something is what leads to seemingly unreasonable behaviors like holding on to Windows 7 even after it's reached end of life.
Performance longevity
In early 2012, I bought a mid-range Lenovo IdeaPad for home use. Featuring a dual-core Intel Core i5 CPU, 4 gigs of RAM and a mechanical hard drive, this machine is still perfectly usable today, more than eight years later. It's certainly slower than a corresponding laptop model today, and I've upgraded it with an SSD and more RAM, but for most of the things I or any average home user do with it, the original configuration would still be completely fine: watching Netflix, doing some casual surfing, a bit of home office stuff - you get the point.
This hardware longevity is common knowledge to any non-gamer who has bought a home computer during the last ten years, because speed improvements nowadays aren't as noticeable. We've grown accustomed to certain things happening instantly, such as displaying a full-screen (1080p) JPEG, and things can't get much faster than instant. In 1994, most home users would have had to wait at the very least a couple of seconds for that to happen - and back then a "full screen" was much smaller.
Apply the same eight-year span two decades earlier and you'd be trying to play MP3s and shop on Amazon with a 7 MHz, 1 meg Amiga 600, or comparing a 4 meg, 33 MHz 486 (an expensive and luxurious consumer machine in 1992) to something like a 256 meg, 1 GHz Pentium III - a machine perfectly capable of decoding and displaying a DivX movie, with CPU cycles left to spare for web surfing while doing so.
A plethora of platforms
There were certainly people who bought a C64 in 1982 or an Amiga in 1985 and, like the aforementioned Amstrad PCW users, stuck with it for a decade or more before upgrading, much like there are people today who still swear by their ThinkPads from 2010. Others were on a regular upgrade schedule, perhaps a four-to-five-year cycle, which is roughly the same as today.
The key to the bizarre 1990s diversity was that developments in computing produced much greater differences within those four-year cycles. Nowadays even a decade-old computer is a multi-gigahertz, multi-gigabyte 64-bit machine with at least CD-quality sound and a 24-bit megapixel display.
In the 1990s, we were still working our way up to this kind of performance, and massive jumps in technology were what created such an eclectic mix of computer systems and platforms: someone who bought a used C64 in 1990 could still be using it in 1995. Through the ingenuity of Novaterm, described above, they could dial in to a UNIX computer and use the Internet in much the same way as anyone else, apart from seeing images on web pages - which wasn't as big of a loss back then as it is now. The net was still mostly plain text, because that was the only way to be sure everyone could use it.
This meant that in 1995, someone studying at a university could be using a 100 MHz, 64 meg, 64-bit RISC machine to chat on IRC with someone running a 1 MHz, 64 kilobyte, 8-bit machine. Having a mere C64 for everyday computing in 1995 was uncommon, but it was still far more common than the 64-bit workstation.
For a young nerd, these were heavenly times. There was an endless supply of platforms to investigate, operating systems to learn and software to experiment with. The past and the future linked together in an unbroken chain, because all of these machines were available and in use in some form - from the 8-bit breadbins all the way to the 64-bit monsters, by way of ZX Spectrums, Atari STs, Amiga 1200s and Pentium PCs.
Programmers talking shop could mean anything from discussing the finer points of garbage collection in high-level scripting languages to discussing self-modifying assembly code and optimized charset graphics on a C64. The latter certainly wasn't outdated knowledge; the Game Boy Advance, released in 2001, had only 32 kilobytes of fast on-chip work RAM and still used tilesets for graphics.
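For anyone who never had to count kilobytes: tile (or charset) graphics means the screen is never stored as pixels at all, only as a small grid of indices into a set of reusable 8x8 tiles. The sketch below uses GBA-like dimensions, but it is plain, hardware-agnostic C - the array and function names are mine, purely for illustration.

    #include <stdint.h>

    #define TILE_W    8
    #define TILE_H    8
    #define MAP_COLS  30                 /* 30 * 8 = 240 pixels wide */
    #define MAP_ROWS  20                 /* 20 * 8 = 160 pixels tall */
    #define NUM_TILES 256

    /* 256 tiles * 64 bytes = 16 KB of pixel data, shared by everything on screen. */
    static uint8_t tiles[NUM_TILES][TILE_H][TILE_W];

    /* The "screen" itself is just 600 bytes of indices, not 38,400 bytes of pixels. */
    static uint8_t tile_map[MAP_ROWS][MAP_COLS];

    /* Expand the tile map into a full framebuffer. On real hardware the video
       chip does this on the fly, scanline by scanline, so the framebuffer
       never has to exist in RAM at all. */
    static void render(uint8_t fb[MAP_ROWS * TILE_H][MAP_COLS * TILE_W])
    {
        for (int ty = 0; ty < MAP_ROWS; ty++)
            for (int tx = 0; tx < MAP_COLS; tx++) {
                uint8_t (*tile)[TILE_W] = tiles[tile_map[ty][tx]];
                for (int y = 0; y < TILE_H; y++)
                    for (int x = 0; x < TILE_W; x++)
                        fb[ty * TILE_H + y][tx * TILE_W + x] = tile[y][x];
            }
    }

Scrolling or animating then means rewriting a few hundred bytes of indices rather than tens of kilobytes of pixels, which is exactly why the technique survived from the C64's character mode all the way to the GBA's tile modes.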
Linux was gaining traction, to the point that it was already doing a lot of heavy lifting on Internet servers. The Amiga was dead, but there was hope for the future, a hope shared by Atari fans (how wrong we were!). Apple users were slightly more cocky - they had the Power Mac to gloat about - but the company was hanging on by its fingernails, aching to be saved by the second coming of Jobs. Companies like Digital, Sun, SGI and MIPS were still driving innovation, producing machines with that extra little bit of oomph or reliability needed for certain workloads.
But it was also the time when the PC finally caught up with the competition. It was already a popular platform because the open and modular architecture meant it was rapidly getting cheaper, improving its price-to-performance ratio at breakneck speed. Legacy remnants in the architecture were problematic, though, and made it an unsavory platform to a lot of other home computer users. What good is a 100 MHz 486 CPU if the only reasonable operating system is the arcane, single-tasking MS-DOS?
This changed overnight with Windows 95. Microsoft had managed to combine a multitasking GUI with DOS (and thus games) compatibility on a platform that was now finally fast and cheap enough to cope with the overhead of running Windows.
Death and rebirth
Eventually, the PC ate all the other platforms except the Mac. Most other home computers relied on custom hardware - in-house silicon to which both the OS and third-party software were closely tied. The advantage of vertical integration and designing a whole platform from the ground up was outweighed by the fierce competition between PC component manufacturers.
The last gasp of the custom consumer platform was the BeBox, introduced in 1995. Aimed at tinkerers and hobbyists, it was far too expensive and sold fewer than 2,000 units in all.
In 1994, 3dfx began laying the foundation for the modern gaming GPU industry, and the PlayStation forever changed console gaming from pixel-based 2D tilesets to polygon-based 3D graphics, definitively sealing that as the future of gaming. The custom-hardware home computers simply couldn't keep up in the price department, and scrapping the old hardware also meant scrapping backwards compatibility with huge amounts of software. It was a dead end, and the PC filled the market gap.
This meant that home and office computing started converging in earnest. For gamers, tinkerers and office drones alike, investing in anything but a PC with Windows 95 made little sense. Ever cheaper Internet access blurred the line between home and office computer use even further - and changed the things we used computers for. The previously timeless platforms, like the PCW and the C64, were gradually becoming unusable because they couldn't keep up with growing demands for connectivity and software support. By 1998, an up-to-date web browser was a killer application.
Professional UNIX workstations managed to hang on a while longer, mostly because the industry was a bit slow to react and they could trade on their names and on specific software titles not available for other platforms. By 1998, when I landed my first IT job, it was easy to see which way things were heading. There was one Sun Ultra workstation and one SGI Indy at the office. They were still machines with an undisputed air of cool, and their respective users were suitably proud of their platforms. In reality, though, their performance was merely comparable to (or worse than) that of the Pentium II Linux boxes the rest of us used - and those cost a fraction as much to buy and maintain. The PC on my office desk was now identical to the one I used at home.
Some platforms lived on as legacy systems in niche fields, but with further advancements and ensuing incompatibility in surrounding technology, they too disappeared. By the time the new millennium came knocking, you were increasingly unlikely to find an Atari in a recording studio, or an Amiga, Quantel Paintbox or even SGI O2 at a TV station.
Eventually, even Apple made the switch and architectures other than Intel were demoted to a life in appliances: set top boxes, industrial automation, phones, game consoles. Twenty years of "home computers" were followed by twenty years of merely "computers".
And now, ironically, that appliance architecture - perfected in smartphones and IoT devices over two decades - is taking center stage in desktop machines, with the Raspberry Pi 400 at the lower end of the spectrum and Apple's new ARM64 machines at the higher end - arguably running on "custom hardware" once again.
x86 is no longer unchallenged as the default platform. With increased awareness of telemetry and user tracking, it's not unlikely that we'll once again pick a type of computer for private use that differs significantly from the one we use at work.
Perhaps the home computer will rise again.