Fonts, a Fuzzy Fringefest
Raster fonts, vector typefaces, smoothing and legibility
Even though we've long since moved into the realm of video conferencing and movie streaming, computers are still very much about reading and producing text: emails, chat messages, documents, web pages and code. When rendering text on computers, two aspects commonly stand out: fidelity and legibility.
With fidelity - for lack of a better term - I refer to pure aesthetics. This represents how faithfully a typeface is translated to pixels with regard to concepts like kerning, ligatures, alignment and thickness - in other words, how closely the typeface on screen visually resembles what its designer envisioned.
Legibility is, of course, how easy the text is to read on screen. This is somewhat connected to fidelity, but not entirely. A perfectly rendered typeface can have abysmal legibility if it wasn't properly designed - or was purposely designed to be illegible. Sadly, it seems that legibility is gradually being sacrificed in attempts - not always successful - to increase fidelity.
A Brief History of Computer Typography
Rasters of the Universe
Originally, all typefaces were primarily designed for print media, but the ubiquity of computers has gradually changed this: many typefaces are now designed with computer screens as their target platform. The first were of course the raster fonts used in CRT terminals such as the DEC VT-52 and the ones built into the ROM or BIOS of early home computers. These raster or bitmap fonts were constructed pixel by pixel, often for a fixed screen resolution such as 640x200 pixels. They came in one or maybe a handful of sizes, each determined by how much pixel real estate an individual character required to be displayed.
In the case of these early bitmap fonts, fidelity wasn't much of an issue: there were only so many ways you could vary a character in, say, an 8x8 pixel grid and maintain high legibility. Typographical concepts like kerning were mostly pointless, not least because all characters had the same fixed width, and their primary function was to convey information even under very dire technical circumstances. Examples are Commodore's Topaz and IBM's CGA font - both designed to be readable on low-quality CRT screens connected to the computer via composite or even an RF modulator.
Commodore's Topaz font in all its glory. This is from version 1.x of AmigaOS - in version 2.0, Topaz got a makeover and looks slightly less ancient and terrible. Highly legible on a 14" CRT TV, though!
Technology, as we all know, progressed rather swiftly and computers were soon used for things like WYSIWYG word processing, a task that requires not only legibility but also a modicum of fidelity when rendering text. On early home computers this was achieved by providing a fixed set of fonts for a given typeface, where each font size had been meticulously designed to be legible on screen while retaining as much fidelity as possible. But - and this is important - legibility always trumped fidelity!
Helvetica 24 px from the X11 classic fonts collection.
Pixel Art Techniques
It was also around this time that home computers had become powerful enough, and decent quality CRT screens cheap enough, that visual pizzazz - if not exactly font fidelity - was an important factor when creating, for example, video game graphics. Hardware limitations weren't as restricting anymore, but they were still there, and techniques such as dithering (mixing pixels of two different colours to optically produce a new one) and anti-aliasing (smoothing pixelated edges with carefully positioned pixels of an intermediate value between two contrasting colour areas) were now commonplace. Dithering and anti-aliasing were combined with classic trompe l'oeil techniques (a one-pixel gray line looks thinner than a one-pixel white line), which together with the innate fuzziness of CRTs effectively simulated a higher screen resolution.
An example of a pixel art logo with no anti-aliasing (top) and manual anti-aliasing (bottom). Even with just a few extra hues, carefully placed pixels can make quite a difference.
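The dithering trick described above can be sketched in a few lines of Python - a minimal illustration using a classic ordered (Bayer) threshold matrix, not any particular program's algorithm; the matrix values and the tiny test patch are illustrative only:

```python
# A minimal sketch of ordered (Bayer) dithering: approximating grey
# levels with a fixed pattern of black and white pixels. A 2x2
# threshold matrix is the smallest useful example.
BAYER_2X2 = [
    [0.25, 0.75],
    [1.00, 0.50],
]

def dither(gray):
    """Map a 2-D list of grey values in [0, 1] to 0/1 pixels."""
    out = []
    for y, row in enumerate(gray):
        out.append([
            1 if v >= BAYER_2X2[y % 2][x % 2] else 0
            for x, v in enumerate(row)
        ])
    return out

# A flat 50% grey patch dithers into a checkerboard, which the
# innate fuzziness of a CRT averages back into grey.
patch = [[0.5] * 4 for _ in range(4)]
for row in dither(patch):
    print("".join("#" if p else "." for p in row))
```

The same principle scales to colour: alternate pixels of two palette entries and let the display (or the eye) do the mixing.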
Scaling, Hinting and Smoothing
A problem with fixed-size raster fonts was that they looked terrible when printed, since printing was basically a dump of blocky pixels onto paper. Furthermore, the fonts didn't scale - if you wanted a font size between the ones supplied, the program either wouldn't let you select it or completely mangled a smaller or bigger font size into a terrible mess of stretched and squished pixels.
Enter scalable vector fonts - sets of mathematically defined curves and lines that can scale to any size and look tremendously good when printed. Except they weren't always as hot on screen, because computer displays were still fairly limited, and 640x480 pixels (or less) doesn't translate these curves as well as paper does in even the cheapest of inkjet printers. Especially in smaller sizes, typically the ones used to write letters and essays, legibility suffered immensely.
Better rendering of scalable typefaces was solved by hinting - basically, a self-contained set of rules for each vector typeface on how it should be translated to a raster font. This mostly meant that a handful of fonts were meticulously hinted to look good and retain high legibility both on screen and in print. A prime example is Arial, licensed to Microsoft in 1990. This and Microsoft's Times New Roman - also with exceptional hinting - made up the bulk of scalable fonts most casual computer users came in contact with for the better part of a decade.
A poorly hinted vector font compared to a carefully designed bitmap font.
With the advent of the web, Microsoft released their core fonts for the web, which brought the world modern classics such as Verdana, Trebuchet MS and meme font par excellence, Impact. Like Arial before them, these fonts were carefully designed to be highly legible on screen, even in small sizes, and, like Arial, featured hinting so carefully designed they may as well have been raster fonts to begin with.
This was also the time when computers had become powerful enough to apply anti-aliasing to text on the fly, something previously only done algorithmically - and rather slowly - in expensive video titling and paint programs.
Initially, this automatic anti-aliasing was only done for fonts rendered above a certain size. While anti-aliasing carefully made by hand in a bitmap paint program can look good on fonts as small as six or seven pixels high, the algorithms used to perform on-the-fly rendering weren't practically applicable to most smaller fonts: the text sizes typically used by web pages at the time risked becoming too blurry and thus illegible.
The market then gradually shifted away from CRT monitors in favor of LCD flat panel screens. Microsoft introduced ClearType, a technique utilizing the physical nature of LCD panels to provide subpixel smoothing, allowing maximum fidelity with retained legibility even when smoothing the edges of smaller font sizes - or so it would seem.
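The subpixel idea can be sketched as a toy model - assuming a panel with vertical R, G, B stripes in that order, which is an assumption about the hardware, and ignoring the filtering that real ClearType applies to tame fringing:

```python
# A toy model of subpixel smoothing, assuming an LCD panel whose
# pixels are vertical R, G, B stripes in that order (the real
# ClearType pipeline also filters the result to reduce fringing).
# Input: glyph coverage sampled at 3x horizontal resolution,
# 0.0 = background (white), 1.0 = fully covered by the glyph (black).
def subpixel_row(coverage):
    """Map 3x-resolution coverage samples to (r, g, b) pixels, 0-255."""
    pixels = []
    for i in range(0, len(coverage) - len(coverage) % 3, 3):
        r, g, b = (round(255 * (1 - c)) for c in coverage[i:i + 3])
        pixels.append((r, g, b))
    return pixels

# A glyph edge ending two thirds of the way into a pixel doesn't
# become a uniform grey - it becomes a tinted pixel. That tint is
# exactly the colour fringing discussed below.
print(subpixel_row([1.0, 1.0, 0.0]))
```

Driving each colour channel separately triples the apparent horizontal resolution, but only because the eye is expected to notice the brightness step more than the colour cast.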
Present Day Dilemmas
The rationale behind ClearType and subpixel smoothing in general is good: cleverly design software to utilize an artefact of the hardware in order to produce a better user experience. As explained on the Wikipedia page, subpixel smoothing is based on the fact that sharp changes in brightness are more noticeable than subtle changes in colour value.
Except to me it isn't, anymore.
When ClearType was first launched, I was a big fan, and I used it for several years, up until 2019 or so, when I started running Linux again after a long hiatus. I started out on a Raspberry Pi and swiftly installed xterm, because I like it, and - just for a laugh - the old X11 bitmap fonts. It turned out that I soon preferred those bitmap fonts in the terminal, and when I switched to Linux on my main PC as well, something seemed off with the smoothed TrueType fonts in the default terminal setup. I couldn't quite focus on the text - my eyes just glossed over it, unable to find footing. I soon switched to bitmap fonts, but kept using them in the terminal only. When I bought a 14" laptop to replace my 15" one, I started getting some minor joint pains and decided to hook the laptop up to a 24" external screen.
I now immediately noticed that I couldn't really focus on the text in my browser anymore and, furthermore, the colour fringing caused by FreeType's subpixel smoothing was extremely noticeable, to the point that some letters looked blue and some looked red - sometimes right next to each other. Once I had seen this, I couldn't unsee it, and it spilled over to the ClearType rendering of fonts on my Windows computer at work.
I don't have a good explanation for why I suddenly started noticing subpixel fringing, but I think several factors may be at play:
- Age - I'm getting older and my vision is changing.
- Screen Quality - Early LCD flatscreens often had a certain paleness about them - poor contrast and washed out colours, perfect for displaying text smoothed with ClearType. Modern screens have better contrast and brightness and are geared more towards entertainment (gaming and movies), and must thus be able to represent both vivid colours and subtle changes in hue.
- Pixels Per Inch - Early consumer flatscreens usually had 17" 5:4 panels with a 1280x1024 pixel resolution, giving a higher PPI count than common consumer models today, which tend to feature at least a 24" panel but a comparably low 1920x1080 pixel resolution. Early 16:9 screens were often 1600x900 on a 19" panel, giving them the same 96 PPI density as the aforementioned 17" screen.
- Subtlety - Traditional algorithmic anti-aliasing used to be much more subtle than it is today.
Magnified screenshots of Ubuntu version 7 and BeOS version 5, showcasing subtle algorithmic anti-aliasing. The lightness of BeOS' algorithm allows anti-aliasing of very small font sizes without affecting legibility. Notice how straight glyphs, such as lowercase L, are completely unaffected by both algorithms.
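The PPI figures in the list above check out with a little Pythagoras - pixels along the diagonal divided by the diagonal size in inches:

```python
# Quick check of the PPI figures: the pixel count along the
# diagonal (via Pythagoras) divided by the diagonal in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 1024, 17)))  # 17" 5:4 panel: ~96 PPI
print(round(ppi(1920, 1080, 24)))  # 24" 16:9 panel: ~92 PPI
print(round(ppi(1600, 900, 19)))   # 19" 16:9 panel: ~97 PPI
```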
ClearType and BlurType
With the advent of subpixel smoothing, anti-aliased text became the norm. Systems not using subpixel smoothing - even Windows with ClearType disabled - now typically smooth glyph edges at all font sizes using very aggressive anti-aliasing, producing text that looks as if it's been run through a Gaussian blur filter rather than something intended for a human being to read.
Above is a comparison of normal (non-subpixel) anti-aliasing of text and the same text rendered without any smoothing at all. The font is Arial, which has superb hinting. Notice how even perfectly straight glyphs, such as lowercase L, are mangled into a thick, blurry mess by the anti-aliasing. Click the image to see it in full resolution.
No Config, Again
None of this should be much of a problem, really. On my home computer I run Linux and a very eclectic selection of applications, which means I've simply turned all font smoothing off completely and only use bitmap fonts or fonts with excellent hinting. Firefox still lets me override CSS fonts with my own selection, which means I can read all web pages in a crisp, extremely legible Arial.
But there are several caveats and more clouds forming on the horizon.
One is the disappearance of basic configuration options, both in operating systems and individual applications. ClearType can still be disabled in Windows, which then falls back to the blurfest anti-aliasing mentioned above, so I have to use third-party software to bypass this - which only solves part of the problem. The next hurdle is that I can no longer select the system font in Windows - a setting that had been available for decades has suddenly been removed - and must resort to registry hacks to replace the decently hinted Segoe UI with the expertly hinted Arial.
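For reference, the registry hack in question usually boils down to a fragment like the following: blanking the built-in font's registration and adding a substitution entry. This is a commonly circulated tweak, not an officially supported setting, so treat the exact key names and values as an at-your-own-risk assumption:

```
Windows Registry Editor Version 5.00

; Unregister the bundled Segoe UI faces...
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts]
"Segoe UI (TrueType)"=""
"Segoe UI Bold (TrueType)"=""
"Segoe UI Italic (TrueType)"=""

; ...and substitute another typeface in its place.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\FontSubstitutes]
"Segoe UI"="Arial"
```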
With Microsoft's ClearType tuner and a decent-PPI screen, such as a 27" panel with a 2560x1440 native resolution, I can achieve acceptable results where fringing isn't as noticeable, and VS Code lets me use non-smoothed fonts such as the eminent TTF version of Terminus when writing code.
However, neither applications nor Windows itself is consistent in how and where ClearType and traditional anti-aliasing are applied. As seen in the screenshot below, several applications, including Windows' own settings app, don't use ClearType, rendering the tuning effectively pointless. Combined with various programs where it's downright impossible to change the UI font in any way, disabling all anti-aliasing produces extremely poor font rendering. Hinting, it seems, is a lost art: even fonts explicitly designed for writing code on screen, such as Go Mono, suffer from terrible (or completely missing) hinting information, making them unbearably ugly and almost completely illegible without some kind of font smoothing enabled.
ClearType and traditional anti-aliasing mixed in two different Windows applications, as seen side by side on the same machine at the same time. Click the image to see it in full resolution. Notice how two identical glyphs right next to each other, such as "ff" in "turn off", are anti-aliased completely differently.
You'd think that an application designed for text chatting, such as Discord, would let you change the font you read all the messages in to one of your liking, but no. With font smoothing disabled, it looks as below.
Hinted typefaces and font configurability are apparently, according to Discord, not core features of a text chat client.
Pipe Dreams of a Better Situation
One solution could of course be to get a 4K screen, but that's not a viable option for everyone - cost, desk space, application and OS compatibility, or an employer that won't (or can't, depending on semiconductor availability) upgrade are all very real factors at play here. The solution to the legibility problem is, in fact, very simple: control of typeface rendering in software should return to the levels of quality we saw in the late 1990s and early 2000s.
This is also, sadly, why my hopes of this happening any time soon aren't very high.
Microsoft are continually locking Windows down, removing configurability and changing settings on a scale that makes even Mozilla green with envy. Linux software and distros are gradually removing support for bitmap fonts and gearing up to switch to Wayland, which is a piece of Modern Software with Modern Sensibilities, of which disabling font smoothing probably isn't one in the long run.
Whatever the future holds, I'm sure I'll be reading about it while squinting.