It seems like I've been waiting all my computing life for VDUs to exceed 200 DPI. Well, that's an exaggeration. I've been waiting since I was first exposed to system-wide vector-based type rendering, in the late 1980s. So I'm understandably excited about Apple's new "retina" MacBook Pro, with its display of ~220 DPI.
Why care so much about DPI? It's all about the text, in particular the inherent difficulty of scaling non-rectilinear strokes cleanly. Text is the fundamental component of everything I do with computers. It always has been, and it seems likely that it long will be. As a floppy-haired, slack-wristed aesthete, I really care that the text I will be staring into for hours is clear and beautiful.
The LCD screens used for most modern displays are constructed from a mesh of tiny discrete transparent shutters, working in combination to make up pixels: the smallest visual elements that can be addressed on a bitmap display. Typically these pixels are nearly square, arranged in a 2D matrix of perhaps a few million elements. That may sound like a lot, but it's coarse enough to introduce perceptible distortion into lines that are not perfectly rectilinear.
One of my favourite things about Mac OS X, and its upstart little brother, iOS, is the respect their type-rendering software pays to letterform. Typefaces render very faithfully, regardless of scale, and pains are taken to smooth out the curves, using anti-aliasing techniques that detect the staircasing edges of lines and soften them into their background with gradual shading. This works very well, but it's not unnoticeable; there's a soft-focus effect that gives a fringey halo to certain text shapes, though you become inured to it over time. Other GUI systems tend to adjust the letterform to make the text align better with the pixel grid, and it's common for people who aren't habituated to the Mac to comment on the degree of blur.
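The softening idea can be sketched in a few lines of Python. This is a toy illustration of coverage-based supersampling in general, an assumption about the family of technique involved, not a claim about how Apple's text renderer actually works: each output pixel is shaded by the fraction of sub-samples that fall inside the shape, so a hard on/off staircase edge becomes a gradual ramp of grey.

```python
def coverage(shape, x, y, samples=4):
    """Fraction of an n-by-n sub-sample grid inside `shape` for pixel (x, y)."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            sx = x + (i + 0.5) / samples  # sub-sample position within the pixel
            sy = y + (j + 0.5) / samples
            if shape(sx, sy):
                hits += 1
    return hits / (samples * samples)

# A hypothetical shape: everything below the diagonal line y = x.
below_diagonal = lambda sx, sy: sy > sx

# Render an 8x8 grid of grey levels (0.0 = background, 1.0 = solid ink).
# Pixels the diagonal crosses get intermediate shades rather than a hard edge.
grid = [[coverage(below_diagonal, x, y) for x in range(8)] for y in range(8)]
```

Pixels fully inside the shape come out at 1.0, pixels fully outside at 0.0, and the pixels along the edge land somewhere in between, which is exactly the "gradual shading" that blurs the staircase.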
Things are much better than they used to be. Way back in the day, when outline curve rendering was just too computationally expensive to be routine, everything on-screen was painted as a copy of a pre-drawn bitmap, and blocky graphics were everywhere, particularly once scaling and translation were applied. We peered at them on our tiny goldfish-bowl CRT monitors. Outline font rendering was a specialist feature of certain software packages or dedicated computer systems, perhaps not even rendered online. The fanciest workstation computers had gigantic 20" CRTs, and vector graphics engines like Display PostScript. It seemed reasonable then to expect the exponential improvements in technology to scale this up to at least print-quality DPI, and the costs to come down.
The costs did come down, and the computers continued their frantic pace of improvement, but something appeared to lock mainstream display rendering at somewhere around 100 DPI for over a decade. I think it was a combination of factors.
There was the move away from bulky beam-scanning phosphor-dot CRT monitors, which are theoretically capable of precise drawing at a perfectly graduated range of resolutions, over to the more space- and power-efficient LCD displays, with the aforementioned discrete physical pixel elements. Fifteen years ago I had a 19" ADI multisync CRT monitor, and the effective resolution of my computer display crept up as I upgraded my graphics card, with the monitor keeping pace. For the last ten years, I've been using a nice 23" HP widescreen LCD, and my desktop resolution has been locked at the 1920×1200 that corresponds to the mechanical pixel array of my screen.
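The DPI figures scattered through this piece all fall out of one bit of arithmetic: pixel density is the diagonal pixel count divided by the panel's diagonal in inches. A quick sketch (panel sizes are nominal diagonals, so results are approximate):

```python
import math

def dpi(width_px, height_px, diagonal_inches):
    """Pixel density: diagonal length in pixels divided by diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(dpi(1920, 1200, 23)))   # my 23" HP LCD: ~98 DPI
print(round(dpi(960, 640, 3.5)))    # iPhone 4: ~330 DPI (Apple quotes 326)
print(round(dpi(1440, 900, 13.3)))  # 13" MacBook Air: ~128 DPI
```

Which is why "around 100 DPI" described mainstream desktop displays for so long, and why the Retina panels are such a step change.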
LCD screen manufacturing is closely tied to flatscreen television production, where the standard vertical resolution has settled on 1080 pixels, marketed as 'High Definition'. That's actually pretty low definition, if you stop to think that cheap desktop computers were routinely rendering higher resolutions years before its roll-out.
The system software used on desktop computers made optimisations and took short-cuts based on the average dot pitch, using fixed bitmaps for painting GUI elements, and making assumptions about the proportions and spacing of on-screen elements that became entrenched and subsequently proved remarkably hard to shift.
The turning point seems to have come with the iPhone 4 and its "Retina" display, with a DPI count of 326 – close to that of low-grade print – on its highly saturated backlit LCD screen. Text looks fantastic on this generation of iPhone, still to me the nicest display of this type I've seen. This was followed up by the slightly coarser (264 DPI) Retina iPad model a couple of years later, and, as of last week, the still slightly astonishing Retina MacBook Pro. Seems like the high-DPI era I've been waiting for is here!
And yet I'm not going to buy a Retina MacBook Pro. I did give it some excited thought. I rushed right out to Apple Covent Garden after the announcement, fondled one for a little bit, and decided it's not really for me. Experience has taught me to steer wide of a first-iteration Mac platform, especially one where Apple seems to be pushing the hardware design into some advanced new shape. There's often early adopter trouble. A couple of early warning signals jumped out at me from the start. Pushing that many pixels around is really going to need some grunt. I have my suspicions about cooling; why the big air vents down the side, why devote five minutes of the keynote to describing a cunning new fan design? It's a Mac, I want no fans. Steve always wanted No Fans. It's too big and heavy for me, and yes of course, it's really expensive.
I ordered a new generation 13" MacBook Air. It will replace my current laptop, a last generation 13" MacBook Air. Which replaced my previous laptop, a 13" MacBook Air from the year before. Seems I have a MacBook Air habit.
The wedge-shaped MacBook Air is iterating rapidly to converge upon my ideal computer. Light enough to move around without becoming a burden. A full-scale keyboard that I enjoy typing upon, as an emacs-wedded touch typist prone to RSI. Enough pixels on the screen to productively juggle the magical three-window pattern I tend to adopt for work (an editing window, a reference window, and a command shell). Enough power that I don't need to worry about where my next charge point is. And the 13" display has fairly small pixels (~128 DPI). Smaller text isn't as legible as I'd like, mind you, and some of the GUI elements are a bit small. It would be nice to have more CPU cores. Like I say, iterating rapidly…
200+ DPI displays are clearly here to stay. Where Apple plant their flag, all the OEM PC hardware makers inevitably follow. Microsoft Windows, which increasingly looks to me like it's playing catch-up, seems from the outside to be more completely resolution-independent than either of Apple's operating systems at this point in time, so that shouldn't hold up broader deployment any more. Production will simplify. Costs will fall with scale.
I had been planning on buying a nice external display, probably an Apple Thunderbolt, because they make lovely docking stations for Thunderbolt-equipped laptops, but that's a foolish idea now. It seems sensible to bet that there will be a high-DPI equivalent along within a couple of years, and monitors are a long term investment. I can wait.
We seem to be at something of a transitional phase for the personal computer at the moment. It seems likely that the future of the Mac is some kind of convergence point between the iPad, the Retina MacBook Pro and the MacBook Air, but I can't quite figure out what shape that thing will take. I am typing this final sentence on my box-fresh, just powered up, 2012 MacBook Air, with its new Mac smell, and its LCD screen cleaner than I will ever be able to polish it; already I am day-dreaming about its replacement.