Artemy Lebedev

§ 70. Screen resolution. And a little about the origin of 72 pixels per inch

June 24, 2001




Now, what do you think: does a computer know the current resolution of its screen (in pixels per inch)? It has no idea. Neither do the monitor and the graphics card, so there is no use asking them.



If the computer calculated the resolution in pixels per inch from the height, width, and diagonal of the screen, the resulting value would still be approximate. On any CRT monitor the picture can be stretched and shrunk with the monitor's own controls, so its actual size may differ from what the graphics software assumes. A reliable answer lies somewhere in the indefinite future, and only for digital displays, where the panel has a fixed physical size.



If you wonder what your screen resolution is, use the calculator:




[Interactive calculator: screen resolution in pixels (width × height) and monitor diagonal in inches → approximate resolution in pixels per inch]

A note for those with a CRT monitor: for the best result, find a ruler marked in inches and hold it diagonally against the visible screen area. The thing is, the declared diagonal of a CRT is an inch or two longer than the picture it actually shows (on examination, 21″ displays easily turn out to have only 19″ of usable screen in them).
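The calculator above just applies the Pythagorean theorem: the diagonal in pixels divided by the measured diagonal in inches gives the resolution. A minimal sketch of the same computation (the 13.3-inch diagonal below is an assumed measurement for a typical "14-inch" monitor, not a figure from this article):

```python
import math

def screen_ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Approximate pixels per inch from the pixel dimensions of the picture
    and the measured diagonal of the visible screen area."""
    diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diagonal_px / diagonal_in

# 1024x768 on a screen whose visible diagonal measures about 13.3 inches
# lands close to the 96 ppi that Windows assumes.
print(round(screen_ppi(1024, 768, 13.3)))  # 96
```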



Let us backtrack to the mysterious figures mentioned in the previous section. Where do the 72 and 96 pixels per inch come from? Macintosh displays were traditionally supplied with a preset resolution that could not be changed: Apple's original assumption was that WYSIWYG required a resolution of 72 pixels per inch, so changing it was prohibited. In the PC world, on the contrary, monitors traditionally had adjustable resolution supported at the software level. Then Microsoft decided that 1024×768 might fit nicely on a 14-inch screen. The resolution of 96 pixels per inch (a third higher than the Macintosh's) was picked by trial and error.



By the mid-1990s Apple had evolved far enough to realize that the industry had been making big strides, and that 72 pixels per linear inch were no longer enough: monitors cost money, and the number of consumers willing to fork out a couple of thousand for a 21″ Mac display that performs on a par with a 17″ PC display had dwindled to a trickle.



Owing to these scientifically unsound and low-tech musings by the giants of computer building, the user wound up in a ridiculous situation. Since the computer knows nothing about screen resolution, the operating system literally has to guess. Mac OS assumes the resolution is 72 pixels per inch, while Windows assumes an inch holds 96 pixels.



You might be thinking, "What the hell!", but you'd better not. That very assumption of resolution embedded in the system (its value can be changed in Windows, though an ordinary user would never get to it in a lifetime) directly determines the size of screen fonts. A 72-point font (nearly one inch) will spread over 72 pixels on a Mac and over 96 pixels on a PC (a third more pixels than fit in an inch on the Mac).
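The arithmetic behind this is simple: a typographic point is 1/72 of an inch, so the pixel height of a font depends entirely on the resolution the system assumes. A minimal sketch:

```python
def points_to_pixels(points: float, assumed_ppi: float) -> float:
    """Convert a font size in typographic points (1 pt = 1/72 inch)
    to pixels, given the resolution the operating system assumes."""
    return points * assumed_ppi / 72

# The same 12-point text comes out at different pixel sizes:
print(points_to_pixels(12, 72))  # 12.0 pixels under Mac OS (72 ppi assumed)
print(points_to_pixels(12, 96))  # 16.0 pixels under Windows (96 ppi assumed)
```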




12-point text on a Macintosh display


12-point text on a PC display



But these days we all have the same monitors and the same resolutions. Small wonder PC users can't see what's so good about 12-point text: on their screens it is 16 pixels high. What does this lead to? The near-ubiquitous <font size=-1> on websites. Meanwhile, Macintosh users rack their brains over how text four pixels high can possibly be readable.



Images, on the other hand, are displayed the same everywhere, because their size is specified in pixels. But this won't last long. To learn why resolution is bound to play a bigger part in the future, and why pixel graphics are doomed to die, see the next section.






