Artemy Lebedev
§ 81. The life and extraordinary adventures of a typographical point
January 22, 2002

What should one know about the point? The word stems from the Latin punctum. The point is a unit of the typographic measurement system—typometry. Before the typographic point was invented, font sizes were differentiated by their names. Say, “cicero” (12 points) was so named because Cicero’s works were first printed in 1467 in this size.
The term typometry stems from the name of typometer—a device (originally named “prototype”) invented by Fournier 

The idea of font-size standardization dates back to the 17th century, but the first easy-to-handle typographic point was proposed only in 1737 by the French printer and typefounder Pierre Simon Fournier. According to his system, every font size was equal to a certain number of points: nonpareil—6, petit—8, etc.

In the France of that period the common linear unit was the toise, equal to 6 Royal feet (pied de roi). The foot was equal to 12 inches; the inch was divided into 12 lines; the line—into 12 metric points. The length of two of these metric points was adopted as one typographic point in a booklet published by Fournier. However, as we’ve seen plenty of times before, everything in human history has consistently been done in a shitty, screwed-up way. For some reason, by the time Fournier’s two-volume manual was published in 1764, the newly born typographic point proved shorter than the point of 1737, thus becoming an arbitrary unit.

Housekeeping tip 
Any measurement system must have a standard. Say, the acre is the area of ground tillable by two oxen in one day. Zero Fahrenheit is the freezing point of a saturated salt solution (or of water with sal ammoniac, according to other sources). The Russian arshin was set equal to 28 English inches by a decree of the Russian tsar Peter the Great. The English inch is equal to the width of a man’s thumb or, more precisely, to three barley grains laid end to end. In today’s world all non-metric units (all those yards and furlongs, ounces and bushels) are defined in metric terms. Until 1964 the meter was defined in the USA as 39.37 inches. But now the English inch is defined as being exactly equal to 2.54 cm. The French inch—2.706 cm. Unfortunately, there’s no way of telling today how tall Thumbelina actually was.
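A quick way to see why the 1964 redefinition mattered: under the old American rule (1 m = 39.37 in), the inch derived from the meter is not exactly 2.54 cm. A small sketch (variable names are mine; the figures come from the tip above):

```python
# Old US definition: the meter is fixed at 39.37 inches, the inch is derived.
old_us_inch_cm = 100 / 39.37   # 2.540005... cm

# Modern definition: the inch is fixed directly.
exact_inch_cm = 2.54

print(old_us_inch_cm)          # a hair over 2.54
```

The discrepancy is about two millionths of a centimeter per inch — invisible to a typesetter, but enough to make metrologists redefine the unit.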

Nevertheless, due to its simplicity, Fournier’s “spoilt” system became a standard. Later (in 1783) the system’s inaccuracies were corrected by another Frenchman—François-Ambroise Didot, a well-known printer. More specifically, he set the typographic point equal not to an arbitrary value but to a mathematically precise one—1/6 of a line, taking the same French Royal foot (324.84 mm) as the basis. Simply put, he returned to the value originally proposed by Fournier in 1737.
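Didot’s definition is easy to recompute from the chain of units described above (foot of 324.84 mm, 12 inches to the foot, 12 lines to the inch, point = 1/6 of a line — all figures from the text; the code is just an illustration):

```python
# Walk down the old French unit chain to Didot's point.
royal_foot_mm = 324.84
inch_mm = royal_foot_mm / 12   # the French inch, 27.07 mm
line_mm = inch_mm / 12         # the line, ~2.2558 mm
point_mm = line_mm / 6         # Didot's point: 1/6 of a line

print(round(point_mm, 4))      # ~0.376 mm
```

The result lands on the familiar 0.376 mm, the same value Berthold would later fix in metric terms.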

But as early as 1795 the French introduced the metric system, turfing out all Royal feet along with the “standard” basis of the typographic point.

In 1879 the USA began widely using a point devised by Nelson C. Hawks, who claimed authorship of the idea of the point itself. Hawks discovered that the cicero (a 12-point size) was equal to 1/6 of an inch (and we know that’s the way it should be). And he persuaded his boss, who owned one of the largest type foundries and was busy restoring his business after a great fire, to adopt the new system.

Things went as far as adopting the Hawks point as a national standard. But the funny thing is that at that time the Association of Typefounders of the United States tied all measurements to the metric system: it declared that 83 ciceros were equal to 35 centimeters (making the point equal to 0.3514 mm)—so much did they want to conform to the metric system.
The United States introduced the Hawks point as a national standard in 1886, with Great Britain following suit several years later 

The same year (1879) the German Hermann Berthold translated the Didot point into the metric system: one meter held 2,660 points. To this day Germany, Eastern Europe and Russia use this point, defined in metric terms as 0.3759 mm and rounded off to 0.376 mm.
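Both metric tie-ins are one-liners to verify (the 83-cicero and 2,660-points-per-meter figures are from the text; variable names are mine):

```python
# The American typefounders' compromise: 83 ciceros = 35 cm, 12 points per cicero.
hawks_point_mm = 350 / (83 * 12)     # ~0.3514 mm

# Berthold's definition: 2,660 Didot points to the meter.
berthold_point_mm = 1000 / 2660      # ~0.3759 mm

print(round(hawks_point_mm, 4), round(berthold_point_mm, 4))
```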

And here’s what we get in the end (eat your hearts out, dear designers):

American (Hawks) point—0.3514 mm
Didot (Berthold) point—0.376 mm
PostScript point (1/72 inch)—0.3528 mm

It is a historical fact that none of these points is precisely equal to 1/72 of an inch. Many people tried to make the point fit, but no one succeeded. Then in the 1980s Adobe devised the PostScript language, in which the point was defined as exactly one seventy-second of the English inch: 0.013(8)…
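The PostScript definition finally makes the arithmetic clean (the 1/72-inch definition is from the text; the millimeter value follows from the exact 2.54 cm inch mentioned earlier):

```python
# The PostScript point: exactly 1/72 of the English inch.
ps_point_in = 1 / 72        # 0.013888... — the 0.013(8) of the text
ps_point_mm = 25.4 / 72     # ~0.3528 mm

print(ps_point_in, round(ps_point_mm, 4))
```

Note the repeating decimal: in inches the PostScript point never terminates, which is exactly what the 0.013(8) notation records.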

From that time on, anyone can set type on a computer with mind-bending precision.
