Claude Shannon’s landmark 1948 paper was like a lens: it moved “information” from the background to the foreground of history. By popularizing both the term and the concept, he made it possible to see that the genes of a single-celled organism, the beats of African drummers, and a one-bit message manifested as one or two lanterns hung in the steeple of the Old North Church were all instances of the same phenomenon.

Shannon’s insights revealed the linkages among what became known as the information sciences: mathematics, communication, electrical engineering, computer science, psychology, and physics. These fields provide leverage on each other. Viewing genes as information means that a computer scientist can offer biologists insights, such as the error-correcting mechanisms of RNA. In turn, a biologist can offer communication theorists the concept of a meme. Advances in one field can also be counterproductive in others, though, as when computer science abstracted away the physical aspects of computation. (Quantum information theory has begun to restore the truism that information is inevitably physical.)

Gleick’s treatment of the history of information is enjoyable throughout, and full of interesting bits of trivia. For example, as with so many things, the development of Morse code was less straightforward than the story told in hindsight. It started as a mapping from numbers to words. Morse’s great insight was to use shorter dot-dash combinations for the more frequent letters. To determine which letters were used most often, he examined their distribution in the type cases at a newspaper: an early use of statistical data, and one means of communication learning from another.
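The principle is easy to demonstrate. Here is a minimal sketch (my illustration, not Gleick’s): tally letter frequencies in a sample text, standing in for the printer’s type cases Morse consulted, and compare them with the lengths of the International Morse codes that eventually emerged. The file name `corpus.txt` is an assumption; any English text will do.

```python
# A sketch of the idea behind Morse's design: frequent letters should get
# short codes. Compare observed letter frequencies with code lengths.
from collections import Counter

MORSE = {  # a subset of International Morse code
    "E": ".", "T": "-", "A": ".-", "I": "..", "N": "-.", "O": "---",
    "S": "...", "H": "....", "R": ".-.", "D": "-..",
    "Q": "--.-", "Z": "--..", "J": ".---", "X": "-..-",
}

sample = open("corpus.txt").read().upper()  # assumed: any English text file
freq = Counter(ch for ch in sample if ch in MORSE)

# E earns the lone dot, while rarities like Q and J carry four symbols each.
for letter, count in freq.most_common():
    code = MORSE[letter]
    print(f"{letter}: {count:6d} occurrences, code {code!r} (length {len(code)})")
```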

The connection between technologies of communication and transportation is apparent throughout the book. After all, for most of history they amounted to the same thing. The speed of trains made differences in local time apparent, eventually forcing the adoption of standard time zones. Keeping track of locomotive schedules and preventing collisions demanded a faster means of communication, so telegraph lines sprang up alongside train tracks. Later, airplanes flew faster than ballistics calculations could be performed by hand; tracking them required solving second-order differential equations and filtering noise out of radar data, resulting in another close linkage between transportation and communication.
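The wartime mathematics (Norbert Wiener’s statistical prediction of flight paths) is beyond the scope of a review, but a toy version of the filtering problem is easy to sketch: smooth noisy position measurements before extrapolating a trajectory. The exponential moving average and the constants below are my own illustration, not the historical method.

```python
# A toy radar-filtering problem: smooth noisy position measurements
# with an exponential moving average before extrapolating a trajectory.
import random

true_path = [2.0 * t for t in range(20)]                  # target at constant speed
measured = [p + random.gauss(0, 3.0) for p in true_path]  # noisy radar returns

alpha, smoothed = 0.3, []
for m in measured:
    prev = smoothed[-1] if smoothed else m
    smoothed.append(alpha * m + (1 - alpha) * prev)       # weight new data at 30%

print([round(s, 1) for s in smoothed])
```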

The economics of transportation may apply to communication as well. When transportation is expensive, value-added activities tend to be performed before goods are shipped. In the 18th century, farmers turned their corn into whiskey to make shipment cheaper and more lucrative. Economists today know this as the Alchian-Allen theorem, discussed in more detail here and here.
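A numeric sketch of the theorem (the prices are invented): adding the same fixed per-unit shipping cost to a high-quality and a low-quality good lowers the relative price of the better good, so distant buyers shift toward quality.

```python
# The Alchian-Allen effect in two lines of arithmetic (invented prices):
# a fixed per-unit shipping charge makes the high-quality good relatively
# cheaper, so it pays to "ship the good apples out."
def relative_price(high: float, low: float, shipping: float = 0.0) -> float:
    return (high + shipping) / (low + shipping)

print(relative_price(4.00, 2.00))                 # at the farm gate: 2.0
print(relative_price(4.00, 2.00, shipping=2.00))  # after shipping: 1.5
```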

When storage and transportation are cheap, things of speculative value tend to be stored: witness Silicon Valley startups hoarding data about their users, and the predominance of server-side computation. Apple is already starting to question the latter by doing computation on the device itself. Wikipedia similarly challenges the notion of information scarcity, as embodied in its reminder that “wiki is not paper.”

Living in an age of abundant data, we find it hard to fathom that for most of human history a table of numbers would have been meaningless. Data begets analytical methods, which beget insight. The cycle continues.

There are several other insights offered that I will not explore fully here. These include:

  • Redundancy as a means of ensuring fidelity of information in a noisy channel; see the sketch after this list. Think of Homer’s “wine-dark sea” in its original oral transmission, or pilots’ use of “bravo” and “victor” (vs. “bee” and “vee”).
  • An introduction to Charles Babbage’s early entries into mechanical computing, and the deep connection between computational methods and time (ch. 4).
  • Many instances of simultaneous invention: Newton and Leibniz, Boole and De Morgan, Elisha Gray and Alexander Graham Bell.
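On the first bullet, here is a minimal sketch of redundancy at work (my example, not Gleick’s): a threefold repetition code with majority-vote decoding, the same logic as saying “bravo” instead of “bee” so that one garbled syllable cannot flip the message.

```python
# A 3x repetition code with majority-vote decoding: each bit is sent three
# times, so a single flipped copy is outvoted by the other two.
import random

def encode(bits, r=3):
    return [b for b in bits for _ in range(r)]

def noisy_channel(bits, flip_prob=0.1):
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits, r=3):
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(encode(message))
print(decode(received) == message)  # usually True: noise must flip 2 of 3 copies
```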

Further reading: