That the earth circles around the sun was bad enough, but the real catastrophe confronting the meaning of existence and the existence of meaning was Darwin. Here we are by happenstance, it turns out, the sons and daughters of survivalism and sexuality. All of a sudden there was only one thing to set us apart from the roughly 8.7 million other species on this planet: our ability to process, document, and share information.

But what is information?

Only now do I appreciate the exceeding difficulty of answering what seems like such a straightforward question. You don’t know it when you see it; you don’t see it at all. Even my usually authoritative-sounding Oxford English Dictionary comes up pathetically short: “what is conveyed or represented by a particular arrangement or sequence of things.”1

In interviews James Gleick has said that this is a book he has wanted to write for nearly his entire life. The sweeping bibliography and index justify the delay. A deep analysis of something as abstract as ‘information’ often leaves the reader staring at the ceiling, trying to make sense of the perpetually perplexing.2 Can information exist without communication? Is there such a thing as knowledge, or is some information merely more useful in particular contexts? Is genetic information really information — based on bits, communicated by computers? Can we eventually capture all of the genetic information of the entire biosphere on a single hard drive? And, if so, is that all we are, a humble collection of bits and algorithms? If those algorithms are predetermined, then is our destiny as well?

The book begins with a quote from Zadie Smith, and it is quite possibly the most unremarkable paragraph she has ever written, except that, when placed in this new context, it speaks directly to an inquietude that now overwhelms us all:

Anyway, those tickets, the old ones, they didn’t tell you where you were going, much less where you came from. He couldn’t remember seeing any dates on them, either, and there was certainly no mention of time. It was all different now, of course. All this information. Archie wondered why that was.

Given that information (sometimes referred to as ‘culture’, ‘facts’, ‘intelligence’, ‘data’, ‘meaning’, ‘dogma’, and ‘knowledge’) is what distinguishes humans from our primate predecessors, it is incredible that it had no technical definition at all until 1948, when Claude Shannon published “A Mathematical Theory of Communication,” giving rise not only to the theoretical fields of information theory and information science, but also to practical engineering approaches to storing information on magnetic tape and communicating it across networks.
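Shannon’s definition, for the record, fits in a single line: a source that emits symbols with probabilities $p_i$ carries, on average,

$$ H = -\sum_i p_i \log_2 p_i $$

bits of information per symbol. The less probable a symbol, the more information its arrival carries, a point that matters again below.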


As the subtitle discloses from the get-go, this is a book that treats its topic from three different angles. First we are given an overview of the history of information before we knew how to define it. Here there are African drums that talk, optical telegraph towers constructed across France at the height of the revolution, and Samuel Morse’s electric telegraph and famous code.

Next we delve into theory, specifically information theory and its various applications in communication, biology, physics, chemistry, and quantum mechanics. At times Gleick seems to exuberantly make the case for information theory as a Theory of Everything.3 This was most economically and mystically expressed in 1989 by John Wheeler: “It from bit.” Information is physical: it must be stored on tangible objects and must therefore obey the laws of physics. “To do anything requires energy. To specify what is done requires information.”

If all of this sounds increasingly abstract, it only becomes more so, until the theory eventually reaches an apex of abstraction: quantum computing, the fundamentals of which remain beyond my cerebral grasp. For me, the most startling and intriguing ramification of information theory is that information is a measure of improbability. 1010101010 contains less information than 1010100011, despite having the same number of digits, because the former can be expressed as “repeat 10 five times” whereas the latter is seemingly random. The significance here is that information, randomness, complexity, and computability are four different ways of expressing the same principle; namely, that ‘information’ refers to that which we cannot predict.
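That intuition is easy to poke at with a few lines of Python (my sketch, not Gleick’s): a general-purpose compressor like zlib exploits exactly the kind of regularity that “repeat 10 five times” describes, so compressed length makes a crude, computable stand-in for the uncomputable ideal of algorithmic complexity.

```python
import random
import zlib

# Compressed size as a rough proxy for information content:
# a regular string compresses well; a random-looking one barely does.
regular = "10" * 5_000                                  # "1010..." repeated
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(10_000))

for label, s in (("regular", regular), ("random", noisy)):
    n = len(zlib.compress(s.encode()))
    print(f"{label}: {len(s):,} chars -> {n:,} bytes compressed")
```

The regular string should collapse to a few dozen bytes, while the random one stays near its entropy floor of roughly 1,250 bytes (one bit per ‘0’/‘1’ character); no compressor can do better on average.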

And then I stare at the ceiling.

Accusations against information theory are mounting. It has overstepped its proper domain, some say, limiting our understanding of natural processes by forcing everything through the metaphor of bits. As early as 2000, Paul E. Griffiths called genetic information “a metaphor in search of a theory.” Gleick devotes an entire chapter to exploring the math behind claims that information theory is reductionist, beginning with Gödel’s notorious incompleteness theorems and ending with Laplace’s Demon. It is a frustrating chapter; just when the concept of information seems to make so much sense, it turns out not to.

And then I stare at the ceiling.

Finally, we enter the third section of the book: the flood. As information becomes exponentially cheaper to store, it becomes exponentially cheaper to “create.” Hence, as Google’s Eric Schmidt is eager to remind audiences, “every two days we now create as much information as we did up to 2003.” In his groundbreaking paper, Claude Shannon estimated the size of the greatest store of information then known to humanity, the Library of Congress, at around a terabyte. He was very close. Today CERN generates 1 petabyte of data per second — that’s 1,000 terabytes per second.

Gleick writes with compassion toward those of us who are overwhelmed by the flood. Citing a beautiful essay by David Foster Wallace, he reminds us that making decisions depends on eliminating options, and that we now have more information, and therefore more options to eliminate, than ever. He also cites psychological research showing that we often make worse decisions when confronted with more information, even when all of it is relevant.

But in his review of The Information for the New York Review of Books, Freeman Dyson is decidedly less sympathetic toward the info-overwhelmed. For Dyson, Wikipedia is a gift of the flood, as is 21st-century science.

Ultimately, we are reminded, information is a part of evolution, and it is now up to us to adapt or drown in the flood.


I have only one bone to pick with Gleick’s nearly masterful overview of information, and that is his silent transition from bits to meaning. On one page he is summarizing the difficulties of incorporating classical information theory into quantum computing; on the very next he describes the birth of Wikipedia.

Wikipedians like to claim that they are organizing all of the world’s knowledge, but ‘knowledge’ turns out to be even more difficult to define than information. We know how to measure economic capital, but intellectual capital is a guessing game. Einstein’s theory of relativity probably contains more ‘knowledge’ than the idea of making a sandwich with peanut butter, honey, and banana — but only in certain circumstances. In other words, “meaning” refers to relevance, and relevance is inherently subjective.

Gleick emphasizes from the very beginning — and throughout the book — that Shannon’s paper, and with it the birth of information theory, was made possible only by divorcing information from meaning. But he doesn’t even attempt to draw the fuzziest of lines between the two, though he does seem disposed to the human-centric view of Heinz von Foerster, who argued at an early cybernetics conference that it was fundamental to distinguish between the “beep beeps” of information theory and the “process of understanding,” the decoding, that takes place in the human brain. To put it another way, “beauty is in the eye of the beholder, and information is in the head of the receiver.”


1 ☞ Apparently I don’t have the latest version of the OED; in a blog post for the New York Review of Books, Gleick notes that the latest entry for ‘information’ now runs 9,400 words and prompted an essay-length meditation by OED managing editor Michael Proffitt. After all, Gleick reminds us, the OED is in the information business, like so many of us.

2 ☞ My intellectual insecurity was somewhat soothed when I saw that Cory Doctorow — voracious reader and perversely prolific writer — “stopped reading it a lot … stopped to stare into space and go ‘huh’ and ‘wow’ and ‘huh’ again.”

3 ☞ “Why does nature appear quantized?” Gleick rhetorically asks before answering himself: “Because information is quantized. The bit is the ultimate, unsplittable particle.”