15 Dec Information is what our world runs on (JAMES GLEICK)
Information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions.… If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.
“The information circle becomes the unit of life,” says Werner Loewenstein after thirty years spent studying intercellular communication. He reminds us that information means something deeper now: “It connotes a cosmic principle of organization and order, and it provides an exact measure of that.” The gene has its cultural analog, too: the meme. In cultural evolution, a meme is a replicator and propagator—an idea, a fashion, a chain letter, or a conspiracy theory. On a bad day, a meme is a virus.
Economics is recognizing itself as an information science, now that money itself is completing a developmental arc from matter to bits, stored in computer memory and magnetic strips, world finance coursing through the global nervous system. Even when money seemed to be material treasure, heavy in pockets and ships’ holds and bank vaults, it always was information. Coins and notes, shekels and cowries were all just short-lived technologies for tokenizing information about who owns what.
And atoms? Matter has its own coinage, and the hardest science of all, physics, seemed to have reached maturity. But physics, too, finds itself sideswiped by a new intellectual model. In the years after World War II, the heyday of the physicists, the great news of science appeared to be the splitting of the atom and the control of nuclear energy. Theorists focused their prestige and resources on the search for fundamental particles and the laws governing their interaction, the construction of giant accelerators and the discovery of quarks and gluons. From this exalted enterprise, the business of communications research could not have appeared further removed. At Bell Labs, Claude Shannon was not thinking about physics. Particle physicists did not need bits.
And then, all at once, they did. Increasingly, the physicists and the information theorists are one and the same. The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.” Information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.” This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.
A key to the enigma is a type of relationship that had no place in classical physics: the phenomenon known as entanglement. When particles or quantum systems are entangled, their properties remain correlated across vast distances and vast times. Light-years apart, they share something that is physical, yet not only physical. Spooky paradoxes arise, unresolvable until one understands how entanglement encodes information, measured in bits or their drolly named quantum counterpart, qubits.
When photons and electrons and other particles interact, what are they really doing? Exchanging bits, transmitting quantum states, processing information. The laws of physics are the algorithms. Every burning star, every silent nebula, every particle leaving its ghostly trace in a cloud chamber is an information processor. The universe computes its own destiny. How much does it compute? How fast? How big is its total information capacity, its memory space? What is the link between energy and information; what is the energy cost of flipping a bit? These are hard questions, but they are not as mystical or metaphorical as they sound. Physicists and quantum information theorists, a new breed, struggle with them together. They do the math and produce tentative answers. (“The bit count of the cosmos, however it is figured, is ten raised to a very large power,” according to Wheeler. According to Seth Lloyd: “No more than 10¹²⁰ ops on 10⁹⁰ bits.”) They look anew at the mysteries of thermodynamic entropy and at those notorious information swallowers, black holes. “Tomorrow,” Wheeler declares, “we will have learned to understand and express all of physics in the language of information.”
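The question about the energy cost of flipping a bit does have a tentative answer of the kind described: Landauer's principle, which puts a thermodynamic floor under the erasure of a single bit. The text does not name it, but a minimal sketch of the arithmetic, assuming the standard bound of k·T·ln 2 at room temperature, looks like this:

```python
import math

# Landauer's principle (the standard answer to "what is the energy cost
# of flipping a bit?"): erasing one bit must dissipate at least
# k_B * T * ln(2) of energy as heat.
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy, in joules, to erase one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the bound works out to a few zeptojoules
# per bit -- tiny, but strictly greater than zero.
energy_per_bit = landauer_limit(300.0)
```

At 300 K this gives roughly 2.9 × 10⁻²¹ joules per bit, many orders of magnitude below what real transistors dissipate, which is why the bound is a statement of principle about the link between energy and information rather than an engineering constraint.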
As the role of information grows beyond anyone’s reckoning, it grows to be too much. “TMI,” people now say. We have information fatigue, anxiety, and glut. We have met the Devil of Information Overload and his impish underlings, the computer virus, the busy signal, the dead link, and the PowerPoint presentation. All this, too, is due in its roundabout way to Shannon. Everything changed so quickly. John Robinson Pierce (the Bell Labs engineer who had come up with the word transistor) mused afterward: “It is hard to picture the world before Shannon as it seemed to those who lived in it. It is difficult to recover innocence, ignorance, and lack of understanding.”
Yet the past does come back into focus. In the beginning was the word, according to John. We are the species that named itself Homo sapiens, the one who knows—and then, after reflection, amended that to Homo sapiens sapiens. The greatest gift of Prometheus to humanity was not fire after all: “Numbers, too, chiefest of sciences, I invented for them, and the combining of letters, creative mother of the Muses’ arts, with which to hold all things in memory.” The alphabet was a founding technology of information. The telephone, the fax machine, the calculator, and, ultimately, the computer are only the latest innovations devised for saving, manipulating, and communicating knowledge.