Quantum mechanics is the theory that describes the workings of the microscopic world—by which I do not mean the tiny world only visible under a microscope, but rather the far, far tinier world of atoms and molecules and the subatomic particles (the electrons, protons, and neutrons) of which they are made up. Indeed, quantum mechanics is the most powerful, important, and fundamental mathematical set of ideas in the whole of science. It is remarkable for two seemingly contradictory reasons (almost a paradox in itself!). On the one hand, it is so fundamental to our understanding of the workings of our world that it lies at the very heart of most of the technological advances made in the past half-century. On the other hand, no one seems to know exactly what it means.

I must make it clear from the outset that the theory of quantum mechanics is not in itself weird or illogical; on the contrary, it is a beautifully accurate and logical construction that describes nature superbly well. Without it we would not be able to understand the basics of modern chemistry, or electronics, or materials science; we would not have invented the silicon chip or the laser; there would be no television sets, computers, microwaves, CD and DVD players, or mobile phones, let alone much else that we take for granted in our technological age.

Quantum mechanics predicts and explains the behavior of the very building blocks of matter with extraordinary accuracy. It has led us to a very precise and almost complete understanding of how the subatomic world behaves, and how the myriad of particles interact with each other and connect up to form the world we see around us, and of which we are of course a part. After all, we are ultimately just a collection of trillions of atoms obeying the rules of quantum mechanics and organized in a highly complex way.

These strange mathematical rules were discovered in the 1920s. They turn out to be very different from the rules that govern the more mundane everyday world we are familiar with: the world of objects we see around us. Near the end of the book I will explore just how strange some of these rules are when we consider the Paradox of Schrödinger’s Cat. For now, I wish to focus on one particularly strange feature of the quantum world, namely that an atom will behave differently if left to its own devices from how it behaves when it is being “observed”—by which I mean when it is being monitored in some way: poked or prodded, knocked or zapped. This feature of the quantum world is still not fully understood, partly because it is only now becoming clear what exactly constitutes “observation” in this sense. It is known as the “measurement problem,” and it remains an active area of scientific research today.

The quantum world is ruled by chance and probability. It is a place where nothing is as it seems. If left alone, a radioactive atom will emit a particle, but we are unable to predict when this will take place. All we can ever do is work out a number called the half-life. This is the time it takes for half of a large number of identical atoms to “decay” radioactively. The larger the number, the more accurate we can be about this half-life, but we can never predict in advance which atom in the sample will go next. It is very much like the statistics of tossing a coin. We know that if we toss a coin again and again, then half the time it will end up heads and the other half tails. The more times we toss it, the more accurate this statistical prediction will be. But we can never predict whether the very next toss of the coin will be heads or tails.
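The coin-toss analogy can be made concrete with a little simulation. Here is a minimal sketch (the function name, the per-step decay probability, and the random seed are all my own illustrative choices, not anything from the physics): each "atom" in a sample has the same fixed chance of decaying in any one time step, and we simply count how many steps pass before half the sample has gone. No individual decay is predictable, yet the half-life that emerges from the statistics is remarkably stable.

```python
import random

def measured_half_life(n_atoms, p_decay=0.01, seed=1):
    """Simulate a sample of identical radioactive 'atoms', each with a
    fixed probability p_decay of decaying in any one time step, and
    return the number of steps until half the sample has decayed.
    Which atom goes next is pure chance; only the statistics are lawful."""
    rng = random.Random(seed)
    survivors = n_atoms
    steps = 0
    while survivors > n_atoms // 2:
        # Each surviving atom independently 'rolls the dice' this step.
        survivors -= sum(rng.random() < p_decay for _ in range(survivors))
        steps += 1
    return steps

# For p_decay = 0.01 the theoretical half-life is ln(2)/0.01, roughly 69
# steps. The larger the sample, the more tightly the measured value
# clusters around that figure:
for n in (100, 10_000):
    print(n, "atoms -> half-life of about", measured_half_life(n), "steps")
```

Just as with the coin, running this with different seeds gives different answers for a small sample, but for a large one the measured half-life barely budges.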

The quantum world is probabilistic in nature not because quantum mechanics as a theory is incomplete or approximate, but rather because the atom itself does not “know” when this random event will take place. This is an example of what is called “indeterminism,” or unpredictability.

Misra and Sudarshan’s paper, which was published in the Journal of Mathematical Physics, describes the astonishing situation whereby a radioactive atom, if observed closely and continuously, will never decay! The idea can be summed up perfectly by the adage “the watched pot never boils,” which was first used, as far as I can tell, by the Victorian writer Elizabeth Gaskell in her 1848 novel Mary Barton—although it is the sort of saying that probably dates back much further. The notion has its origins, of course, in Zeno’s Arrow Paradox and our inability to detect motion by considering a snapshot of a moving object in an instant of time.

But how and why might this happen in reality? Clearly the saying about the watched pot is nothing more than a simple lesson in patience: you cannot make a kettle boil any more quickly by staring at it. However, Misra and Sudarshan seemed to be suggesting that when it comes to atoms you really do influence how they behave by watching them. What is more, this interference is unavoidable—the act of looking will inevitably alter the state of the thing you are looking at.

Their idea goes to the very heart of how quantum mechanics describes the microscopic world: as a fuzzy, ghostly reality in which all sorts of weird goings-on seem to take place routinely when it’s left alone—an idea we will revisit in Chapter 9—none of which we are ever able to detect actually happening. So an atom that would, if left to its own devices, spontaneously emit a particle at any moment will somehow remain too shy to do so if it is being spied upon, so that we can never actually catch it in the act. It is as though the atom has been endowed with some kind of awareness, which is crazy. But then the quantum world is crazy.

One of the founding fathers of quantum theory was the Danish physicist Niels Bohr, who in 1920 set up a research institute in Copenhagen to which he attracted the greatest scientific geniuses of the time—men such as Werner Heisenberg, Wolfgang Pauli, and Erwin Schrödinger—to try to unlock the secrets of the tiniest building blocks of nature. One of Bohr’s most famous sayings was that “if you are not astonished by the conclusions of quantum mechanics then you have not understood it.”

Misra and Sudarshan’s paper was entitled “The Zeno’s Paradox in Quantum Theory” because of its origins in the Arrow Paradox. However, it is fair to say now that, while its conclusion remains somewhat controversial, it is for most quantum physicists no longer a paradox. In the literature today it is referred to more commonly as the “Quantum Zeno Effect,” and has been found to apply far more widely than in the situation described by Misra and Sudarshan. A quantum physicist will happily tell you that the effect can be explained by “the constant collapse of the wave function into the initial undecayed state,” which is the sort of incomprehensible geeky gobbledygook one should expect from such people—I should know, I am one of them. But I don’t think I will pursue this line of thought in any more detail here, just in case you are nervously wondering what you’ve let yourself in for.
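For the reader who does want a taste of that gobbledygook, here is a toy calculation (a deliberately simplified sketch, not the real dynamics of a decaying nucleus; the quadratic short-time rule and the "Zeno time" scale are standard textbook approximations, and the function and parameter names are my own). At very short times, the probability that a quantum state has changed grows with the *square* of the elapsed time. If we chop a total time T into n measurements, each one "collapsing" the state back to its starting point, the overall chance of finding the atom still undecayed is a product of n such factors—and that product creeps toward certainty as n grows.

```python
def survival_probability(total_time, n_measurements, zeno_time=1.0):
    """Toy model of the Quantum Zeno Effect. At short times t, the chance
    that the state has changed is roughly (t / zeno_time)**2, so the
    chance it is found unchanged by one measurement is
    1 - (t / zeno_time)**2. Measuring n times within total_time, with
    each measurement resetting the state, multiplies n such factors."""
    dt = total_time / n_measurements
    return (1.0 - (dt / zeno_time) ** 2) ** n_measurements

# The more often we look, the more likely the atom is to survive unchanged:
for n in (1, 10, 100, 1000):
    print(n, "measurements:", survival_probability(0.5, n))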

This recent discovery that the Quantum Zeno Effect is pretty much ubiquitous is down to a better understanding among quantum physicists of how an atom responds to its surroundings. A big breakthrough was made when scientists at one of the world’s most prestigious laboratories, the National Institute of Standards and Technology in Colorado, confirmed the Quantum Zeno Effect in a famous experiment in 1990. The experiment took place within the wonderfully named Time and Frequency Division, which is best known for setting the standards for the most accurate measurement of time. Indeed, scientists here have recently built the world’s most accurate atomic clock, precise to within one second every three and a half billion years—that’s getting close to the age of the Earth itself.

One of the physicists working on these incredibly high-precision clocks is Wayne Itano. It was his group who designed the experiment to test whether the Quantum Zeno Effect could really be detected. It involved trapping several thousand atoms in a magnetic field and then zapping them delicately with lasers, forcing them to give up their secrets. Sure enough, the researchers found clear evidence of the Quantum Zeno Effect: under constant observation, the atoms behaved quite differently from how they would have behaved if left alone, just as the theory predicted.

One final twist: there is now evidence for the opposite effect, something called the “Anti-Zeno Effect,” which is the quantum equivalent of staring at a kettle and making it come to the boil more quickly. While still somewhat speculative, such research goes to the heart of some of the most profound and possibly important areas of science in the twenty-first century, such as working toward building what is called a quantum computer. This is a device that makes direct use of some of the strange behavior of the quantum world in order to carry out its calculations far more efficiently.

I am not sure what Zeno of Elea would have made of this revival of his paradoxes, or of his name being attached to a remarkable phenomenon in physics nearly two and a half thousand years later. Here, the paradox has nothing to do with tricks of logic, but everything to do with the even stranger tricks that nature seems able to play down at the tiny scale of atoms—tricks that we are only beginning to understand.





Quantum: A Guide For The Perplexed

Jim Al-Khalili


