
What does “random” really mean anyway? (JIM AL-KHALILI)

Let us take a more careful look at the Second Law and the issue of order and disorder, for we have yet to get to the bottom of what entropy really means. In the example of the shuffling of a deck of cards, there seemed to be little doubt that the entropy of an ordered pack, in which all cards were arranged in suits by ascending value, was low, and that a randomly shuffled pack had higher entropy. But what if the pack consisted of just two cards? Since there are now just two possible ways to arrange the cards, it doesn’t make sense to distinguish between a less and a more ordered arrangement. What about three cards, say the two, three, and four of hearts? Well, you might well say that the sequence “two, three, four” is more ordered and hence of lower entropy than, say, the sequence “four, two, three.” After all, the first sequence has them in ascending order. But what if the three cards were all twos, of hearts, diamonds, and spades? Is one arrangement now any more ordered than the others? All that is different now is that the cards are defined by suit rather than value. Surely the way we label the cards cannot have a bearing on how much entropy there is? The sequence “two of hearts, two of diamonds, two of spades” has no more, or less, entropy than the sequence “two of diamonds, two of hearts, two of spades.”

It would seem that our definition of entropy as the amount of disorder is somewhat lacking since our definition of disorder is too narrow. It is obvious what we mean in some cases but not in others. Let me push this argument further. Here is a really bad card trick that demonstrates what I mean. I take an ordered pack of cards and shuffle it to reveal to you that the cards are now well and truly mixed up. Now, watch this, I say. I carry out what looks for all intents and purposes like a further normal shuffle. But, I claim, I have now placed the cards in a very special arrangement. This is an impressive claim, since it looked as though I was carrying out a similar shuffling action to the one that initially mixed up the pack. I turn the pack over and spread it out on the table. To your surprise and ill-concealed disappointment the cards look just as randomly jumbled up as they did earlier. This is certainly not what you would call a “special arrangement,” you argue.

Ah, but it is. You see, I could now bet you any money you could not take another pack of cards and shuffle it to produce exactly the same ordering as mine. The chances of your being able to do this are of course just as remote as they would be if I had asked you to take a shuffled pack and get it back to being completely ordered through shuffling alone. And the chances of doing that are about one in a hundred million trillion trillion trillion trillion trillion. Basically, don’t bother trying. So, looking at it this way we see that my randomly ordered sequence of cards is just as “special” as a new, unshuffled pack. What of entropy now, then? It seems we cannot claim that entropy has increased if we end up in just as unlikely an arrangement as we started with, however randomly mixed up it looks.
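The astronomical odds here come from the number of distinct orderings of a full deck: 52 factorial, roughly 8 × 10⁶⁷. As a quick sanity check (this snippet is my own illustration, not part of the original text), a few lines of Python give the exact figure:

```python
import math

# Number of distinct orderings of a standard 52-card deck: 52 factorial.
orderings = math.factorial(52)

print(len(str(orderings)))   # 68 -- a 68-digit number
print(f"{orderings:.3e}")    # about 8.066e+67
```

Shuffling a second pack into any one pre-chosen ordering therefore succeeds, on average, once in about 10⁶⁸ attempts — whether that target ordering looks tidy or jumbled.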

Actually, and I am sure you must realize I am trying to pull a fast one here, there is of course something more special about the ordered pack than my “special” arrangement of randomly distributed cards. It comes down to entropy being a measure of randomness rather than disorder. This might seem just a play on words, but it does in fact give us a tighter definition of entropy. Technically, the term used to measure relative levels of “special-ness” is “algorithmic randomness.”

The word “algorithm” is used in computing to denote a sequence of instructions in a computer program, and algorithmic randomness is defined as the length of the shortest program that can instruct the computer to reproduce a given arrangement of the cards (or sequence of numbers). Thus, for the earlier example with just three cards, reproducing the “two, three, four” arrangement clearly requires only the instruction: “arrange from smallest to largest,” whereas that for the “four, two, three” arrangement might go something like “start with the largest number, then put the rest in increasing order,” in which case it is just as easy to spell it out: “start with the four, then the two, then the three.” Either way, these commands are slightly longer than the first one, and so the arrangement “four, two, three” has slightly higher algorithmic randomness, and hence slightly higher entropy, than “two, three, four.”

This becomes much clearer when we have the whole 52-card pack. It is relatively easy to instruct a computer to reproduce the ordered pack: “Start with the hearts and arrange cards in ascending order, ace high, then do the same for diamonds, clubs, and spades.” But how would you program a computer to reproduce my special arrangement of shuffled cards? There is now really no shortcut and the instructions may have to be laid out explicitly, step by step: “start with the king of clubs, followed by the two of diamonds then the seven of hearts [and so on].” If the deck is not maximally disordered, there may be short sequences of unshuffled cards in which the original order has been preserved and which provide a saving in program length—for instance, if the two, three, four, five, and six of spades are still together, then it is easier to instruct the computer to “start with the two of spades and arrange in ascending order for the next four in the same suit” than to spell out each of the remaining four cards.
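A crude but concrete way to see this “shortest program” idea in action (my own sketch, not from the book) is to use a general-purpose compressor as a stand-in for the ideal shortest description. An ordered deck, full of exploitable regularity, squeezes down further than a randomly shuffled one:

```python
import random
import zlib

# Build an ordered deck: suits in a fixed order, values ascending within each.
suits = ["hearts", "diamonds", "clubs", "spades"]
ordered = [f"{value}-{suit}" for suit in suits for value in range(1, 14)]

shuffled = ordered[:]
random.seed(1)          # fixed seed so the comparison is repeatable
random.shuffle(shuffled)

def description_length(deck):
    """Bytes needed for a compressed listing of the deck -- a rough
    proxy for the length of the shortest program that reproduces it."""
    return len(zlib.compress(",".join(deck).encode()))

print(description_length(ordered))   # smaller: the regularity can be exploited
print(description_length(shuffled))  # larger: most cards must be spelled out
```

True algorithmic (Kolmogorov) complexity is uncomputable; a compressor only gives an upper bound on it. But the contrast it reveals — pattern versus no pattern — is exactly the one the text describes.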

Talking about the length of computer programs may not mean much to you, and we can in fact dispense with that way of defining algorithmic randomness. Since our brains, just like the brain of Maxwell’s demon, are, at their most basic, no more than computers running instructions, we can replace the notion of a computer algorithm with our memorizing ability. If I were to present you with a randomly shuffled deck of cards and then asked you to turn your back and arrange them into suits and ascending order, this instruction is so special and simple that you could do it easily. (Note that I am allowing you to turn the cards faceup and sort them properly rather than rely on blind, random shuffling to achieve this by chance.) But if I were to ask you to arrange your pack in the same order as my “special” arrangement, which I had arrived at by random shuffling, you would probably find it near-impossible to memorize the order of the cards before you turned your back to replicate it with the pack in your hands.

Basically, you need a lot more information to reproduce the arrangement of the pack now than you did before. The more information you have about a system, the more you will be able to order it and lower its entropy.
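That “lot more information” can even be made precise (a back-of-the-envelope calculation of mine, not a figure from the book): singling out one particular ordering from among 52! possibilities requires log₂(52!) bits — about 226 bits — whereas the ordered pack needs only a one-line rule:

```python
import math

# Bits of information needed to single out one ordering of a 52-card deck:
# the base-2 logarithm of the number of possible orderings.
bits = math.log2(math.factorial(52))
print(round(bits, 1))  # 225.6
```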

Paradox

Jim Al-Khalili


