Friday, August 28, 2009

God Does Not Play Atari

Quantum mechanics is a field that's hit hard by questions of interpretation. On the one hand, at the center of nonrelativistic quantum mechanics lies an equation--the famous Schrödinger equation--that evolves systems in time in a way that's entirely deterministic. On the other hand, the wave functions it describes (the things you actually stick into the equation to see how they evolve) allow systems to seemingly be in multiple different states at once ("superposition"). Of course, when we actually measure the system to see which state it's in, we do find it in a single state. But the theory doesn't tell us which state we'll find it in; it merely gives a probabilistic picture that tells us there's some chance it'll be in this state, some chance it'll be in that state, and so on. Taken at face value, it seems as if the wave functions describing systems go from evolving deterministically in a superposition of states to mysteriously collapsing down to a single state through a hazily understood process ("measurement") that doesn't seem to have deterministic outcomes.

Certain members of the original generation of founders of quantum mechanics--Einstein being the most prominent--found this state of affairs unsatisfying. Einstein didn't believe this indeterministic outcome could really be the way the universe worked and famously wrote:


Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the 'old one'. I, at any rate, am convinced that He does not throw dice. -- Albert Einstein, Letter to Max Born (4 December 1926)


Einstein and two other physicists famously came up with a "paradox" to show the absurdity of this indeterministic view of the universe called the EPR paradox. It's based on a phenomenon known as entanglement, in which the states of two particles are connected in some way. Here's a simple way of understanding the paradox: suppose you've got a pair of gloves. Clearly, if one glove is left-handed, the other must be right-handed. But suppose these are quantum gloves and thus, prior to measurement, they each exist in a superposition of their two possible states (left-handed and right-handed). If I were to measure one, there would be a 50/50 chance of finding that it's a left-handed or right-handed glove because, as far as we can tell, it isn't in a particular state before I measure it. Suppose I drop both of my quantum gloves--unmeasured--into different boxes and ship them off to friends who are ten light years apart. When one of my friends opens her box, the quantum glove will "collapse" into a single state. Let's say she finds her glove to be right-handed. She thus immediately knows--because the gloves are a pair ("entangled")--that its heretofore unmeasured and uncollapsed partner glove must be left-handed. There can't be any doubt about what my second friend will find when he eventually gets around to opening his box: a left-handed glove. Now, Einstein had made his name in no small part by declaring that information can't travel between two points faster than the speed of light, so he seemed to have spotted a problem: how does the second glove instantaneously know that it has to "pick" the left-handed state when my friend gets around to opening it?
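If you like to think in code, the glove story can be sketched as a toy simulation. To be clear, everything below is my own illustration, not real quantum mechanics: in this toy model neither glove's handedness is decided until the moment the first box is opened, at which point the partner's state is fixed too.

```python
import random

def measure_entangled_gloves(rng):
    # Neither glove has a definite handedness until a box is opened.
    # Opening the first box yields "left" or "right" with equal probability...
    first = rng.choice(["left", "right"])
    # ...and the partner glove is then guaranteed to be the opposite.
    second = "right" if first == "left" else "left"
    return first, second

rng = random.Random(42)
trials = [measure_entangled_gloves(rng) for _ in range(10_000)]

# Each individual outcome is random (roughly half come up "left")...
lefts = sum(1 for first, _ in trials if first == "left")
print(lefts / len(trials))

# ...yet the pair is perfectly anti-correlated, every single time.
print(all(first != second for first, second in trials))  # True
```

Of course, this toy model quietly assumes the two outcomes are coordinated at measurement time--which is exactly the "spooky" part Einstein objected to.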

Einstein believed this paradox showed that even before I put my quantum gloves in their respective boxes and shipped them off they each had instructions (unbeknownst to me) that told them which state to pick when they collapsed: they conspired together somehow before I separated them and shipped them light years away. Some physicists continued that spirit, attempting to develop pictures of quantum mechanics that contained "local hidden variables"--secret instructions that we're not privy to that tell a wave function how exactly to collapse when measurement happens. In the 1960s, a famous piece of mathematical physics called Bell's theorem (and subsequent related experimental work over the next two decades) showed that local hidden variables do not seem to be consistent with the way our universe works.
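Bell's result can be made concrete with a quick calculation. In the standard CHSH form of Bell's inequality, any local-hidden-variable theory must keep a certain combination of correlations, S, between -2 and 2; the quantum prediction for a spin-singlet pair, E(a, b) = -cos(a - b), breaks that bound. The angles and formulas below are textbook values, not anything from the post:

```python
import math

def E(a, b):
    # Quantum correlation for a singlet pair measured along
    # axes at angles a and b: E = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH measurement angles (radians).
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))  # ~2.828, i.e. 2*sqrt(2) -- beyond the classical bound of 2
```

That gap between 2 and 2√2 is what the experiments of the 1970s and 80s went looking for--and found.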

But some physicists still dream of finding an underlying determinism behind quantum mechanics. One of the better known is Nobel Prize winner Gerard 't Hooft; you can see a bit of that in this excerpt from a brief Physics World interview with him from a few years ago (a handful of physicists were asked to discuss the current state of quantum theory):

Quantum mechanics could well relate to micro-physics the same way that thermodynamics relates to molecular physics: it is formally correct, but it may well be possible to devise deterministic laws at the micro scale. However, many researchers say that the mathematical nature of quantum mechanics does not allow this - a claim deduced from what are known as "Bell inequalities". In 1964 John Bell showed that a deterministic theory should, under all circumstances, obey mathematical inequalities that are actually violated by the quantum laws.

This contradiction, however, arises if one assumes that the particles we talk about, and their properties, are real, existing entities. But if we assume that objects are only real if they have been precisely defined, including all oscillations as small as the Planck scale - and that only our measurements of the properties of particles are real - then there is no blatant contradiction. One might assume that all macroscopic phenomena, such as particle positions, momenta, spins and energies, relate to microscopic variables in the same way thermodynamic concepts such as entropy and temperature relate to local, mechanical variables. Particles, and their properties, are not (or not entirely) real in the ontological sense. The only realities in this theory are the things that happen at the Planck scale. The things we call particles are chaotic oscillations of these Planckian quantities. What exactly these Planckian degrees of freedom are, however, remains a mystery.


What am I building up to? Well, the physics arXiv blog has a good find this week: Gerard 't Hooft has a paper up on the arXiv (an archive of preprints of scientific papers) suggesting that the universe acts like a cellular automaton. If you're not familiar with what that is, wiki is always a handy companion. Essentially it's a grid of cells that have certain rules--dependent on the states of neighboring cells--for taking a certain state (e.g. "on" or "off"--black or white). You can set up an initial state and then watch the grid evolve with time. Have a look at the most famous cellular automaton, Conway's Game of Life (not to be confused with Conway's Irish Ale).
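If you want to see how simple the rules really are, here's a minimal sketch of one Game of Life step in Python. The birth/survival rules and the "blinker" pattern are the standard ones; the implementation itself is just my own few-line illustration:

```python
from collections import Counter

def step(live):
    """Advance a Game of Life pattern one generation.

    `live` is a set of (x, y) coordinates of live cells.
    """
    # Count how many live neighbours every nearby cell has.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth with exactly 3 live neighbours; survival with 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

blinker = {(0, 1), (1, 1), (2, 1)}  # three cells in a horizontal row
after_one = step(blinker)           # flips to a vertical column
after_two = step(after_one)         # flips back to the original row
print(after_one == {(1, 0), (1, 1), (1, 2)})  # True
print(after_two == blinker)                   # True
```

Two rules, a handful of lines, and you already get patterns that oscillate forever--which is exactly why cellular automata are such a tempting model for rich behavior emerging from simple deterministic laws.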



Kind of reminds you of an Atari game (indeed, some Atari games were influenced by cellular automata). So what does the arXiv blogger have to say about this paper? Here's a teaser (read the rest if you're interested):

How Entanglement Could Be Deterministic

A Nobel Prize-winning physicist has developed a model of the universe as a cellular automaton that allows entanglement to be deterministic.

The universe is a cellular automaton in which reality is simply the readout of a giant, fantastically complex computing machine. That's the conclusion of Nobel Prize-winning physicist Gerard 't Hooft, who says this also means that quantum mechanics is a deterministic theory.

The key new feature of this deterministic model is that it specifically allows for the quantum phenomenon of entanglement.


Is Gerard 't Hooft's quest for determinism futile or is he on to something with his latest ideas? Does God play dice or does He play Atari? I don't know but 't Hooft's certainly an interesting guy worth hearing out.
