I'm teaching our sophomore-level modern physics course this term, which goes by the title "Relativity, Quantum Mechanics, and Their Applications." The first mid-term was a couple of weeks ago, on Relativity (special, not general), and the second mid-term is tomorrow, on Quantum Mechanics, and then we get three weeks of applications (basically, whatever topics out of atomic, molecular, solid state, nuclear, and particle physics I can manage to fit in).
I like to end the quantum section with one lecture on superposition and measurement, which isn't covered particularly well in the book. It's not really surprising, since it's a topic that's really hard to understand, and also doesn't lend itself to plug-and-chug numerical problems. It's disappointing, though, because it's some of the coolest material in physics, and I think they ought to at least hear about it once.
So, I compromise by lecturing about it in the class immediately before the exam. It doesn't add anything significant to the material they need to know for the exam, but it gets the topic in there, so I feel better. I don't know how much they get out of it-- there were a couple of wide-eyed "you can't be serious" expressions yesterday, but also a few students who looked like they'd rather be anywhere else than listening to that lecture.
Anyway, this will probably form the basis for my Boskone talk on Sunday, after some judicious editing. Thus, I will reproduce the main points below the fold here, to see how they play in a different context.
I'm going to do this in quasi-outline form, rather than trying to polish it all up into a fluid and seamless essay (maybe some other time):
I) When I talk about QM, I harp on what I refer to as four central principles of quantum theory: 1) That everything is described by a wavefunction, 2) That wavefunctions tell you the probability of finding certain experimental results, 3) That any system will have a limited set of possible states, and will always be found in one of these eigenstates, and 4) that measurement changes the wavefunction of a system.
A lot of people think that the probability thing is the really weird part of quantum mechanics, but I don't really think that's the case. I mean, sure, it's very upsetting if you're a big believer in the clockwork universe, but in our limited-precision reality, a responsible approach to classical physics demands that we deal in probability distributions. There's always some uncertainty in the magnitude and direction of a particle's velocity, which means that any prediction of its future position is necessarily uncertain. All we can really say is that there's a very good probability of finding it within some range of positions at a later time.
This really isn't so different from the quantum situation. The only important difference is that classical physics holds out the possibility of shrinking the uncertainties to zero, in the spherical and frictionless world where we can make ideal measurements. Quantum theory explicitly rules that out, but that's more a philosophical loss than a practical one.
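The classical point above is easy to make concrete with a toy Monte Carlo: give a particle a slightly uncertain starting position and velocity (the Gaussian spreads here are arbitrary, illustrative numbers), evolve each sample deterministically, and the "prediction" you get is a probability distribution over later positions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy classical particle: slightly uncertain initial position and
# velocity (Gaussian spreads; all values are arbitrary, for illustration).
x0 = rng.normal(0.0, 0.01, size=100_000)  # initial positions, m
v0 = rng.normal(1.0, 0.05, size=100_000)  # initial velocities, m/s

t = 2.0  # evolve forward 2 seconds
x_t = x0 + v0 * t  # each individual sample moves deterministically

# The best classical prediction is a spread of positions, not a point:
# sigma grows as sqrt(sigma_x0^2 + (sigma_v * t)^2), about 0.10 m here.
print(x_t.mean(), x_t.std())
```

Even with perfectly deterministic dynamics, the finite measurement precision turns the forecast into a distribution, which is the sense in which responsible classical physics already deals in probabilities.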
II) The weirdness of quantum theory, in my view, comes about because of what goes on before you get the probability distributions. Quantum probabilities behave in a dramatically different way than classical ones, and that's what gets to the heart of the weirdness.
The classic example of this sort of behavior is to consider the behavior of a double-slit experiment from both classical and quantum perspectives. So, imagine that we have a source of some sort of projectiles on one side of a wall (say, Dick Cheney with a shotgun), and some sort of detector on the other side (say, a millionaire Texas lawyer). If we put two narrow gaps into the wall, and move our detector back and forth, we can calculate a probability distribution describing the odds of the lawyer getting a face full of birdshot at any position on the far side of the wall.
If we use classical birdshot, we find more or less what you'd expect: A distribution with two lumps corresponding to the two gaps in the wall. If we make the gaps close together, the lumps will overlap a bit, but what you get will always be just the sum of the distributions for each gap by itself.
If we lay off poor Dick Cheney for a bit, and imagine doing the same thing with a set of quantum particles, say a beam of electrons hitting a thin foil with two slits cut in it, we find something completely different. Rather than two lumps, corresponding to the two slits, we find a rapidly oscillating pattern of bright and dark stripes that are much narrower than the single-slit distribution. This is interference, a signature of wave-like behavior on the part of the electrons, and is completely non-classical.
III) Mathematically, the interference comes about because the wavefunction is not the same as the probability distribution. In order to obtain probabilities from wavefunctions, we need to multiply the wavefunction by its complex conjugate, which is roughly analogous to squaring the wavefunction.
In the classical system, when we put two slits together, we just add the probability distributions. In the quantum case, we add the wavefunctions, and then square to get the probability distribution. When we do this, we pick up some extra terms in the distribution that don't have a classical analogue. These can be positive or negative, and are what give rise to the interference pattern-- at some points where we would classically expect a high probability of finding a particle, we end up with no probability at all, while at others, the probability is enhanced.
(There's a bunch of math here in the lecture version, that I'm leaving out for this post.)
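For anyone who wants the flavor of the math that's elided here, the difference between the two rules can be sketched in a few lines. This is a bare-bones model: equal-amplitude plane waves from each slit, a relative phase delta set by the path difference, and no single-slit envelope. The slit spacing and wavelength are made-up illustrative numbers.

```python
import numpy as np

# Illustrative far-field setup: slit separation d, wavelength lam,
# detector angle theta. Relative phase between the two paths:
# delta = 2*pi*d*sin(theta)/lam. (Numbers are arbitrary.)
d, lam = 1e-6, 5e-7
theta = np.linspace(-0.5, 0.5, 2001)
delta = 2 * np.pi * d * np.sin(theta) / lam

# Equal-amplitude wavefunctions from the two slits, ignoring the
# single-slit envelope for clarity:
psi1 = np.ones_like(delta, dtype=complex)
psi2 = np.exp(1j * delta)

# Classical rule: add the probabilities from each slit.
p_classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2

# Quantum rule: add the amplitudes, THEN square.
p_quantum = np.abs(psi1 + psi2) ** 2

# The cross term 2*cos(delta) is the "extra term" from the text:
# p_quantum = 2 + 2*cos(delta) swings between 0 (dark fringe) and 4
# (bright fringe), while p_classical sits at a featureless 2.
print(p_classical.max(), p_quantum.max())
```

The points where `p_quantum` drops to zero are exactly the spots where classical reasoning says particles should still arrive, which is the whole puzzle in one line of algebra.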
IV) So, what does it mean to say that the wavefunction on the far side of the barrier is the sum of the wavefunctions for the two individual slits? Well, the wavefunction describes the probability of finding a particle at some position after passing through a given slit, so having the wavefunction be the sum of the two single-slit wavefunctions means that there's a probability of the particle passing through either slit.
But it's a stronger statement than that-- it's not just that the particle might've gone through either slit, it's that the particle went through both slits, at the same time. The state on the far side is a superposition state, describing a particle that has taken two different paths at the same time.
We can verify this by doing the experiment with single particles, and what we see is exactly the prediction of quantum theory. If we send one electron at a time toward a set of slits, and detect the electron position on the far side, we see individual electrons arriving one at a time, in an apparently random pattern. If we repeat this many, many times, and add up the results, though, we recover the interference pattern. While the individual particles show up as discrete spots, there are places in the pattern where the probability of finding an electron is absolutely zero, and other spots where the probability is quite high. The electrons are detected as individual particles, but their arrangement shows signs that they were in a superposition state along the way-- the interference shows that each electron took two different paths to the detection point.
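The one-at-a-time buildup is also easy to simulate: treat each electron detection as a single random draw from the interference distribution, and watch the fringes emerge from the accumulated spots. This is a sketch, not a physical simulation-- the (1 + cos delta) density is just the two-slit pattern from the earlier discussion on a discrete grid of detector positions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-slit interference probability density ~ (1 + cos(delta)),
# sampled on a grid of detector positions (delta = phase difference).
delta = np.linspace(-3 * np.pi, 3 * np.pi, 601)
p = 1 + np.cos(delta)
p = p / p.sum()  # normalize into a discrete probability distribution

def detect(n):
    """Histogram of n single-electron arrivals, one random draw each."""
    hits = rng.choice(len(delta), size=n, p=p)
    return np.bincount(hits, minlength=len(delta))

few = detect(50)        # looks like random speckle
many = detect(200_000)  # fringes emerge: counts track n * p

# The dark-fringe bins stay essentially empty no matter how many
# electrons we fire, even though each arrives as a discrete spot.
dark = np.argmin(p)
print(many[dark], many.max())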
(There will be more to this, obviously, but I need to get to work, and I'd like to see if there are comments/ complaints/ suggestions about this section, so I'll post it first. I'll get to the measurement/ interpretation stuff tomorrow.)
I am not a physicist, nor do I play one at work. However (I have commented this way before, so it seems to me that I am harping), it seems to me that one of the reasons quantum physics seems so strange is that when we think of things like electrons as things like birdshot, we expect them to behave like birdshot. But, in fact, things like electrons are not particles like birdshot in the intuitive way we think of particles. They are a different type of thing that we have no first-hand, macroscopic experience with. If we could see on a quantum-physical scale, we would have an intuitive understanding of the phenomena and they wouldn't seem so strange. Thus, to me, the weirdness comes from expecting a thing to behave like the analogue we use to describe it (Wave or particle? Particle or wave?). It comes down to semantics, but I think we should stress that quantum-scale objects at times behave like particles or waves, not that sometimes they are like particles or waves.
The experiment that made me a Believer was the classic optical polarization film. "Okay, so that's all the vertically polarized light, and that's all the light that can get through at a 45 degree angle, and that's all the vert-- WTF?!" There's just no way to reconcile that with classical intuition, and it's more in-my-face than the double slit experiments. It's more visceral.
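For readers who haven't seen the polarizer trick: the numbers behind it come straight from Malus's law. A vertical filter followed by a horizontal one blocks everything, but slipping a 45-degree filter in between lets a quarter of the light back through-- adding an obstacle increases the transmission. A minimal sketch, assuming ideal polarizers:

```python
import math

def malus(intensity, angle_between):
    """Malus's law: intensity through an ideal polarizer at the
    given angle (radians) to the light's polarization axis."""
    return intensity * math.cos(angle_between) ** 2

I0 = 1.0  # vertically polarized light, after the first filter

# Vertical filter straight into a horizontal one: nothing gets through.
blocked = malus(I0, math.pi / 2)

# Insert a 45-degree filter between them: each stage passes
# cos^2(45 deg) = 1/2, so 1/4 of the light reappears at the far end.
through_45 = malus(I0, math.pi / 4)
out = malus(through_45, math.pi / 4)
print(blocked, out)
```

Classically that makes no sense at all; quantum mechanically, the 45-degree measurement changes the state, which is principle 4 from the post in tabletop form.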
What really drives home the fundamental weirdness of it at a more intellectual level, though, is the quantum computation. The polarization is visceral, but the computation almost seems like it's changing the laws of mathematics if I think about it long enough. Unfortunately, it takes a broader background than sophomores are likely to have, to appreciate that.
"[T]hat measurement changes the wavefunction of the system..." delights and weirds me too. The action-at-a-distance, causality-violation stuff is fantastic in every sense of the word. I'm about to get my B.S. in Chemistry (older guy, though, who went back to school when I.T. went south), then on to a PChem grad program. I'm just starting to really mess with Schrodinger, and it's a mind-bender all right, but I'm hooked.
I love the Cheney reference to illuminate the double slit, it should play well. Frankly the double slit experiment is the heart of quantum weirdness for me-- was it Feynman who said that sooner or later it always comes down to the double slit experiment?
There is a good reason why measurement is glossed over in QM books and lectures. It is a very weak point of the theory.
CanuckRob: I love the Cheney reference to illuminate the double slit, it should play well.
I made that joke in class, and it got a laugh. I was a little surprised that they were aware of it (college students can be surprisingly insulated from reality), but I always like it when my jokes go over well...
Roman: There is a good reason why measurement is glossed over in QM books and lectures. It is a very weak point of the theory.
But that doesn't mean it isn't fascinating. In fact, part of the reason why it's one of the coolest things going is that the theory is so sketchy...
I like the simple, transparent explanations, but I want to ask about two points:
I) Could you add a fifth fundamental tenet to your four central principles, namely, that quantum gives us an absolute definition for "large" and "small?" Unlike in the classical world, we have a metric - Planck's constant - which divides phenomena into quasiclassical and quantized realms. There is no classical analogy.
II) The minute impact of uncertainty in typical measurements need not be the measure of its fundamental significance. There's a big difference between being uncertain about a datum, and indeterminacy in that datum. Our world has a "grain size" to it, which reveals - and conceals - much about its structure and function. Sort of like the implications of not being able to travel at a million miles a second, or to cool something to a thousand degrees below zero.
nelson oliver: I) Could you add a fifth fundamental tenet to your four central principles, namely, that quantum gives us an absolute definition for "large" and "small?" Unlike in the classical world, we have a metric - Planck's constant - which divides phenomena into quasiclassical and quantized realms. There is no classical analogy.
I don't know that I would really call that fundamental in the same sense as the others. I mean, in principle, even macroscopic objects are described by quantum wavefunctions, and have quantized energy states, and the like. The states get really, really close together, and a continuous approximation is pretty good, but there isn't fundamentally a sharp break.
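The "states get really, really close together" point is worth a quick back-of-the-envelope. Using the standard particle-in-a-box spectrum E_n = n^2 h^2 / (8 m L^2) (the specific masses, box sizes, and quantum numbers below are made-up, order-of-magnitude choices), the level spacing for an electron in a nanometer-scale box is easily observable, while for a macroscopic ball it's so tiny the spectrum might as well be continuous:

```python
# Order-of-magnitude comparison of quantum level spacings, using the
# standard 1-D particle-in-a-box energies E_n = n^2 h^2 / (8 m L^2).
h = 6.626e-34  # Planck's constant, J*s

def level_spacing(n, m, L):
    """Energy gap between adjacent levels n and n+1 of a 1-D box."""
    return (2 * n + 1) * h**2 / (8 * m * L**2)

# Electron (9.1e-31 kg) confined to a 1 nm box, low-lying state:
gap_electron = level_spacing(1, 9.1e-31, 1e-9)

# 1 g ball rattling around a 10 cm box, at an enormous quantum number
# (illustrative choice for a macroscopic energy scale):
gap_ball = level_spacing(10**30, 1e-3, 0.1)

print(gap_electron)  # ~2e-19 J, on the order of an eV: measurable
print(gap_ball)      # ~1e-32 J: hopelessly below any resolution
```

Both systems are quantized in exactly the same formal sense; the only difference is that the macroscopic spacing is some thirteen orders of magnitude too small to notice, which is why the classical continuum approximation works so well without there being a sharp break.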