One of my colleagues raves about David Lindley's Where Does the Weirdness Go? as a basic introduction to odd quantum effects, but somehow, I've never managed to get around to reading any of his books until now. I recently had a need to know a bit more about the historical development of quantum theory, though, and ran across Lindley's **Uncertainty: Einstein, Heisenberg, Bohr and the Struggle for the Soul of Science** in the library, which promised to contain the information I was after, so I checked it out.

As you can guess from the title, the book deals with the early development of quantum mechanics, and is particularly concerned with the philosophical issues and personalities involved in Heisenberg's Uncertainty Principle. The early chapters dip back into the very early history of quantum theory, but it mostly deals with the period from around 1915-1940, during which the study of atoms forced physicists to completely overturn classical intuitions about how the world works.

Lindley does a very good job of presenting a non-technical history of the muddled period of the "old quantum theory," between the development of the Bohr model of hydrogen and the development of recognizably modern quantum mechanics. This is a period that most modern physics books skip over lightly, mostly because it's such an unholy mess-- the idea of discrete quantum states was fairly well accepted, but had no obvious physical basis, and people were attempting to explain atomic structure with a dizzying variety of ad hoc rules and mostly arbitrary quantum numbers. The scientific history presented here gives some sense of the confusion, without actually being confusing, which is a significant accomplishment.

Where the book really shines, though, is in presenting the personalities of the people involved in the making of the theory. Lindley includes brief character sketches of all the important players, and enough anecdotes to give a good sense of what they were like. You get the maddeningly evasive and philosophical Bohr, the standoffish Heisenberg, the prickly Max Born. There are also nice portraits of Einstein as a cranky conservative, Schrödinger the utter cad (unsurprisingly, he got on well with Einstein), and the extremely sarcastic Pauli. Pretty much anyone who's anyone in the history of quantum theory shows up, and they all get their due.

Of course, as you can tell from the subtitle, the book also delves into the philosophical issues surrounding the rise of quantum theory. It's a tad melodramatic to call this a "Struggle for the Soul of Science," but it's not a huge stretch.

One of the things that Lindley does a nice job of conveying is just how radical Heisenberg's approach to quantum theory was-- he was pretty much ready to discard the entire picture of commonly understood physical reality, and deal solely in observables. He wasn't concerned with the "real" structure of atoms, just what you could measure about them-- in a sense, he was the earliest proponent of the "shut up and calculate" interpretation. This was tremendously disturbing in a philosophical sense, and probably played a role in the success of Schrödinger's wave equation over Heisenberg's matrix mechanics.

Interestingly, though, Lindley also makes clear that the battle over the philosophy and interpretation of quantum theory was not nearly as central as some treatments make it out to be. In his description, the epic debates between Einstein and Bohr at the Solvay conferences are a bit of a side show. Most of the younger physicists at the meetings more or less ignored them, in favor of detailed discussions of what could be accomplished with the new theory. This probably had something to do with the unholy mess that is the Copenhagen Interpretation becoming the accepted picture of things-- Bohr cared deeply about this stuff, while many others did not, and thus he got to define the way people thought about interpreting the theory.

Inevitably, a book dealing with uncertainty will need to deal with popular conceptions of the Uncertainty Principle, and I particularly like what Lindley has to say here:

> [E]ven in physics, the uncertainty principle is by no means of ever-present relevance. The whole point of Bohr's program of complementarity was to help physicists handle the evident fact that the real world, the world of observations and phenomena in which we live, seems to be pretty solid despite the fact that underneath it all lies the peculiar indeterminacy of quantum mechanics. If Heisenberg's principle doesn't enter all that often into the thinking of the average physicist, how can it be important for journalism, or critical theory, or the writing of television screenplays? We already know that people act awkwardly in front of cameras, that they don't tell their stories to a newspaper reporter the same way they would tell them to a friend. We know that an anthropologist dropping in on some remote village culture becomes the focus of attention and has trouble seeing people behave as they normally would. We know that a poem or a piece of music doesn't mean the same thing to all readers and listeners. The invocation of Heisenberg's name doesn't make these commonplace ideas any easier to understand, for the simple reason that they're perfectly easy to understand in the first place. What fascinates, evidently, is the semblance of a connection, an underlying commonality, between scientific and other forms of knowledge. We return, in this roundabout way, to D. H. Lawrence's jibe about relativity and quantum theory-- that he liked them precisely because they apparently blunted the hard edge of scientific objectivity and truth. We don't have to be as intellectually philistine as Lawrence to see the attraction here. Perhaps the scientific way of knowing, in the post-Heisenberg world, is not as forbidding as it once seemed.

I have a few minor quibbles about the presentation of the science-- it leans a bit too heavily on the "observing a system changes the system" view of uncertainty, which isn't quite right-- but all in all, this is an excellent book. The science is clearly described and broadly accessible, and the personalities of the physicists involved come through with impressive clarity. It's almost compulsively readable, too-- I didn't get around to picking it up until two days before it was due back at the library, but I had no trouble finishing it in time to return it. I'll probably buy a copy the next time I hit Borders, because there are a few sections I'd like to refer back to, and that's great praise for a popular physics book.


Thanks for the review and recommendation. I'm teaching a new liberal arts course called "The Quantum Universe" in the Spring, and I'm always on the lookout for a no-fluff introduction to QM for non-scientists, and I think the stories & characters will help to draw students in.

On to the Amazon wish list it goes.

I really enjoyed *Where Does the Weirdness Go?*. I've used the section on the development of and evidence for the photon model to help some of my high school students over the whole "wave-particle duality" hump, where their brains usually get stuck. It helps that there's nary a formula in sight-- they get symbol-shy after a while.

*Where Does the Weirdness Go?* is the quantum mechanics pop-sci book I turn to first to refresh my memory on some of the subtleties of quantum mechanics. I may have to check out *Uncertainty* now as well!

I long ago coined the term "proto-quantum mechanics" for the hybrid of classical and quantum ideas started by Bohr. Since it is not a self-consistent mechanics, and is flawed right at the beginning with an L=1 ground state, I dislike even hinting that it is "old" quantum mechanics. Its entire role was to stimulate the experiments that proved it wrong, driving Heisenberg and then Schroedinger to invent a distinct and unique theory to replace it. I also always point out that there are no "quantum jumps" in QM, because transitions only occur when states spatially overlap.

I'll also comment that "observing the system changes it" is simply wrong as an account of the fuzziness of the states. ("Unfocused" is a better word than "uncertainty," since there is also nothing uncertain about the very definite probability distributions predicted by QM.) I think it was Cassidy's book that points out that Heisenberg made an error in his paper, caught by Bohr in a preprint, and the "microscope" example was deleted by a note added in proof. That flawed idea lives on, however.

I thought it was generally accepted that matrix mechanics was not widely adopted because most physicists at the time knew little about infinite dimensional matrices but they could all solve the wave equation.

If that isn't right, what is? I just finished a junior level Intro to QM (using Griffiths' book) and I still don't know.

I can understand how the observer effect would work: A hydrogen atom is about 1 Angstrom wide, so you need a 1 Angstrom wave to look at it. But a 1A photon has an energy of about 12400 ev, and the electron is bound with only 13.6 ev.
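The arithmetic in that argument is easy to check. Here's a minimal sketch, taking hc ~ 1240 eV*nm (equivalently, ~12400 eV*angstrom) as the standard value:

```python
# Back-of-the-envelope check: a photon with a short enough wavelength
# to resolve a hydrogen atom carries far more energy than the
# electron's binding energy.
hc_eV_nm = 1239.84      # Planck's constant times c, in eV*nm

wavelength_nm = 0.1     # 1 angstrom = 0.1 nm, roughly the size of hydrogen
photon_energy_eV = hc_eV_nm / wavelength_nm   # E = hc / lambda
binding_energy_eV = 13.6                      # hydrogen ground state

print(photon_energy_eV)                       # ~12400 eV
print(photon_energy_eV / binding_energy_eV)   # photon dwarfs the binding
```

So a photon capable of resolving the atom carries roughly 900 times the energy needed to ionize it, which is why the naive "look at the electron with light" picture always disturbs the atom.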

But I also know that the observer effect is not enough to explain stuff like quantum interference, but nobody ever explained what does. Does anybody know?

Andy, go to your library and find the book by Cassidy. The title is "Uncertainty: The Life and Science of Werner Heisenberg". That explains it in more detail than is possible here for the specific case of Heisenberg's microscope analogy. It also goes into the positivist approach Heisenberg took in inventing QM that Chad alludes to above.

But ... consider your (incomplete) example of the "observer effect". I assume you argue that you can't image it because you destroyed it, but that is irrelevant. You can do a coincidence experiment where you measure the energy and momentum of the outgoing electron as well as any outgoing photon (and, I suppose, if you work at it, the recoiling proton for redundancy). That allows you to infer the original momentum of the bound electron in addition to whatever you learned about its position from your projectile. Sure, you change the system, but you can measure the change so that is not a problem.

Indeed, people do this kind of experiment for real. The H(e,2e) experiments show clearly that the momentum of the bound electron has a very definite probability distribution. The fuzziness of these states is intrinsic, not related to the measurement, and (if you use Born's interpretation of Schroedinger's wave function) is precisely predicted by the theory.

Today, a BEC allows you to see this macroscopically. When I first saw the picture (and I mean picture) of two BEC clouds interfering, I was speechless.

PS - And Adam, use 0.1 nm (and hc = 1240 eV*nm). The Angstrom has been out of date longer than you have been alive.

I don't know; I've been around since 1959 :) (Angstroms were in use when I was an actual undergrad; at least in our textbooks. That's why I'm used to them. My recent QM course was taken during time off from work. Praise be to Univ of Washington that lets dilettantes like me waltz in and take a class or two.)

I wrote:

> -- it leans a bit too heavily on the "observing a system changes the system" view of uncertainty, which isn't quite right--

andy wrote:

> If that isn't right, what is? I just finished a junior level Intro to QM (using Griffiths' book) and I still don't know.

The problem with over-reliance on "observing changes the state" is that it creates the impression in many students that it's just a matter of finding a more clever way of doing the measurement. Some really smart people have fallen into this over the years-- Einstein, for one-- but there's no way around it.

On a fundamental level, the uncertainty principle arises from the fact that certain pairs of physical quantities are defined in such a way that measuring them imposes contradictory requirements on the system. The concrete example I use in introducing the idea is that the position-momentum uncertainty relationship is a consequence of the wave nature of matter.

The position of a particle in quantum mechanics is described by a wavefunction that's related to the probability of finding the particle at any given point, while the momentum is determined by the wavelength associated with that particle. Thus, if you want to define both the position and momentum very well, you want a wavefunction that has both a well-defined position and a well-defined wavelength. And you can't do both of those at the same time.

You can make a spiky sort of probability distribution that has a single sharp peak at one well-defined position, but when you look at that function, there's no way to associate a wavelength with it. And you can make a beautiful sinusoidal wave with a well-defined wavelength, but by definition, that will extend over all space, and then there's no clear way to define the position. The best you can do is a sort of wave packet, where you have some oscillation in one region of space, and nothing outside of it, but that will give you some uncertainty in both the position and the wavelength. If you play around with different mathematical functions, you'll find that the absolute best you can do at minimizing uncertainty in both position and momentum is the lower limit of the Heisenberg uncertainty principle.

(More formally, the wavefunction expressed in terms of the momentum and the wavefunction expressed in terms of the position are related by a Fourier transform, so making one narrower necessarily makes the other wider.)
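That Fourier-transform tradeoff is easy to see numerically. Here's a minimal sketch, under assumed natural units (hbar = 1) and an illustrative Gaussian packet: build the wave packet, compute the spread of |psi|^2 in position and the spread of its Fourier transform in momentum, and check that the product sits at the Heisenberg lower limit hbar/2, which a Gaussian saturates:

```python
import numpy as np

hbar = 1.0                      # natural units (assumed for illustration)
x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]
sigma = 2.0                     # packet width, an illustrative choice
k0 = 1.5                        # central wavenumber, an illustrative choice

# Gaussian wave packet: localized envelope times a plane-wave oscillation
psi = np.exp(-x**2 / (4 * sigma**2)) * np.exp(1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize

# Position spread: standard deviation of the probability density |psi|^2
prob_x = np.abs(psi)**2
mean_x = np.sum(x * prob_x) * dx
delta_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x) * dx)

# Momentum spread: standard deviation of |phi(k)|^2, where phi is the
# Fourier transform of psi (momentum p = hbar * k)
k = np.fft.fftshift(np.fft.fftfreq(len(x), d=dx)) * 2 * np.pi
dk = k[1] - k[0]
prob_k = np.abs(np.fft.fftshift(np.fft.fft(psi)))**2
prob_k /= np.sum(prob_k) * dk                 # normalize
mean_k = np.sum(k * prob_k) * dk
delta_p = hbar * np.sqrt(np.sum((k - mean_k)**2 * prob_k) * dk)

print(delta_x * delta_p)        # ~0.5 = hbar/2, the Heisenberg limit
```

Making sigma smaller narrows the position spread and widens the momentum spread (and vice versa), but the product never drops below hbar/2-- the limit is built into the wave description, not into any measurement.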

It's not simply a matter of being unable to extract information from the physical system, then. The information isn't there in the first place-- there *is* no well-defined position or momentum of a quantum system.

On the units issue, angstroms are still kicking. In AMO physics, you'll still find plenty of people who express wavelengths in angstroms. Bohr radii are another common one, and a Bohr radius, as we all know, is about half an angstrom...

I probably should have raised these issues in class, but the foundations of QM don't seem to be a very big deal; and you don't really need them to use QM to solve problems.

The reason I keep thinking about it is that I can't *visualize* what's going on at the atomic level. Maybe another beer will help...

Years before the term "Nanotechnology" was popularized by K. Eric Drexler, the proposed name for the field was Angstromics.