Hearing The Uncertainty Principle

If you read about science at all, you've heard of Heisenberg's uncertainty principle. It's the canonical example of quantum weirdness, the strange idea that you can't simultaneously know the position and momentum of a particle. Pack a particle into a small enough box and your accurate knowledge of position will necessarily cause that particle to have a very uncertain momentum, "bouncing" around crazily inside that box.

What you may not have read is that this isn't just quantum weirdness; it happens just as often in the classical world of waves. Indeed, the fact that quantum particles have a wave nature is exactly the bridge between the weird quantum uncertainty principle and the classical uncertainty principle that you literally hear all the time.

Let's start off with a graph of a sine wave with a frequency of 1 Hz:

[Figure: a pure 1 Hz sine wave, plotted as a function of time.]

There are two equivalent ways I can specify this wave. I can say, as I did, that it's a sine wave of frequency 1 Hz. Mathematically, you could say I've given the spectrum of this wave, describing it in what we in the business call the frequency domain. (And phase, but that's beyond our scope for this post.) The other way I could specify this wave is in the time domain, where I just write down the function of t that describes the time behavior of the wave.

One representation is not more fundamental than the other. Given the frequency-domain description - which is, for the record, itself a function of the frequency ω - I can write down the one unique waveform that has that spectrum. Given the description in the time domain, I can calculate the frequency spectrum.
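If you want to check that equivalence yourself, here's a minimal sketch in Python (my own illustration, not part of the original post) that samples the 1 Hz sine wave, hops over to the frequency domain with a fast Fourier transform, and hops back again without losing anything:

```python
# A minimal sketch: the time-domain and frequency-domain descriptions carry
# the same information, so transforming and inverse-transforming a sampled
# 1 Hz sine wave recovers it exactly (to floating-point precision).
import numpy as np

fs = 100.0                             # sampling rate in Hz (an arbitrary choice)
t = np.arange(0.0, 10.0, 1.0 / fs)     # ten seconds of samples
signal = np.sin(2 * np.pi * 1.0 * t)   # the 1 Hz sine wave from the figure

spectrum = np.fft.rfft(signal)                      # frequency-domain description
recovered = np.fft.irfft(spectrum, n=len(signal))   # back to the time domain

print(np.allclose(signal, recovered))  # True: no information was lost either way
```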

So what if I have a wave that looks like this?:

[Figure: the same 1 Hz sine wave, now contained within an envelope so that it rings for only a few cycles.]

It's almost the same 1 Hz wave as above, but it's not quite the same. It's contained within an envelope, and it only maintains that 1 Hz oscillation for a few cycles before dying away. Clearly "1 Hz" isn't exactly the right description in the frequency domain, because that description has already been taken by the pure sine wave. So what's the right frequency representation?

We can calculate it with a method called the Fourier transform, named after the French mathematician Joseph Fourier. Doing so shows us the right frequency representation. It's this:

[Figure: the Fourier transform of the pulse - a spectrum sharply peaked at 1 Hz.]

Here the units are in Hz. We see that it's sharply peaked right at 1 Hz, which makes sense because it's mostly a 1 Hz signal. But because it's not exactly a pure sine wave, the wave in its envelope is actually a mixture of different frequencies spread around the 1 Hz mark.
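If you'd like to reproduce a spectrum like that, here's a rough sketch of the calculation, assuming a Gaussian envelope (the exact envelope used for the figures isn't specified, so the numbers won't match them precisely):

```python
# A sketch of the Fourier-transform step for an enveloped 1 Hz wave,
# assuming a Gaussian envelope about two seconds wide.
import numpy as np

fs = 100.0                                # samples per second
t = np.arange(-10.0, 10.0, 1.0 / fs)      # a stretch of time centered on the pulse
envelope = np.exp(-t**2 / (2 * 1.0**2))   # Gaussian envelope, sigma = 1 s
pulse = envelope * np.sin(2 * np.pi * 1.0 * t)   # the "1 Hz" wave inside it

spectrum = np.abs(np.fft.rfft(pulse))
freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fs)

peak = freqs[np.argmax(spectrum)]
half = spectrum >= spectrum.max() / 2
print(f"spectrum peaks near {peak:.2f} Hz")
print(f"above half-maximum from {freqs[half].min():.2f} to {freqs[half].max():.2f} Hz")
```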

What if we plot another 1 Hz wave, but force it into an even briefer envelope, like this?

[Figure: the same 1 Hz wave forced into an even briefer envelope.]

We can do the Fourier transform and see how it looks in the frequency domain:

[Figure: the Fourier transform of the briefer pulse - a much broader spectrum, still centered near 1 Hz.]

Ah. It's even more spread out. Despite being in some sense a "1 Hz" pulse, it actually involves frequencies as low as 0.5 Hz and as high as 1.5 Hz. The general principle continues to hold as we compress the pulse more and more. Shorter pulses require a broader range of frequencies. In general, short in one domain means broad in the other. Very short pulses of sound contain such a broad range of frequencies that they can scarcely be said to constitute anything resembling a musical note at all.
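Here's a sketch that makes the trade-off quantitative (my own illustration, again assuming Gaussian envelopes and using RMS widths as the measure of "short" and "broad"): as the envelope is squeezed, the time width shrinks and the frequency width grows, while their product hovers near the theoretical floor of 1/(4π) ≈ 0.08.

```python
# A sketch of the time-bandwidth trade-off: squeeze a Gaussian-enveloped 1 Hz
# pulse and its RMS width in time shrinks while its RMS width in frequency
# grows, their product staying roughly constant.
import numpy as np

fs = 200.0
t = np.arange(-20.0, 20.0, 1.0 / fs)

def rms_width(x, values):
    """RMS width of |values|^2 treated as a distribution over x."""
    w = np.abs(values)**2
    w = w / w.sum()
    mean = np.sum(x * w)
    return np.sqrt(np.sum((x - mean)**2 * w))

for sigma in (2.0, 1.0, 0.5, 0.25):       # envelope widths in seconds
    pulse = np.exp(-t**2 / (2 * sigma**2)) * np.sin(2 * np.pi * 1.0 * t)
    spectrum = np.fft.rfft(pulse)
    freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
    dt = rms_width(t, pulse)
    df = rms_width(freqs, spectrum)
    print(f"sigma={sigma:5.2f} s   dt={dt:.3f} s   df={df:.3f} Hz   dt*df={dt*df:.3f}")
```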

Now that we've seen it, we ought to listen to it. Our 1 Hz example is far below the range of human hearing - or most speakers, for that matter - so we'll use a base frequency of 440 Hz. Here's a 440 Hz tone in an envelope that's 1/2 of a second long, measured from half-maximum on the upslope to half-maximum on the down:

It is a pulse of sound, but it's obviously a nearly pure note without a lot of frequency spread. What if we pack the same wave into a smaller envelope, one that's just 1/10 of a second long?

Shorter, but still clearly identifiable. Quite a few cycles - well, 44 of them - happen in that time, so it's not so shocking we can still pick out what the note is. But what if we make the envelope just 1/100 of a second long?

Hmm. At this point it's possible to calculate that the frequency spectrum is spread almost from 380 to 510 Hz. Compared with a piano keyboard, that's starting to spread into the frequency territory of the neighboring keys. When you listen to it, the pitch is no longer easily identifiable as one definite note. It's more of a "pop" with a hint of its original tone.

Finally, at 1/500 of a second, even the hint of the original tone is gone. Ask a musician to duplicate this note on a piano and they'll just shrug their shoulders:

The frequency content of that little pop spans several octaves. It no longer resembles a pure note at all. The large degree of certainty with respect to the time of the note means a large degree of uncertainty with respect to the frequency of the note.
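If you'd like to generate these pulses and listen for yourself, here's a sketch that writes them out as WAV files. It assumes a Gaussian envelope measured by its full width at half maximum (the exact envelope shape used for the clips isn't specified, so the spectral spreads it reports won't exactly match the figures quoted above), and the file names are my own choice:

```python
# A sketch that synthesizes 440 Hz tone bursts of decreasing length, reports
# the half-maximum spread of each spectrum, and writes each burst to a WAV file.
import numpy as np
import wave

fs = 44100                      # CD-quality sampling rate
f0 = 440.0                      # the base tone, in Hz

def tone_burst(fwhm_seconds):
    """A 440 Hz tone inside a Gaussian envelope of the given half-maximum width."""
    sigma = fwhm_seconds / 2.355          # FWHM of a Gaussian is about 2.355 sigma
    t = np.arange(-1.0, 1.0, 1.0 / fs)
    return np.exp(-t**2 / (2 * sigma**2)) * np.sin(2 * np.pi * f0 * t)

for fwhm in (0.5, 0.1, 0.01, 0.002):      # 1/2, 1/10, 1/100, and 1/500 of a second
    burst = tone_burst(fwhm)
    spectrum = np.abs(np.fft.rfft(burst))
    freqs = np.fft.rfftfreq(len(burst), d=1.0 / fs)
    above = freqs[spectrum >= spectrum.max() / 2]
    print(f"{fwhm:.3f} s pulse: above half-maximum from "
          f"{above.min():.0f} to {above.max():.0f} Hz")
    # write a 16-bit mono WAV file at a moderate level so it can be heard
    samples = np.int16(0.5 * 32767 * burst)
    with wave.open(f"pulse_{fwhm}.wav", "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(fs)
        w.writeframes(samples.tobytes())
```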

It's the uncertainty principle and it's not even quantum!


Can this explanation be extended to the electromagnetic spectrum, specifically, radio frequencies? If I am understanding correctly, very short modulation pulses on a frequency modulated carrier should increase the needed bandwidth, and hence a receiver's need to hear a correspondingly wider range of frequencies. As I understand it, higher frequencies are needed to effectively pack more information into a given bandwidth. Is this another facet of what you describe?

Thanks!

Excellent article.

So well written, even I could follow along - thanks for the clarity!

And so, signal analysis routinely uses the Wigner function. A nice review is Leon Cohen, Proceedings of the IEEE Volume 77, July 1989, pages 941-981, "Time-Frequency Distributions -- A Review". There is no quantum noise/fluctuations in your post, and there's none in the paper I cite above, so there's no Planck constant, which is, needless to say, a big difference.

Quantum fluctuations that are distinct from thermal fluctuations (quantum noise is Lorentz invariant, thermal noise is not; Planck's constant, with units of action, is a measure of the amplitude of quantum fluctuations, in contrast to kT, with units of energy, for the amplitude of thermal fluctuations at temperature T) can be added to classical stochastic signal processing models, in which case the mathematics gets so close that it's arguably the same: EPL 87 (2009) 31002 (doi: 10.1209/0295-5075/87/31002), arXiv:0905.1263v2 [quant-ph]. It's rather nicer to work with random and quantum fields than to go through the usual motions of trying to make classical particle models match up, somehow, with quantum mechanics.

You're exactly right, Blaine. Electromagnetic waves do the exact same thing. Each bit of transmitted information is going to correspond to some "feature" of the wave, with the details depending on the coding scheme and whether it's AM or FM and other particulars. If you want to pack those features together tightly, you have to make them small in the time domain. For that, you need a broad bandwidth.
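As a rough, purely illustrative sketch of that point (my own toy example, not anything from the post or the comments): on-off keying a carrier with random bits, then measuring how wide a band around the carrier is needed to hold most of the power, shows the required bandwidth growing roughly in proportion to the bit rate.

```python
# A sketch of bandwidth versus bit rate: key a carrier on and off with random
# bits and measure the band around the carrier that holds 95% of the power.
import numpy as np

fs = 100_000.0                     # sample rate, Hz
carrier = 10_000.0                 # carrier frequency, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)  # one second of signal
rng = np.random.default_rng(0)

def occupied_bandwidth(bit_rate, power_fraction=0.95):
    """Width of the band around the carrier holding the given power fraction."""
    bits = rng.integers(0, 2, size=int(bit_rate))            # random data
    keying = np.repeat(bits, int(fs / bit_rate))[: len(t)]   # rectangular pulses
    signal = keying * np.sin(2 * np.pi * carrier * t)
    power = np.abs(np.fft.rfft(signal))**2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    order = np.argsort(np.abs(freqs - carrier))   # bins sorted outward from carrier
    cumulative = np.cumsum(power[order]) / power.sum()
    edge = np.searchsorted(cumulative, power_fraction)
    return 2 * np.abs(freqs[order][edge] - carrier)

for rate in (100, 1000, 5000):     # bits per second
    print(f"{rate:>5} bit/s -> roughly {occupied_bandwidth(rate):.0f} Hz of bandwidth")
```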

I had a post a while back along those lines, which you might also be interested in.

If you can play Flight of the Bumblebee on the extreme right side of a full keyboard you can do it on the extreme left side, except it won't work. ΔtΔE is a constant: (Δt)(1/Δt) = 1. Short pulses at low frequency are majorly uncertain in energy. The low frequency output sounds like mud.

Stowing this away for when my class gets to Heisenberg. I often try to do a similar demo using their graphing calculators, but I've yet to be able to throw together enough frequencies at the right amplitudes to get single pulses.

There's a really nice demo here (click on 'courseware' and scroll down to Fourier Transform, click 'demo', then click 'applet') that lets you rotate a 3D amplitude-frequency-time plot from the time axis to the frequency axis to examine the changes. I found it really helped me visualize what the results of calculations would be.

Very nice!

There is also a subjective side related to how we hear sound, so I would like to know what other people heard. To me, the third pulse sounded a bit sharp while the shortest one seemed to have a lower frequency. I can understand the last result based on the energy only managing to move the low frequency hairs in the inner ear, but noticing the higher half of the spectrum more than the lower half must be entirely subjective.

Glad you mentioned AM -vs- FM. Modulating amplitude requires a large frequency bandwidth to broadcast high frequencies on the carrier wave, and so they don't even try. One of the differences between the original Beatles records and the newly remastered ones is that they left in the frequencies that were originally left out due to the limitations of AM radio. The clarity that results is the opposite of what Uncle Al described in his example.

By CCPhysicist (not verified) on 04 Mar 2010 #permalink

I completely agree that many of the ideas behind the relationship of RMS widths of classical waves in time and frequency domains are the same as the ideas behind the position-momentum uncertainty relationship in QM. And you do a nice job of outlining the connection.

But you do a disservice by calling the classical wave properties "uncertainties". There is NOTHING uncertain about the classical wave pulse: it exists (and can be measured) at all those times, and it contains all those frequencies.

What makes the QM version different (and "uncertain") is that when you look for the QM particle, you will only find it in one (random) location, where the RMS uncertainty in where you would find it is analogous to the RMS width of the power (or time) spectrum of the classical pulse.

The distinction is an important enough one that none of my wave books use the term "uncertainty", and instead talk about pulse "time" and "bandwidth".

By Anonymous Coward (not verified) on 04 Mar 2010 #permalink

That's a good point. The quantum and classical theories are fundamentally different in that respect. What I'm getting at is that the mathematics of the time/bandwidth relationship and the position/momentum relationship are identical. The physics and interpretation are different in some important respects, including the one you outline.

BTW: Are people posting under the name "Anonymous Coward" or is it something the system does to anonymous commenters? If the latter, I'm going to have to complain to the bosses. Anonymous commenting is totally OK in my book, though pseudonyms are better since they make discussion easier to follow.

Two things:

1: The analog telephone transmission lines are (were) conditioned to band-stop above 3 kHz, which was considered good enough for voice. But you may remember we used to regularly run 56 kbit/s modems through that very same channel. Dial-up users still do. So, 56,000 bits spread over 3,000 cycles. A little over 18 bits per cycle. How? A technique known as trellis modulation via phase encoding. IIRC, it used a 2400 Hz carrier and 32 bits per cycle. Each bit period of the cycle gets a little disturbance of the sine wave according to the bit value. There's a really neat way of displaying this on an X-Y oscilloscope. It turns out the harmonic distortion this implies just squeaks through the imperfect filters of the telco.

2: A couple of days ago, late at night, on the Science Channel (IIRC, maybe NatGeo?), there was a show on rogue waves (>30 m) causing great damage to shipping but proving impossible to predict. Most were eventually laid at the door of warm currents meeting opposing winds, but there remained the unexplained outliers. I believe it was Leon Cohen, and the Wigner function, that provided the final word on how these waves arise, and proved that they were inherently unpredictable. Interestingly enough, the behavior of wavy systems such as ocean surfaces obeys the same math as waves in quantum fields. Energy chaotically moves between individual waves, occasionally peaking in one while greatly diminishing its neighbors, then redistributing.

I could have remembered the math and mathematician incorrectly, but the results remain clear in my mind. Good show.

By Gray Gaffer (not verified) on 04 Mar 2010 #permalink

Looking at all this brought back memories for me. The signal where the amplitude changes resembles an amplitude modulation scheme.

And phase - you can change the phase of an RF signal by changing values for inductance and capacitance in the oscillator.

This was a wonderful post. I actually downloaded the plugin so I could hear it. The one thing I'd change is to arrange for the short pulses to be repeated so that you could hear them several times in one recording.

By Carl Brannen (not verified) on 04 Mar 2010 #permalink

Re: Matt at #10
I'm the same A.C. that was complaining about quantum/classical issues on your post about distinguishing coherent states of light from thermal states of light.

I am deliberately posting under the name, it wasn't something the system did. Whether or not the system would give me the same name if I didn't enter anything I dunno.

By Anonymous Coward (not verified) on 04 Mar 2010 #permalink

AC: Ah, I see. Fine with me! I know there's been some kvetching about anonymous vs. pseudonymous posting around ScienceBlogs and I wanted to make sure it wasn't an internal filter of some kind - I've seen (non-SB) blogs do that sort of thing.

Carl: Thanks! My biggest worry was making sure the levels were OK. For some reason "demo" sine-wave sounds online tend to be set for ear-splitting levels, so I made sure to keep these reasonable.

Comment #9 more-or-less repeats what I said in my #3 (the part where I say "There is no quantum noise/fluctuations in your post, and there's none in the paper I cite above, so there's no Planck constant, which is, needless to say, a big difference."), but then goes on to something conventional, but unsupportable, "when you look for the QM particle, you will only find it in one (random) location". No to that. When you insert a high gain avalanche photodiode somewhere in an experiment, (1) changing the configuration of the experiment will cause interference effects in other signals; (2) the avalanche photodiode signal will from time to time (by which I mean not periodically) be in the avalanche state (for the length of time known as the dead time). The times at which avalanche events occur will in some cases be correlated with eerie precision with the times at which avalanche events occur at remote places in the apparatus. Although it's entirely conventional to say that a "particle" causes an avalanche event in the avalanche photodiode, that straightjackets your understanding of QFT, and is, besides, only remotely correct if you back far away from any lingering classical ideas of what a "particle" is that aren't explicitly contained in the mathematics of Hilbert space operators and states.

Try saying, instead, "QFT is about modulation of a random signal". The post more-or-less talks about modulation of a periodic signal, but we can also talk about modulation of a Lorentz invariant vacuum state. If we use probability theory to model the vacuum state (we could also use stochastic processes, but that's a different ballgame), the mathematics is raised a level above ordinary signals, in the sense that we have introduced probability measures over the linear space of ordinary signals, as a result of which the tensor product emerges quite naturally.

#5 Uncle Al - but surely that is partially also a limitation of the instrument? I have heard the Canadian Brass perform "Flight of the Tuba Bee" and it seems like with a mixture of dynamic control and highly consistent articulation, it doesn't sound muddy at all. Possibly because the tuba already has a narrower frequency spectrum than the piano?

Thanks for answering my questions. I am off to read the post you linked to...

This first came home to me when I was trying to perform spectral analysis of musical instruments for my additive synthesiser. Many commercial additive synthesisers also feature analysis and resynthesis, and most of them are so keen on getting fine frequency discrimination that they smear the sound out terribly in time.

There is actually a much more common example - the click at the start of any note played with a rapid attack. A Hammond organ, for example, or any digital synthesiser if you set the attack time of the amplitude envelope to 0. (FM and additive synthesisers with their sine wave oscillators show the effect best, of course, especially if you can set the initial oscillator phase to something other than zero.)

One should also bear in mind that the uncertainty creeps in not once but twice when you actually listen to the noise, rather than crunching it with a computer. Firstly, all the hair cells in your ear respond to the sudden onset at once (this is because the energy really is fuzzily distributed, as your article shows). Secondly, your brain is busy trying to find some harmonic pattern in that burst of stimulation, and gives up.

By Ian Kemmish (not verified) on 05 Mar 2010 #permalink
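Ian's point about the zero-attack click is easy to check numerically. Here's a minimal sketch (my own illustration, not from the comment): switch a 440 Hz oscillator in at a non-zero phase with zero attack time, compare it with the same oscillator faded in over 50 ms, and see how much of the energy in each case lands more than an octave above the fundamental.

```python
# A sketch comparing a zero-attack onset with a 50 ms fade-in. Both signals
# share the same 50 ms fade-out, so only the onsets differ.
import numpy as np

fs = 44100
t = np.arange(0.0, 1.0, 1.0 / fs)
tone = np.sin(2 * np.pi * 440.0 * t + np.pi / 2)       # starts at full amplitude
release = np.clip((1.0 - t) / 0.05, 0.0, 1.0)          # shared 50 ms fade-out

abrupt = release * tone                                # zero attack time: a click
smooth = release * np.clip(t / 0.05, 0.0, 1.0) * tone  # 50 ms attack: no click

def energy_above(signal, cutoff=880.0):
    """Fraction of the signal's energy above the cutoff frequency."""
    power = np.abs(np.fft.rfft(signal))**2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return power[freqs > cutoff].sum() / power.sum()

print(f"zero attack:  {energy_above(abrupt):.1e} of the energy above 880 Hz")
print(f"50 ms attack: {energy_above(smooth):.1e} of the energy above 880 Hz")
```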

I'm not getting it. Your first graph shows 10 cycles at an amplitude of 1 unit. Your second graph shows 10 cycles at varying amplitude. The horizontal represents time and the vertical represents amplitude, not frequency. Can frequency be negative? I think I hear your final point, where the tone becomes percussive and loses tonality, as an analogy for uncertainty. I fail to grasp how part of a wave that doesn't span a single cycle can be said to span several octaves.

Re: Peter Morgan at #19

I think the original post (based on the first two paragraphs) was making an analogy between classical wave properties and the position-momentum uncertainty relationship for a massive particle (in an energy regime where the particle number is fixed).

Not an analogy between classical fields and QFT.

In that context I stand by my standard Copenhagen-interpretation picture of the measurement of a particle's position.

By Anonymous Coward (not verified) on 05 Mar 2010 #permalink

Re: Kemmish @20: I think the click is more related to the A (attack) part of the ADSR envelope, and to the initial phase of the signal being ramped up not being phase-related, so there is an initial step function up to the value the oscillator had when it was switched in. This is also an example of the case where the modulating frequency is close to the frequency being modulated, which induces all sorts of harmonic mush. Sometimes that can be useful, as in a ring modulator, for special effects (vocoder?). An experiment: take a sample of a note with some pre-loading. Edit out the partial cycle at the front so the signal starts naturally from 0, and see if it still clicks. My rig is down at the moment, but I'll maybe get to give it a try this weekend.

By Gray Gaffer (not verified) on 05 Mar 2010 #permalink

AC, I'm tempted to call that cowardly. The History of Physics is pulling the Copenhagen interpretation into all sorts of shapes these days, based on the significantly divergent views of Bohr, Heisenberg, Dirac, von Neumann, Wheeler, at a minimum. For example, see Studies In History and Philosophy of Modern Physics, Volume 41, Issue 1, January 2010, Pages 1-8 (doi:10.1016/j.shpsb.2009.08.001), and references therein. It's rather striking that a positivistic philosophy claims that discrete events that one can observe quite directly are caused by objects that one only observes at the point of those discrete events. Of course positivism also has some woes.

I accept your move, but I wonder where it gets you.

Re: Peter Morgan at #24

> I'm tempted to call that cowardly.

Ha! Touche.

I am but a simple experimentalist, and will always choose whatever physics "recipe" seems quickest and easiest. I'll admit that's cowardly. But I've got circuits to solder. And I would claim that the work on decoherence and measurement over the last 2-3 decades has put Copenhagen on slightly more stable ground than it was in von Neumann's time (in addition to advancing alternate interpretations).

> I accept your move, but I wonder where it gets you.

I never received a license from W.E. Lamb (Jr.) to use the word "photon", so in public I try to stick to discussing massive particles in old-timey language (in an energy limit where field theory is unnecessary). However, in private I use the word "photon" all the time with complete abandon.

By Anonymous Coward (not verified) on 05 Mar 2010 #permalink

AC, Surely you don't ignore low frequency modes? The 5000 Volt nanoHz signal will fuse your solder one day. Quick and easy has its attractions, and its virtues, but the spirit of your comments belies the idea that you only choose that way.

Decoherence rightly has its adherents amongst physicists who think beyond their calculations. I happily grant that it's a pretty good answer, even if it's not a way I like enough. I think the realism of decoherence approaches wouldn't have much of Copenhagen at heart, however, were it not that there are so many Copenhagens we can pick and choose from.

From my point of view Lamb was solid on whether we have license to talk about photons. Massive particles, with their quite definite discrete conserved quantities, their Fermionic statistics, and their superselection principles, are definitely different, and much more subtle, but still we only see them in enormous everyday assemblies; or else as thermodynamic events, just as we see photons. The interference patterns amongst the correlations of events that we take to be caused by massive quantum fields are too like those of photons, however, much to celebrate that they are very slightly more like classical particles than photons.

Using the word photon with abandon is of course endemic, and there are very few who have your fine feelings that one should endeavor not to talk that way. It would be good to have another effective shorthand way to talk about experiments. We all know that when we talk about particles there's a tricky translation into the mathematics of Hilbert spaces, states, and operators, even if, with experience, the translation is also to some extent routine. My argument in the literature is aimed towards finding something more direct, as I suppose. Matt's post happens to key into one of the sometime themes of my attempt.

It's been a long time since I studied Fourier series, and there may well be something about their use in physics that I don't understand, however...

My understanding is that the Fourier series is a series of orthogonal functions. They span the (not sure what adjective goes here) space just as conventional orthogonal vectors span Euclidean space. It's an arbitrary, though convenient, set of orthogonal functions to use. Thus, it's correct to say that the original signal can be represented by a linear combination of the Fourier functions, but not really correct to suggest that it is constructed that way, or that the Fourier series reveals some unique decomposition.
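A quick numerical check of the orthogonality described above (my own sketch, not from the comment): sines of different whole-number frequencies average to zero against each other over one period, while each one has a fixed non-zero norm against itself.

```python
# A sketch of Fourier-basis orthogonality over one unit-length period.
import numpy as np

t = np.linspace(0.0, 1.0, 10_000, endpoint=False)   # one period of length 1

def inner_product(m, n):
    """Average of sin(2*pi*m*t) * sin(2*pi*n*t) over the period."""
    return np.mean(np.sin(2 * np.pi * m * t) * np.sin(2 * np.pi * n * t))

print(inner_product(3, 5))   # ~0: different basis functions are orthogonal
print(inner_product(4, 4))   # 0.5: each basis function has the same non-zero norm
```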

I have worked with NMR and CD/ORD and their Fourier transforms. A Stirling engine performs rather well at liquid helium temperatures, and for a time I thought that low-temperature UV spectroscopy, which gives similar spectra with a massive increase in sensitivity, would displace NMR. This is just history.

The point is that as the temperature got really cold, Doppler effects interfered with the Gaussian peak distributions. That is, the peaks then split into three portions, called (for want of a more resonant name) blue, green, and red.

By Ivan Antonowitz (not verified) on 15 Mar 2010 #permalink