In Probability We Trust?

When discussing ways that quantum computing may fail, a common idea is that it may turn out that the linearity of quantum theory fails. Since no one has seen any evidence of nonlinearity in quantum theory, and it is hard to hide this nonlinearity at small scales, it is usually reasoned that these nonlinearities would arise for large quantum systems. Which got me thinking about how well we know that quantum theory is linear, which in turn got me thinking about something totally wacko.

For you see I'm of the school which notes that the linearity of quantum theory, if broken, almost always leads to effects which don't just do crazy things like superluminal signaling, but also, say, radically muck with traditional probability theory. That is, most nonlinear theories lead to things like the ability to amplify probability distributions: there become allowed physical processes which take a mixture of 51 percent 0 and 49 percent 1, and without running repeated experiments, turn this into, say, a mixture of 99 percent 0 and 1 percent 1, but don't do anything to a mixture of 50 percent 0 and 50 percent 1. This seems a bit radical to me, but what do I know?

But what this made me wonder was: how well has classical probability theory been tested? Okay, I know this sounds really silly, right? Classical probability is a deity-given entity sent down from the heavens to mollify those in need of learning measure theory and arguing over frequentist and Bayesian interpretations. But if there is anything that we should learn from quantum theory, it is that while the universe seems to obey our normal ideas about probabilities, at a deeper level our description departs from this in some strange manner. So how confident are we that classical probability holds? And if you believe in nonlinear quantum theories at some large scale, do you believe that these effects won't have a large effect on classical probability?

So how confident are we that classical probability holds?

Interesting question. I wonder how to even test such a thing. Coin flips, dice rolls, card shuffles, and similar physical model systems seem inappropriate because they're not truly random. Things like radioactive decay seem inappropriate because they're more quantum than classical events. And strictly computational approaches seem out, since I understand there's really no such thing as a true random number generator.

In the end, if you found an apparent anomaly, how would you know whether it was a violation of classical probability, versus a consequence of an imperfectly random system?

So how confident are we that classical probability holds?

Wanna gamble? We choose something random with known classical probabilities. I use classical probability theory to set the odds, you use something else. We each choose which side of the other person's bet to take.

Probability theory is a mathematical theory. It can't not work for that reason - anything else gives a "Dutch Book", where bets set using the "true" probabilities are not fair.

Or, to put it another way, if probability theory isn't true, then when you correctly write down all possible outcomes of an event, you might find that something else happens instead. Or you might show that either event A or event B must happen (with probability 1), but you observe event C.

I agree that a nonlinear probability theory would violate some version of the Dutch Book argument (but this is easy to do, I think, by thinking up physical theories that involve nonlinear quantum theory, acausal action, or even restrictive physics of the bettors (imagine physical setups where you can't, due to the laws of physics, take both sides of the book.))

I'm fine with thinking about probability as some Platonic mathematical structure, but what I was thinking about was different. Probability certainly arises, so far, in a fundamental way in our laws of physics. Thus, shouldn't it be testable? But how do we test it? And what makes us so confident in it besides the metaphysical justification given by Dutch book arguments?

Also remind me sometime to write a post about why the Dutch book argument is an affront to financial engineers everywhere :)

I'm still having a hard time understanding what you mean when you say we could test a mathematical theory. Are you really saying we need to test if some real experiment matches the probabilistic model? You can't "test the Mathematical Theory" just like you can't test Peano Arithmetic, right? What are you asking for again?

Nonlinearity at large scales would not indicate an error in probability theory, it would indicate an error in the method being used to apply probability to the quantum mechanical system. If your model is incorrect, it doesn't matter how correct your math is.

Again, I'm not asking whether probability theory is "correct" or not. I'm asking how confident we are in its application in physical experiments.

For example suppose I could design a physical system which behaved like the one I describe above:

If you prepared the system 51 percent of the time in one state and 49 percent of the time in the other state, and then fed it into this crazy assed machine, the first state would come out the other side 99 percent of the time. But if you prepared the system 50 percent of the time in one state and 50 percent of the time in the other state, and then fed it into this crazy ass machine, you would produce the first state 50 percent of the time. Of course somehow such a universe would have to "know" your probability distribution. But certainly if I start mucking with causality and choice, or adding nonlinear evolution to quantum theory, I can end up with such a "machine."
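
(A back-of-the-envelope sketch, mine and not from the post, of why a single use of such a machine would be so powerful: with ordinary sampling, telling a 51/49 preparation apart from a 50/50 one takes on the order of 1/(0.01)^2, i.e. roughly ten thousand, independent runs.)

import random

def empirical_freq(p, n):
    # Fraction of "first state" outcomes in n independent preparations with bias p.
    return sum(random.random() < p for _ in range(n)) / n

random.seed(0)
for n in (100, 1_000, 10_000, 100_000):
    print(n, round(empirical_freq(0.51, n), 3), round(empirical_freq(0.50, n), 3))
# The two empirical frequencies only begin to separate reliably once n approaches
# 1/(0.01)^2 = 10,000 runs; the hypothetical machine would need a single use.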

Qetzel, the way around it is this: you test a multitude of different physical systems that are supposed to behave according to classical probability theory, and if you see the same deviation from the theory in every system you may be on to something. However, considering that casinos do this for a living, my bet is that classical probability theory will hold up just fine.

Also, I am Dutch, but I never heard of that book...

Ok, ultimately you are asking if there is some funky extra operator that we need to apply, beyond the standard one, to get the probabilities out of the Schroedinger equation? That ought to be testable.

See, to me, this 51/49 going to 99/1 seems fine and perfectly doable for many purely classical phenomena too - so I obviously don't understand what you are saying. I have written many generating functions that have "blown up" randomness from skewed distributions to flat ones and vice versa.

Dave,

I would think the fact that statistical mechanics works pretty well can be seen as 'probability in physics' passing (many) empirical tests to very high accuracy. If, e.g., 'physically independent' events could not be treated as 'statistically independent', then we would see that entropy is not a useful concept. But it is, and thus we conclude that probability and physics fit very well (at least in everyday life; things might be different near a gravitational singularity or a closed time-like loop).
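
(A minimal sketch of my own, not Wolfgang's, of the kind of consistency check this points at: if physically independent systems really are statistically independent, their Shannon entropies simply add.)

import numpy as np

def shannon_entropy(p):
    # Shannon entropy, in bits, of a probability vector with no zero entries.
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log2(p))

p = np.array([0.7, 0.3])          # marginal distribution of system A
q = np.array([0.2, 0.5, 0.3])     # marginal distribution of system B
joint = np.outer(p, q).ravel()    # joint distribution if A and B are independent
print(shannon_entropy(joint), shannon_entropy(p) + shannon_entropy(q))  # equal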

this 51/49 going to 99/1 seems fine and perfectly doable

Yes, but at the same time, the same process would take 50/50 to 50/50!

Another example: suppose the two initial distributions are 1/99 and 0/100. And suppose that the process takes 1/99 to 100/0 and 0/100 to 0/100. This would be crazy: it would mean that you could distinguish perfectly, in a single shot, between two distributions which you would normally need on the order of 100 experiments to tell apart.

Here is an example of a nonlinear probabilistic evolution: suppose the initial distribution is p zero and 1-p one (p is between 0 and 1). Then this changes to p^2 + (1-p)^2 zero and 1 - (p^2 + (1-p)^2) = 2p(1-p) one.
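
(To make the nonlinearity concrete, here is a small sketch of my own, not part of the comment: any evolution given by a fixed stochastic matrix must commute with mixing two preparations, and the update above does not.)

def nonlinear_step(p):
    # The update from the comment above: Pr(0) goes from p to p^2 + (1-p)^2.
    return p**2 + (1 - p)**2

p, q, w = 0.9, 0.1, 0.5   # two preparations, mixed with weight w
mix_then_evolve = nonlinear_step(w * p + (1 - w) * q)
evolve_then_mix = w * nonlinear_step(p) + (1 - w) * nonlinear_step(q)
print(mix_then_evolve, evolve_then_mix)   # 0.5 versus 0.82: not a linear map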

Wolfgang, I'm not so sure about statistical mechanics helping (it was one of my first thoughts as well). It's not just a case of nonlinear evolution inducing correlations, really. It seems to be more about whether our notions of how physical systems randomize are correct.

Oh boy ... this topic is one of my favorites.

Here is how I organize my own thinking (very imperfectly).

There are at least two ways that quantum mechanics could be nonlinear -- the quantum equations of motion (Q-EOMs) might be nonlinear, or the Hilbert state-space might be curved.

Making the Q-EOMs nonlinear seems unattractive ... because any modification at all makes it possible to accomplish NP-hard calculations, communicate faster than light, etc. So informatically speaking, modifying the Q-EOMs seems like the wrong way to go.

Making the state-space nonlinear seems *very* attractive ... `cuz hey, it worked in general relativity (GR). As for generalizing the linear Q-EOMs, heck we'll just project them onto the local manifold ... again just like GR.

Of course, this means we can no longer model quantum measurements as von Neumann-style projective collapse processes ... instead we have to model them as drift-and-diffusion processes ... which is more work ... but since this is how physical measurements *really* are accomplished, it's not too much of a price to pay.

We also recall how neatly GR solved the problem of the universe being infinite -- namely, it's not infinite. Similarly, nonlinear QM solves the (equally profound IMHO) difficulty with Hilbert space having infinitely many (high energy) dimensions. Namely, it doesn't have them.

Where I get confused (and everyone gets confused) is generalizing Wheeler's "space tells matter how to move, matter tells space how to curve". How is it, exactly, that matter tells Hilbert space how to curve? This is the kind of thing that Roger Penrose thinks about.

Well, in a quantum trajectory approach to open systems, for either direct photodetection or homodyne detection, the equations are nonlinear. I don't think that the GRW approach, which takes those nonlinear terms to be fundamental, is useful yet, but if one puts in measurement, or loss ports that one assumes eventually get detected by something, you do get nonlinear equations.

But if the Sch. equation, or other wave equations for closed systems are nonlinear, you run into some big problems!

John S is right, the connection between the nonlinear GR eqn's and quantum measurements is the type of thing Penrose talks about.

Not sure if I buy that consciousness is in entanglement in those nerve structures in the brain that he refers to.....

Just to agree with Perry, "orthodox" quantum nonlinearity is present even in the POVM/Kraus operations that are introduced as axioms in Chapter II of Nielsen and Chuang's Quantum Computation and Quantum Information.

This nonlinearity is sufficiently ubiquitous that further quantum nonlinearities are mighty tricky to observe ... quantum computers are machines that (among their other purposes) are well-suited to accomplish this.

This is one of the main reasons that many fundamental questions about quantum linearity are still experimentally open, a full century after the invention of quantum mechanics.

It's quite likely I don't understand the question.
However, I'd say that we, as quantum physicists, know exactly how well classical probability holds. It works right up to the point where quantum mechanics becomes important. I mean, weren't really smart and famous people totally flummoxed because you can't describe quantum mechanics by a classical probability theory? So, isn't that the point at which classical probability breaks down?

But then, ok, I guess you're wondering if classical probability breaks in some way which is not quantum mechanical. But as you said, axiomatic probability is, by definition, axiomatic, so it seems to be up to an experimentalist to find an example of nature violating the axioms - in a non-quantum-mechanical way.

Were I able to find such an example, do you think I'd just post it here in a blog? I mean, that'd make one hell of a dissertation.

By A lowly gradua… on 08 May 2008

Just to conclude a triptych of posts on nonlinear quantum mechanics, let's imagine that linear quantum mechanics *is* true ... and that this has been proven in the best possible way -- by construction (after decades of hard work) of a quantum computer having a computational basis of (say) 1000 qubits ... hence a Hilbert space of 2^1000 complex dimensions ... plus error correction as required.

In the meantime, what about the poor shmendriks who have been studying non-linear quantum mechanics? Their time has not been wasted!

The reason is that non-linear quantum mechanics is so hard to tell from linear quantum mechanics, that (efficient but inexact) non-linear quantum simulations are good for analyzing all kinds of problems in chemistry, biology, and solid-state physics. Good for analyzing pretty much everything in this noisy world of ours, in fact, *except* simulating quantum computers!

Markk said: this 51/49 going to 99/1 seems fine and perfectly doable

I'm fine with this if the process is symmetric--that is, the same process that does this while taking 50/50 to 50/50 also takes 49/51 to 1/99. Matter-antimatter interactions could do something like this, though I'm not sure it could be done in a system where particles are not created or destroyed. But if the nonlinearity takes 49/51 to 49/51 and 50/50 to 50/50, then there is definitely something weird going on.

Dave said: And suppose that the process takes 1/99 to 100/0 and 0/100 to 0/100. This would be crazy

In the observable universe, this would indeed be crazy, but IINM inflation cosmology can produce a result like this. That's how they explain why we have a matter-antimatter imbalance, for instance: some other part of the universe which is beyond the horizon has a corresponding excess of antimatter.

By Eric Lund on 08 May 2008

Were I able to find such an example, do you think I'd just post it here in a blog? I mean, that'd make one hell of a dissertation.

You should totally post a major scientific result as a comment on a blog. Sure you might not get those silly letters Ph.D., but you'd go down in history as one crazy graduate lab monkey.

To quote an old friend "maybe I'm stupid, but I just don't understand" (a) why this issue isn't settled by the success of quantum mechanics and (b) what the objection to nonlinear evolution of probability distributions is.

(a) In particular, Bell-type inequalities tell us directly about how badly classical probability breaks down; see Streater (math-ph/0002049). You can actually quantify the strength of such inequalities by the degree of statistical evidence they provide against classical probability (quant-ph/0307125).

(b) Self-reinforcing urn processes (math.PR/0610076) can do at least some of the tricks which have you worried, like taking 51/49 to 99/1 but leaving 50/50 alone. With a binary variable I have a hard time thinking of an urn scheme which will take 1/99 to 100/0 and 0/100 to 0/100, but this is because the asymptotics turn on the behavior of the corresponding continuous deterministic flows. With three or more outcomes, and so at least a 2D space to work with, it should be fairly easy. (Famous last words, those.)
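
(A rough simulation sketch of my own, not the urn schemes in the cited paper: a reinforced urn in which the chance of adding a 0-ball is a nonlinear function of the current fraction of 0-balls. At the ensemble level a slight 51/49 head start typically gets amplified into a heavy majority of runs ending 0-dominated, while a 50/50 start stays even by symmetry.)

import random

def zero_wins(n_zero, n_one, steps=20_000, rng=random):
    # One urn run: each added ball is a 0-ball with probability
    # x^2 / (x^2 + (1 - x)^2), where x is the current fraction of 0-balls,
    # so whichever color is ahead gets reinforced (1/2 is an unstable point).
    for _ in range(steps):
        x = n_zero / (n_zero + n_one)
        if rng.random() < x**2 / (x**2 + (1 - x)**2):
            n_zero += 1
        else:
            n_one += 1
    return n_zero > n_one

random.seed(2)
runs = 100
print(sum(zero_wins(1275, 1225) for _ in range(runs)) / runs)  # 51/49 start: most runs
print(sum(zero_wins(1250, 1250) for _ in range(runs)) / runs)  # 50/50 start: about half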

Well, if you belong to the school of thought that views quantum theory as a generalization of classical probability theory then we ALREADY know that it is wrong.

Also, I just wanted to point out that if the linearity of quantum theory breaks down then we need to redo the whole von Neumann analysis of the measurement process, which is currently based on a linear theory. Many of the theorems that imply weird things would happen in a nonlinear theory assume that the usual measurement postulates (e.g. Born rule, collapse postulate, etc.) would continue to hold in the nonlinear theory, but it is not clear to me that this should be the case.

I should add that if you are a subjective Bayesian then probability theory isn't the type of thing that can be true or false of the world, but is a normative theory for rational decision making. However, it's only normative in the sense that if you agree that a certain set of decision theoretic axioms characterize your preferences then you should use probability theory to do your decision making. There are different sets of such axioms, but the Savage axioms are the most widely used and are of greater generality than the Dutch Book argument. If you disagree with any of the Savage axioms then you are not constrained to use classical probability for your decision making. Note that one of the Savage axioms is that there is a "state of the world", i.e. a point in a sample space that determines the outcomes of all possible experiments uniquely. For quantum theory, this would amount to saying that there is a deterministic hidden variable theory, which is a postulate you may be inclined to doubt. Dropping the "state of the world" axiom would require some of the other axioms that depend on it to be rewritten, but I'm pretty sure it would lead to a class of probability theories even more general than classical or quantum probability.

Matt Leifer says: If the linearity of quantum theory breaks down then we need to redo the whole von Neumann analysis of the measurement process, which is currently based on a linear theory.

Three remarks: (1) what you say is absolutely true, (2) most of the mathematical tools are at-hand already, and (3) this approach leads to many important practical applications!

To amplify these remarks, it is well-established that von Neumann-style projective operators can be regarded as a coarse-grained limit of fine-grained drift-and-diffusion operations. The sole reason that measurement theory isn't taught this way in undergraduate QM courses is pure pragmatism -- about ten extra weeks are required to cover the basic mathematics of stochastic processes!

Quantum drift-and-diffusion operations, being infinitesimal and covariant, translate very easily onto curved quantum state-spaces. All of the usual QM phenomena survive (uncertainty relations, non-classical correlations, standard quantum limits), rather like all of the usual phenomena of special relativity are still present in GR.

What *doesn't* survive is the coarse-grained notion of von Neumann-style projection operators. These are replaced by local infinitesimal operations just like Pythagoras' theorem is replaced in GR by a local metric.

One of the best things about the Nielsen and Chuang textbook (IMHO) is that their starting quantum postulates (in Chapter 2) are carefully crafted to be consistent with this point of view, in the sense that they refer always to measurement operations rather than projection operators (the latter being a special case of the former). If we restrict our attention to infinitesimal operations (those operations close to the identity ... Carlton Caves calls them "measurement operations of the first class"), then everything in Nielsen and Chuang's textbook applies without change.

It is useful to recall that there are at least *two* reasons for studying non-Euclidean geometry. First, we know that the actual geometry of space-time is not Euclidean. Second, reduced-order approximations of complex systems are non-Euclidean---even in a strictly Euclidean universe---and so non-Euclidean mathematics has plenty of important real-world applications.

Similarly in quantum mechanics. First, it may be the case that the actual state-space of QM has a curved (Kahlerian) geometry. Second, the reduced-order approximations that folks use to analyze complex quantum systems are curved in any case---they *have* to be---and so the mathematics of curved quantum state-spaces finds plenty of real-world applications.

Tastes differ ... there are folks who love Euclidean geometry .. and folks who love Riemannian geometry ... and folks who love *both* kinds of geometry. Similarly, there are folks who love Hilbert quantum mechanics ... and folks who love Kahlerian quantum mechanics ... and folks who love *all* kinds of quantum mechanics. :)

"diety given" should be "deity-given". The other way you get thinner thighs in 30 days for a non-user fee.

Counterarguments: Las Vegas, Atlantic City, Indian casinos; state lottos. Statistical thermodynamics, noise. Device tolerances in (large) circuit boards.

In support: Modern art, politics, economics.

Conclusion: linearity + stupidity = non-linear chaos

a) In particular, Bell-type inequalities tell us directly about how badly classical probability breaks down; see Streater (math-ph/0002049). You can actually quantify the strength of such inequalities by the degree of statistical evidence they provide against classical probability (quant-ph/0307125).

I would say that Bell-type inequalities tell us how local realistic probabilistic theories break down. But there are certainly nonlocal probabilistic theories of quantum theory.
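
(A quick numerical version of the Bell-inequality point, my own sketch rather than anything from the papers cited above: the singlet-state correlations violate the CHSH bound of 2 that any local realistic, i.e. classical-probability, model must satisfy.)

import numpy as np

def E(a, b):
    # Singlet-state correlation between spin measurements along angles a and b.
    return -np.cos(a - b)

a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # ~2.83 = 2*sqrt(2), above the local realistic bound of 2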

(b) Self-reinforcing urn processes (math.PR/0610076) can do at least some of the tricks which have you worried, like taking 51/49 to 99/1 but leaving 50/50 alone. With a binary variable I have a hard time thinking of an urn scheme which will take 1/99 to 100/0 and 0/100 to 0/100, but this is because the asymptotics turn on the behavior of the corresponding continuous deterministic flows. With three or more outcomes, and so at least a 2D space to work with, it should be fairly easy. (Famous last words, those.)

Well, if you could get those latter ones to work you would be able to solve NP-hard problems in one time step! Note I'm requiring one use of the system whose probability distribution is the one given.

For the other example, the point is that if I try to realize this via a simple Markovian process,
51 Pr(0->0) + 49 Pr(1->0) = 99
51 Pr(0->1) + 49 Pr(1->1) = 1
50 Pr(0->0) + 50 Pr(1->0) = 50
50 Pr(0->1) + 50 Pr(1->1) = 50
I get a negative number for one of the transition probabilities.
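
(The same point as a short linear-algebra sketch of my own: solving the system above for the transition probabilities lands well outside [0, 1], so no stochastic matrix does the job.)

import numpy as np

# Unknowns: Pr(0->0) and Pr(1->0); the 0->1 and 1->1 entries then follow
# from normalization, so the four equations reduce to these two.
A = np.array([[0.51, 0.49],
              [0.50, 0.50]])
b = np.array([0.99, 0.50])
print(np.linalg.solve(A, b))  # [25, -24]: not valid probabilities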

http://tf.nist.gov/ion/other/pubs.htm

Reference 26 in the "Others" category on the webpage above contains a link to one test of the linearity of quantum mechanics. They limit any nonlinear perturbation to the nuclear Hamiltonian of a Be+ ion to 4 parts in 10^27 (where the scale is related to the nuclear binding energy).

By schwagpad on 09 May 2008

For the other example, the point is that if I try to realize this via a simple Markovian process,
51 Pr(0->0) + 49 Pr(1->0) = 99
51 Pr(0->1) + 49 Pr(1->1) = 1
50 Pr(0->0) + 50 Pr(1->0) = 50
50 Pr(0->1) + 50 Pr(1->1) = 50
I get a negative number for one of the transition probabilities.

You've hit upon an important point. I'm not sure Markovian processes are always valid and I'm not the only one who thinks this. In fact Oliver Penrose (Roger's bro) wrote about such problems in his book on statistical mechanics back in the late '60's. However, they are clearly not the only problem (e.g. see this paper, which needs a serious rewrite, but emphasizes that the problem isn't just in Markovian processes).

To me, your example seems to resemble some sort of unstable equilibrium state and can be explained classically (heck, it's reminiscent of the old omega problem in cosmology - omega has to be exactly 1 for a flat universe which, amazingly, seems to be the case).

I am not sure what you are talking about, but I think that decoherence and similar imperfections 'correct' quantum mechanical phenomena... And thus there are no phenomena which can't be generated with a bunch of classical spherical bits... Probability doesn't have a definition of "perfect" versus "imperfect" probability, because to check whether a probability is exactly perfect (50% vs 50%) would take infinite time. And nobody can do that.

Dave Bacon says: ... most nonlinear theories lead to things like the ability to amplify probability distributions: there become allowed physical processes which take a mixture of 51 percent 0 and 49 percent 1, and without running repeated experiments, turn this into, say, a mixture of 99 percent 0 and 1 percent 1, but don't do anything to a mixture of 50 percent 0 and 50 percent 1 ...

That is a really interesting question, Dave.

Your example is generically true of quantum theories that are dynamically nonlinear, but IMHO it is not generically true of quantum theories whose nonlinearity resides in the geometry of the state-space.

AFAICT, geometrically nonlinear quantum mechanics exhibits a rather different departure from orthodoxy, as follows.

At the same level of entanglement at which a quantum computer begins to exhibit anomalous noise levels (say at 100 qubits)---for the physical reason that its state-space geometry is running short of Hilbert bases---its specific heat will also begin to exhibit anomalies, for the same physical reason.

That reason being (in the language of renormalization theory) that nonlinear quantum dynamics imposes a dynamical cut-off, whose origins are geometric, not at high energy, but at high entanglement. And yes, AFAICT this cut-off will exert observable (but subtle) effects on the (renormalized) states of physical systems.

I'm no expert on string theory, but isn't this kind of geometric cut-off pretty much the same mechanism by which string theory (however it turns out to work) cures the high-energy divergences of quantum field theory?

This is yet another example of why geometric quantum mechanics (IMHO) is equally interesting to mathematicians, physicists, engineers, and information theorists, for both fundamental and practical reasons.

Those are two interesting articles, Hrvoje.

When I read them with a quantum system engineer's eye, I say to myself "we engineers always assume that orthodox quantum mechanics is true, therefore for us, Hrvoje's formalism has to be an effective theory of an approximate (but orthodox) quantum mechanical formalism ... so what approximate formalism might that be?"

Now, from an orthodox engineering point of view, quantum mechanics has three orthodox state-spaces, which are very much like the three bowls of porridge in Goldilocks and the Three Bears.

The Mama Bear's medium-size state-space is the linear Hilbert space of orthodox quantum mechanics. This is the state-space that we teach students in their first QM course.

The Baby Bear's small-size state space is (of course) the smaller-dimension algebraic manifolds of the Hartree approximation, matrix product states, and the Dirac-Frenkel variational calculus in general.

The Papa Bear's large-size state-space is (of course) the "dictionary" of states that appears in compressive sampling (CS) theory. In general, a CS quantum state dictionary has a much larger dimension than Hilbert space, but since CS theory specifies a priori that we are interested only in sparse states (which are algorithmically compressible), it turns out that CS quantum mechanics too is computationally tractable ... despite CS quantum space having a dimensionality that is vastly larger even than Hilbert space!

Thus all three state-spaces are compatible, and which one we calculate in is largely a matter of personal taste and practical necessity.

As for which state-space is "true" ... well ... for most engineering purposes the three state-spaces yield the same predictions ... and so the really important issue for engineers is which state-space best facilitates robust, efficient, and accurate calculations.

From the preceding point of view, Hrvoje, it seems to me that your dynamical equations most closely resemble the Fokker-Planck equation for the quantum stochastic evolution of a probability density on the curvilinear quantum state-space of (e.g.) coherent states and/or matrix product states.

The above thinking is tightly constrained by the assumption that orthodox linear quantum mechanics is 100% correct ... this may well be a stronger assumption than many people are willing to make!