The last major property of a chaotic system is topological mixing. You can
think of mixing as being, in some sense, the opposite of the dense periodic
orbits property. Intuitively, the dense orbits tell you that things that stay
arbitrarily close together for arbitrarily long periods of time can still end
up with vastly different behaviors. Mixing means that things that are arbitrarily far
apart will eventually wind up looking nearly the same - if only for a little
while.
Let's start with a formal definition.
As you can guess from the name, topological mixing is a property defined
using topology. In topology, we generally define things in terms of open sets
and neighborhoods. I don't want to go too deep into detail - but an
open set captures the notion of a collection of points with a well-defined boundary
that is not part of the set. So, for example, in a simple 2-dimensional
Euclidean space, the interior of a circle is one kind of open set; the boundary is
the circle itself, which is not part of the set.
Now, imagine that you've got a dynamical system whose phase space is
defined as a topological space. The system is defined by a recurrence
relation: s_{n+1} = f(s_n). Now, suppose that in this
dynamical system, we can extend the state function so that it works as a
continuous map over sets. So if we have an open set of points A, then we can
talk about the set of points that that open set will be mapped to by f. Speaking
informally, we can say that if B = f(A), then B is the set of points that points
in A are mapped to.
The phase space is topologically mixing if, for any two open spaces A
and B, there is some integer N such that f^N(A) ∩ B ≠ ∅. That is, no matter where you start,
no matter how far away you are from some other point, eventually,
you'll wind up arbitrarily close to that other point. (Note: I originally left out the quantification of N.)
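To make that a bit more concrete, here's a minimal sketch in Python (my own illustration, not part of the formal definition), using the logistic map f(x) = 4x(1-x), a standard example of a chaotic map on [0,1]. Take a tiny interval A near 0.1 and a distant interval B near 0.9, push sample points of A through f, and report the first n at which the image of A pokes into B. The particular intervals and the iteration cutoff are arbitrary choices for the demo.

    # A sketch of topological mixing, using the logistic map
    # f(x) = 4x(1-x), which is chaotic on [0, 1].
    # A is a tiny open interval near 0.1; B is a distant one near 0.9.
    # We iterate sample points of A and report the first n at which
    # the image of A intersects B.

    def f(x):
        return 4.0 * x * (1.0 - x)

    # Sample points from the open interval A = (0.100, 0.101).
    points = [0.100 + 0.001 * i / 1000.0 for i in range(1, 1000)]

    B_lo, B_hi = 0.900, 0.901   # the open interval B

    for n in range(1, 200):
        points = [f(x) for x in points]
        if any(B_lo < x < B_hi for x in points):
            print("f^%d(A) intersects B" % n)
            break

On a typical run the overlap shows up within a few dozen iterations, even though A and B start far apart and A is only a thousandth of the interval wide.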
Now, let's put that together with the other basic properties of
a chaotic system. In informal terms, what it means is:
- Exactly where you start has a huge impact on where you'll end up.
- No matter how close together two points are, no matter how long their
trajectories stay close together, at any time, they can
suddenly go in completely different directions (see the sketch after this list).
- No matter how far apart two points are, no matter how long
their trajectories stay far apart, eventually, they'll
wind up in almost the same place.
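The second bullet is just sensitive dependence on initial conditions, and it's easy to watch numerically. A hedged sketch, using the same logistic map as above (the starting points and the divergence threshold are arbitrary choices):

    # Sensitive dependence: two starting points differing by 1e-10
    # under the logistic map f(x) = 4x(1-x).

    def f(x):
        return 4.0 * x * (1.0 - x)

    a, b = 0.3, 0.3 + 1e-10
    for n in range(1, 61):
        a, b = f(a), f(b)
        if abs(a - b) > 0.1:
            print("trajectories diverged by step %d" % n)
            break

Since the separation roughly doubles per step on average for this map, a difference of 10^-10 becomes order-one after only about 33 iterations.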
All of this is a fancy and complicated way of saying that in a chaotic
system, you never know what the heck is going to happen. No matter how long
the system's behavior appears to be perfectly stable and predictable, there's
absolutely no guarantee that the behavior is actually in a periodic orbit. It
could, at any time, diverge into something totally unpredictable.
Anyway - I've spent more than enough time on the definition; I think I've
pretty well driven this into the ground. But I hope that in doing so, I've
gotten across the degree of unpredictability of a chaotic system. There's a
reason that chaotic systems are considered to be a nightmare for the numerical
analysis of dynamical systems: even the most minuscule errors - in measurement,
in representation, in computation - will eventually produce drastic divergence.
So when you build a model of a chaotic system, you know that it's going to
break down. No matter how careful you are, even if you had impossibly perfect measurements,
just the nature of numerical computation - the limited precision and roundoff
errors of numerical representations - means that your model is going to break.
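One way to watch this happen, as a sketch (assuming NumPy is available; the map, starting point, and threshold are arbitrary choices): run the same recurrence from the same starting point in single and double precision, and treat each one as a "model" of the other. Roundoff alone is enough to make them disagree completely.

    # Roundoff alone is enough to break a model of a chaotic system:
    # iterate the same map, from the same start, in single and double
    # precision, and watch the two "models" part ways.
    import numpy as np

    x32 = np.float32(0.3)
    x64 = np.float64(0.3)
    for n in range(1, 101):
        x32 = np.float32(4.0) * x32 * (np.float32(1.0) - x32)
        x64 = 4.0 * x64 * (1.0 - x64)
        if abs(float(x32) - float(x64)) > 0.1:
            print("single and double precision disagree by step", n)
            break

The initial difference here is nothing but the float32 representation error of 0.3 - roughly 10^-8 - and within a few dozen steps the two trajectories have nothing to do with each other.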
From here, I'm going to move from defining things to analyzing things. Chaotic
systems are a nightmare for modeling. But there are ways of recognizing when
a system's behavior is going to become chaotic. What I'm going to do next is look
at how we can describe and analyze systems in order to recognize and predict
when they'll become chaotic.
A concrete example of a chaotic system would be enormously helpful here. I'd suggest something like the logistic map with parameter r=4. (Anyone can understand open sets in 1D! Well, maybe not JG.)
Ah, fifth paragraph, you say "for any two open spaces" when I think you mean "open sets".
I can feel it coming, just around the corner: imminently, we shall encounter the dreaded (don't make me write it...) Hopf bifurcation. DUM DUM DUUUUM!!
"Chaotic systems are a nightmare for modeling."
One can model chaotic systems quite easily. You just have to ask the right questions. It makes no sense to try to model a single trajectory in a chaotic system. But the statistical properties of the attractor are well defined and easy to model.
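[For what it's worth, here's a small sketch of this point - my own Python illustration, with arbitrary parameter choices. For the logistic map at r = 4, the long-run statistics of a trajectory are known in closed form - the invariant density is 1/(π·sqrt(x(1-x))) - even though the trajectory itself is unpredictable:]

    # A single trajectory of the logistic map is unpredictable, but
    # its long-run statistics are not.  For r = 4 the invariant
    # density is known: p(x) = 1/(pi * sqrt(x(1-x))).
    import math

    def f(x):
        return 4.0 * x * (1.0 - x)

    x = 0.3
    for _ in range(1000):        # discard the transient
        x = f(x)

    bins = 10
    counts = [0] * bins
    for _ in range(1000000):     # accumulate statistics
        x = f(x)
        counts[min(int(x * bins), bins - 1)] += 1

    for i, c in enumerate(counts):
        lo, hi = i / bins, (i + 1) / bins
        # expected mass of the bin: the CDF of p(x) is (2/pi)*asin(sqrt(x))
        expected = 2.0 / math.pi * (math.asin(math.sqrt(hi)) - math.asin(math.sqrt(lo)))
        print("bin %d: observed %.4f, predicted %.4f" % (i, c / 1000000.0, expected))

The observed bin frequencies match the predicted ones to a few decimal places, despite the fact that no individual step of the trajectory can be forecast for long.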
Mark, in "The phase space is topologically mixing if, for any two open spaces A and B, f^N(A) ∩ B ≠ ∅," you didn't quantify the N.
It's great to get to this point. I've been enjoying your presentations on Chaos Theory. One of my main co-authors started telling me about 15 years ago that he considered the three biggest advances in Physics of the 20th Century to be Relativity, Quantum Mechanics, and Chaos. I thought it was a rhetorical trick to motivate his grad students and postdocs. But, after publishing some papers with him, and reading more deeply, I tentatively accept his prioritization. I've applied Chaos Theory to Mathematical Biology and Mathematical Economics with him. Eye-opening, mind-expanding stuff, perhaps especially so for someone raised in a semi-classical education (i.e., Newton, Laplace, et al.). Thanks again!
I'm a bit confused by the definition of mixing, for two reasons.
1. The empty set is open, but if we take B = empty, it makes the definition unsatisfiable.
2. You say "they'll wind up in almost the same place", but the definition only seems to iterate f on A, not on B. Maybe it's f^N(A) meets f^N(B), rather than just B?
Any resemblance of this to your job interview at IBM is purely coincidental!
I hope it's ok if I ask an off-topic question, but here's one that's got me curious and has stumped my teachers so far:
If you pick a random number from the reals, in a given range, say [0, 1], what is the probability that it will be a rational number?
@8:
0.
@7:
Welcome, Bard! Great to see you here.
You're right - in addition to fouling up the quantification, I forgot to say that both A and B must be non-empty sets.
In terms of the rest, I'm pretty sure I've seen topological mixing formulated in two different ways, which ultimately wind up being equivalent: one is the form I showed above, where A is continually mapped through the system's transition function, and B is fixed; the other maps both through the transition function. The key thing is that after a long enough time, whether you keep B fixed, or you allow B to transform, they're going to overlap.
And please stop worrying about the IBM interview. For goodness sake, it's been something like 15 years! I got over it ages ago :-).
You have to say what you mean by random. When working with real numbers, this means you need to specify a probability density function.
Mark made the natural assumption that you meant the uniform distribution on [0, 1], in which case the answer is zero because the rational numbers have measure zero. But it's perfectly possible to have a distribution with spikes (like a Dirac delta function) at particular numbers, so the answer to your question is not necessarily zero.
With a strictly continuous probability density function (pdf), the probability that an observation of X, a random variable that takes values in [0,1] (or anywhere else on the real line), will have a rational value is 0. But, as Ivan mentioned, many distributions encountered in real life assign positive probability to individual numbers. For example, it may be that there is a probability of 0.5 that the value will be exactly 0.25, and then probability 0.5 that X takes values in [1/3,1].
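[For anyone who wants the standard argument behind the measure-zero claim, here's the usual covering sketch - my addition. List the rationals in [0,1] as q_1, q_2, q_3, ..., and cover each q_i with an open interval I_i of length ε/2^i. Then:]

    \Pr\bigl[X \in \mathbb{Q} \cap [0,1]\bigr]
      \;\le\; \sum_{i=1}^{\infty} \operatorname{len}(I_i)
      \;=\; \sum_{i=1}^{\infty} \frac{\varepsilon}{2^i}
      \;=\; \varepsilon

Since this holds for every ε > 0, the probability is 0 - under the uniform distribution, or any other distribution that has a density.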
The idealized dartboard, where you define it with Cartesian coordinates, and the probability of hitting any given coordinate (x,y) is 0... kinda blew my mind a while ago.
Another counter-intuitive probability result I stumbled upon the other night: Consider an experiment where I continuously flip a coin until I get ten heads in a row, after which I terminate the experiment. Let X be the number of times I end up flipping the coin before this sequence of ten heads begins. What is the most likely value of X?
The answer, of course, is zero, because even though the odds of that are 0.5^10, the odds of any other single value of X are even smaller (e.g. the odds of X=1 are 0.5*0.5^10, since you have to get a tails first. Actually, the odds of X=1..10 are all 0.5^11, because it's a bunch of don't-care flips, followed by one that must be tails, followed by the ten heads in a row. For X>10, it gets more complicated, but you can see the value still has to be smaller than 0.5^10....)
Blatantly obvious of course when you think about it, but I'll bet most people would get it wrong.
Maybe an even more fun way of phrasing the question is:
If I start playing the lottery every day beginning on January 1st, 2011, and I don't stop playing until I win, on what day am I most likely to win?
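[The claim above is easy to verify exactly with a little dynamic programming - a Python sketch, my own, with an arbitrary cutoff. Among sequences that contain no complete run of ten heads yet, track the probability of each "current trailing run of heads" state; P(X = k) is then the probability of being in state 0 after k flips, times (1/2)^10:]

    # Exact check: X = number of flips before the first run of 10
    # heads begins.  state[j] = probability that, after k flips,
    # there's been no full run yet and the trailing head-run is j.
    RUN = 10
    state = [1.0] + [0.0] * (RUN - 1)   # k = 0: trailing run is 0

    probs = []
    for k in range(0, 31):
        probs.append(state[0] * 0.5 ** RUN)   # P(X = k)
        new = [0.0] * RUN
        new[0] = 0.5 * sum(state)              # tails resets the run
        for j in range(RUN - 1):
            new[j + 1] = 0.5 * state[j]        # heads extends the run
        state = new

    for k, p in enumerate(probs):
        print("P(X = %2d) = %.3e" % (k, p))

The output confirms the comment: P(X=0) = 2^-10 is the largest, P(X=1) through P(X=10) all equal 2^-11, and the probabilities only shrink from there - which is also why, in the lottery version, the single most likely day to win is the very first one.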
@13, 14
Yeah, our intuitions seem to be much better at estimating the expected value of a quantity rather than the single value which is most likely.
In the language of probability distributions, we're better at intuiting the mean than the mode.
@9
This reminds me of a TV commercial I saw recently, with two men in business suits standing on a corner. One starts to walk and the other stops him, saying "There's a 21% chance that car is going to run the yellow light,", and the car does. "There's a 7.5% chance that bike is going to turn without signalling," and the bike turns. "There's a 0.2% chance that-"
At that moment, a man riding an ostrich runs by.
"What are the chances of that happening?" the second guy asks.
"Hundred percent. Now."
For an insurance company, I think.
Nice way of articulating it.
According to Wikipedia, you've settled for topologically transitive, whereas their definition of topologically mixing seems to say that the open sets continue to intersect for all n > N. Does that mean that the recurrence function essentially makes A cover the entire phase space after N iterations?
"In mathematics you don't understand things. You just get used to them."
John von Neumann, quoted in G. Zukav, The Dancing Wu Li Masters.
Is the universe just?