Monday, August 16, 2010

Distinguishing Randomness from Chaos

I for one always get confused when trying to distinguish randomness from chaos, so the following is an attempt to understand the two.

A chaotic system is a complex nonlinear system that is highly sensitive to initial conditions. These systems are not random or disordered but are governed by deterministic rules. Simplified models of such systems generate erroneous results because small initial errors accumulate during iteration.
So... in other words, ‘chaotic’ systems are just highly complex and cannot be modelled accurately using simplified models or approximate initial data. They were simply labelled chaotic because, back in the day, scientists couldn’t deal with such complexity and assumed the systems were unpredictable.
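A toy illustration of this sensitivity (my own sketch in Python, using the logistic map x → 4x(1−x) as a stand-in for a chaotic system; it is not from the post itself): two trajectories that start almost identically are driven far apart after a few dozen perfectly deterministic iterations.

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r*x*(1-x) with r = 4, a standard example of a chaotic map.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-10   # nearly identical initial conditions
max_gap = 0.0
for step in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

# The rule is deterministic, yet an initial error of 1e-10 grows until
# the two trajectories are effectively unrelated.
print(max_gap)
```

Note that the map itself is about as simple as a formula gets; the unpredictability comes entirely from the error amplification, not from any structural complexity.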

In contrast, the random walks we’ve been looking at this week are not individually predictable, but they are associated with a probability distribution. Using this distribution, predictions can be made about the properties of a collection of random walks.
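A quick simulation makes the contrast concrete (a sketch assuming the simple symmetric ±1 walk, since the post doesn’t specify which walks were studied): no single walk can be predicted, but the ensemble statistics can, e.g. mean displacement near 0 and mean squared displacement near the number of steps.

```python
# Ensemble statistics of simple symmetric random walks:
# each step is +1 or -1 with equal probability.
import random

def walk(n_steps, rng):
    pos = 0
    for _ in range(n_steps):
        pos += rng.choice((-1, 1))
    return pos

rng = random.Random(42)        # seeded for reproducibility
n_steps, n_walks = 1000, 2000
finals = [walk(n_steps, rng) for _ in range(n_walks)]

mean = sum(finals) / n_walks               # theory: 0
mean_sq = sum(p * p for p in finals) / n_walks   # theory: n_steps
print(mean, mean_sq)
```

So the individual trajectories are genuinely random, yet the distribution over many walks is lawful and predictable, which is exactly the opposite situation to chaos: there, the individual trajectory is lawful and the long-run prediction is what fails.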


  1. Leonard Susskind gives a nice summary of these ideas in his Statistical Mechanics lectures.

    You're right, chaotic evolution is a property of deterministic dynamical systems. One of the characteristic properties of chaotic evolution is that two systems that are "near" each other (in phase space, let's say) at some time become very far apart at some other time.

    Visually, it happens like this: you start out with a compact & connected region of phase space (the region enclosed in a generalized sphere, let's say). We know that the equations of motion defining how each point in the region will evolve preserve the "volume" of the total region. But this does NOT imply that the region must stay "whole" - it can split apart, and bits of it can end up in very different regions after some time. Eventually, the region becomes so spread out that it essentially becomes more like a fractal than like a sphere. At this point, you need essentially infinite precision in the specification of the initial state in order to map it onto the final state.

    The telltale sign of chaos is usually taken to be an exponential divergence between two points initially close together, as time evolves. This is the idea underlying the "Lyapunov exponent".

    This is what it means to be "chaotic": you could follow the dynamics in principle, but you need to beat exponential divergence in order to do so. This is very hard.
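The Lyapunov exponent mentioned above can be estimated numerically. A common sketch (my addition, not from the comment) for the logistic map at r = 4, where the exact value is known to be ln 2 ≈ 0.693, averages log|f′(x)| along a long trajectory:

```python
# Lyapunov exponent of the logistic map f(x) = r*x*(1-x) at r = 4,
# estimated as the trajectory average of log|f'(x)|, f'(x) = r*(1 - 2x).
import math

r = 4.0
x = 0.3
for _ in range(100):          # discard the initial transient
    x = r * x * (1.0 - x)

total, n = 0.0, 100_000
for _ in range(n):
    total += math.log(abs(r * (1.0 - 2.0 * x)))
    x = r * x * (1.0 - x)

lyapunov = total / n
print(lyapunov)   # should come out close to ln(2) ~ 0.693
```

A positive exponent means nearby states separate roughly like exp(λt), which quantifies the "exponential divergence you must beat" in order to follow the dynamics.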

  2. the Wikipedia page on chaos theory stresses that chaotic dynamics must have three characteristics.

    1. it must be sensitive to initial conditions;
    2. it must be topologically mixing; and
    3. its periodic orbits must be dense.

    it notes:

    Topological mixing is often omitted from popular accounts of chaos, which equate chaos with sensitivity to initial conditions. However, sensitive dependence on initial conditions alone does not give chaos. For example, consider the simple dynamical system produced by repeatedly doubling an initial value. This system has sensitive dependence on initial conditions everywhere, since any pair of nearby points will eventually become widely separated. However, this example has no topological mixing, and therefore has no chaos. Indeed, it has extremely simple behaviour: all points except 0 tend to infinity.