"The disorder in a sequence reflects its probability"

This was a good quote from the text to focus on to help understand disorder and entropy better. As the text states, the weather is more predictable because each state recalls previous states, and as a result its disorder is lower. Likewise, the disorder of flipping a coin is higher, because each flip does not recall the previous one, so the predictability is lower.

Just as a side note though, while I do see this as a good example for understanding the relationship between disorder and predictability, in a real-life context a person with a consistent style of flipping a coin might let us infer higher predictability for the next flip once they start to show consistent trends.

The next point the text brought up was the idea of correlated and uncorrelated events and how their collective amounts of disorder are related. In correlated events (such as reading lecture notes and listening to the lectopia of the same notes), one event can predict the other, so the collective amount of disorder cannot equal the sum of the disorders of the two separate events. However, in uncorrelated events (such as throwing a textbook out the window out of frustration at failing an exam to see how far it goes, and downing various types of shots just because it's a Friday night), neither event can predict the other, so the collective amount of disorder WILL be equal to the sum of the disorder in each of the separate events.
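That additivity claim can be checked numerically. Below is a minimal sketch in Python (my own made-up example, not from the text): two fair coin flips stand in for the uncorrelated pair, and a second outcome that always copies the first stands in for a perfectly correlated pair.

```python
import math
from itertools import product

def entropy(probs):
    """Shannon disorder (entropy) in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uncorrelated events: two fair coin flips.
# The joint distribution is the product of the marginals.
coin = [0.5, 0.5]
joint_uncorrelated = [px * py for px, py in product(coin, coin)]

# Perfectly correlated events: the second outcome always copies the first,
# so only (H, H) and (T, T) can occur.
joint_correlated = [0.5, 0.0, 0.0, 0.5]

# Uncorrelated: joint disorder equals the sum of the separate disorders (2 bits).
print(entropy(joint_uncorrelated), entropy(coin) + entropy(coin))

# Correlated: joint disorder (1 bit) falls short of that sum.
print(entropy(joint_correlated))
```

The correlated case comes out a full bit short of the sum, which matches the intuition above: once you know the first event, the second tells you nothing new.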

We started off with the amount of disorder per message being:

I = N log2(M)

which, after applying a number of conditions listed on pages 196-197, leads to Shannon's formula:

I/N = -K * [sum from j = 1 to M of P(j) * ln P(j)]
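To see how the two formulas connect, here is a minimal sketch in Python (the probabilities are made-up examples, not values from the text). With K = 1/ln(2) the answer comes out in bits, and the equiprobable case collapses back to log2(M):

```python
import math

def disorder_per_message(probs, K=1 / math.log(2)):
    """Shannon's formula I/N = -K * sum_j P(j) * ln P(j).

    K = 1/ln(2) expresses the disorder in bits."""
    return -K * sum(p * math.log(p) for p in probs if p > 0)

# Equiprobable case, M = 4: the formula reduces to log2(M) = 2 bits
# per message, recovering I = N log2(M).
print(disorder_per_message([0.25] * 4))   # ≈ 2.0 bits

# A biased (more predictable) source has lower disorder:
print(disorder_per_message([0.9, 0.1]))  # ≈ 0.47 bits
```

This also illustrates the opening quote: the biased source is more predictable, and its disorder per message drops well below the fair-coin value of 1 bit.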

I still think consecutive flips of an actual coin would be independent events. Even if you could determine that a person's flipping style gives a marginally higher chance of an even number of turns than an odd number, the coin won't necessarily start on the same side each time, so this wouldn't make consecutive flips correlate with each other.

Coin flipping is a good example of chaos theory. If you could narrow down all the factors affecting the coin flip (such as the starting position, the force of the flip, the rate of spin, the distance to the ground, the dimensions/regularity of the coin, environmental variables, etc.), it would be possible to accurately predict whether heads or tails will occur.

But according to chaos theory, we can only predict a chaotic event a limited distance into the future before the chaos takes over again. This is because we can never know the initial conditions perfectly. If we improve the accuracy of our initial conditions, we can extend our predictions further into the future, but we cannot predict the outcome indefinitely.
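A rough way to illustrate this (a toy model I'm assuming for illustration, not anything from the text) is the logistic map in its chaotic regime: two trajectories starting a millionth apart agree at first, but after a few dozen steps they disagree completely.

```python
def logistic_trajectory(x0, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x); r = 4 is chaotic."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000, 50)
b = logistic_trajectory(0.400001, 50)  # initial error of one part in a million

# Early on the trajectories still agree closely...
print(abs(a[5] - b[5]))

# ...but by steps 40-50 they have diverged to order 1:
# prediction has broken down.
print(max(abs(x - y) for x, y in zip(a[40:], b[40:])))
```

The tiny initial error roughly doubles each step, which is exactly the "limited distance into the future" behaviour: better initial measurements buy you more steps, but never an indefinite prediction.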

If we measure all the variables Heather stated before every flip, then we might be able to predict the outcome. But that's a lot of effort. I still think the flipping of a coin should be considered an independent event, assuming the person is not trying to bias the flip.

But while that may seem like a lot of effort, we have to remember that even weather forecasting takes into account a number of variables (humidity, air pressure, wind, the weather of surrounding areas, recent weather patterns), measured both around the time the prediction is made and over the couple of days leading up to it. But yes, it does seem like a lot for a coin flip; I didn't realise my suggestion would be looked at this deeply, to the point of bringing in chaos theory.
