"The disorder in a sequence reflects its probability"
This was a good quote from the text to focus on, as it helps clarify the relationship between disorder and entropy. As the text states, the weather is more predictable because each state recalls previous states, and as a result its disorder is lower. By contrast, the disorder of flipping a coin is higher: each flip does not recall the previous one, so the sequence is less predictable.
Just on a side note though: while I do see this as a good example for understanding the relationship between disorder and predictability, in a real-life context a person with a consistent style of flipping a coin might show trends that let us predict the next flip with better-than-even odds.
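That side note can actually be made quantitative. A minimal sketch (the 90/10 bias is just an assumed number for illustration): a fair coin carries the maximum disorder of 1 bit per flip, while a flipper whose "consistent style" lands heads 90% of the time produces a more predictable, lower-entropy sequence.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: -sum of p * log2(p), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit of disorder per flip.
fair = entropy_bits([0.5, 0.5])

# A flipper with a consistent style (say, heads 90% of the time) is more
# predictable, so the disorder per flip drops well below 1 bit.
biased = entropy_bits([0.9, 0.1])

print(fair)    # 1.0
print(biased)  # about 0.469
```

Any consistent bias, however slight, pushes the per-flip disorder below the fair-coin maximum, which matches the intuition that predictability and disorder move in opposite directions.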
The next point brought up by the text was the idea of correlated and uncorrelated events and how their collective amounts of disorder are related. In correlated events (such as reading lecture notes, and listening to the Lectopia recording of the same lecture), one event can partly predict the other, so the collective disorder cannot equal the sum of the disorders of the two separate events. However, in uncorrelated events (such as throwing a textbook out the window in frustration after failing an exam to see how far it goes, and downing various types of shots just because it's a Friday night), neither event can predict the other, so the collective disorder WILL be equal to the sum of the disorder in each of the separate events.
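This additivity rule is easy to check numerically. A small sketch (the two toy distributions below are my own assumed examples, not from the text): for two independent fair bits, the joint disorder equals the sum of the individual disorders; for two perfectly correlated bits, it falls short of the sum.

```python
import math
from collections import Counter
from itertools import product

def entropy_bits(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two uncorrelated fair bits: all four (x, y) pairs are equally likely.
indep = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

# Two perfectly correlated bits: y always equals x.
corr = {(0, 0): 0.5, (1, 1): 0.5}

def marginal(joint, axis):
    """Sum the joint distribution over the other variable."""
    m = Counter()
    for pair, p in joint.items():
        m[pair[axis]] += p
    return list(m.values())

for name, joint in [("uncorrelated", indep), ("correlated", corr)]:
    h_joint = entropy_bits(joint.values())
    h_sum = entropy_bits(marginal(joint, 0)) + entropy_bits(marginal(joint, 1))
    print(name, h_joint, h_sum)
# uncorrelated: joint disorder 2.0 = 1.0 + 1.0 (disorders add)
# correlated:   joint disorder 1.0 < 1.0 + 1.0 (one event predicts the other)
```

The correlated case wastes nothing by being redundant in probability terms, but it carries only half the collective disorder, exactly as the text's lecture-notes/Lectopia example suggests.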
We started off with the amount of disorder per message (for M equally likely messages) being:

I/N = K * ln M

which, after applying a number of conditions listed on pages 196 - 197, led us to Shannon's formula for messages with unequal probabilities:
I/N = -K * [Sum (from j = 1 to M) of P(j) * ln P(j)]