One equation I liked from this chapter is S(E) = k_B ln(Ω(E)). This equation relates the number of possible states available to a particular system (Ω) to a physically measurable quantity (entropy, S). I think there is something very profound here. Because the number of states available to most physical systems we encounter is so huge, it is hard to imagine different systems having more or fewer available states. So the idea of being able to quantify the number of states available to a system, and to measure it by measuring the entropy of the system, really impresses me.
This equation can be used to get an idea of the number of microstates available to a system. Not only has the natural log of the number of microstates been taken, but the result is then multiplied by a constant with an order of magnitude of 10^-23. If we rearrange this equation to get Ω by itself, we find Ω(E) = exp(S(E)/k_B). A system with entropy 1 J/K would therefore have approximately e^(7.24×10^22) microstates, which is bigger than any calculator I can find will calculate. To give a comparison, even exp(7.24×10^3) ≈ 10^3144.
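Since Ω itself overflows any floating-point representation, a quick way to get a feel for its size is to compute log10(Ω) = S/(k_B ln 10) instead. A minimal sketch (using the CODATA value of Boltzmann's constant):

```python
import math

# Boltzmann's constant in J/K (CODATA exact value)
k_B = 1.380649e-23

# Rearranged Boltzmann entropy: Omega = exp(S / k_B).
# Omega overflows a float, so work with log10(Omega) = S / (k_B * ln 10).
S = 1.0  # entropy in J/K
log10_omega = S / (k_B * math.log(10))

print(f"Omega ~ 10^{log10_omega:.3g}")  # roughly 10^(3.15e22)
```

So a system with 1 J/K of entropy has on the order of 10^(3×10^22) microstates, i.e. the *exponent* itself has twenty-three digits.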
So is 1 J/K of entropy a ridiculous amount of entropy? Let's do some rough calculations. We know that T = (dS/dE)^-1 ≈ (∆S/∆E)^-1 = ∆E/∆S, so ∆S = ∆E/T. Consider a beaker of one litre of water (i.e. 1000 g). An energy change of 4.18 kJ increases its temperature by 1 K. This applies to liquid water, and is constant enough across the temperatures of liquid water for this rough investigation. If we say the water is at 300 K (27 °C), then the change in entropy for this system is ∆S = 4.18 kJ / 300 K ≈ 14 J/K. So an entropy of 1 J/K is not a ridiculous amount of entropy.
This is only a rough calculation, but it shows that 1 J/K is, on the macro scale, a pretty small amount of entropy. However, it translates into an absolutely mind-blowing number of microstates for the system. This also explains why worrying about statistical fluctuations on the macro scale is a foolish exercise.