This is at least the third time I have been introduced comprehensively to thermodynamics. I think that this is a good thing, as I always forget something in between introductions. However, I always find something a little off about the discussion of entropy. I understand that systems go from highly ordered to random if no external work is done on the system. This makes intuitive sense. But I wonder, is it always true?

There are always many, many more disordered states that a system can be in than ordered ones, which I have always found is the justification that an ordered system will decay to randomness: if the particles are moving randomly, the system is far more likely to be found in one of the vastly more numerous disordered states. But my concern is this: isn't it possible for the random movement of particles to land the system, even if only temporarily, in a state lower in entropy than the one it started in? Even if it is very, very unlikely? Isn't it possible that a box containing O2 and N2 gas molecules will, after an extremely long period of time, happen across a state where all the O2 molecules are on one side of the box and all the N2 molecules on the other, even if that state only lasts for an instant? Furthermore, when this state does occur, is it accompanied by absorption of heat from the surroundings?
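Just to put a number on "very, very unlikely", here is a toy estimate of my own (assuming each molecule is independently equally likely to be in either half of the box, which ignores the O2/N2 sorting and any interactions):

```python
# If each of N molecules is independently in either half of the box with
# probability 1/2, then finding all N on one chosen half has probability (1/2)**N.
for n in (10, 100, 1000):
    p = 0.5 ** n
    print(f"N = {n:>4}: P(all on one chosen side) = {p:.3e}")
```

Already at N = 1000 the probability is around 10^-301, and a real box holds on the order of 10^23 molecules. Still, improbable is not the same as impossible, which is exactly my question.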
Another case I think of is a body that consists of only a few particles (perhaps 100), at some non-trivial temperature, with all the atoms vibrating in random directions. If the direction of vibration of each atom is independent, and if the object is left for long enough, could there be a random synchronisation of the motions of the atoms, even for only a moment, that results in the body moving as a whole in a single direction? Obviously this small amount of kinetic energy would quickly be turned back into heat through friction, but is this a possible example of heat energy spontaneously converting into kinetic energy?
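A rough sketch of what I mean (my own toy model: unit-speed atoms vibrating in uniformly random 2D directions, as a crude stand-in for thermal motion):

```python
import math
import random

# N atoms, each with unit speed in a random 2D direction. The centre-of-mass
# speed per atom is the length of the vector sum divided by N; by the usual
# random-walk argument it shrinks like 1/sqrt(N).
random.seed(0)

def com_speed_per_atom(n_atoms):
    vx = vy = 0.0
    for _ in range(n_atoms):
        theta = random.uniform(0.0, 2.0 * math.pi)
        vx += math.cos(theta)
        vy += math.sin(theta)
    return math.hypot(vx, vy) / n_atoms

for n in (100, 10_000, 1_000_000):
    print(f"N = {n:>9}: centre-of-mass speed per atom ~ {com_speed_per_atom(n):.4f}")
```

For 100 atoms there is always some small residual drift, but a full synchronisation, where the vector sum has length close to N, is as fantastically improbable as the separated-gases box.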
Both these situations are quite trivial: the kinetic energy gained would not be worth harnessing, and the moment when order is achieved in the gas-filled box would not be predictable. But it is examples like these that make me sceptical of 'universal' laws which state that order is never spontaneously gained, or that heat energy can never be converted into a more useful form without an excess of heat energy (e.g. the large amount of heat used to do the much smaller amount of work involved in raising a hot-air balloon).
Are these situations ignored because they are just so statistically unlikely, or do I have some fundamental flaw in my understanding of the second law of thermodynamics?
In a 1957 paper*, E.T. Jaynes pointed out that there are two conceptual "parts" to an act of statistical mechanics:
1) The "physical" part, where the set of physical states and their observables is enumerated, and
2) the "inference" part, which is a pure exercise in statistical inference. This part is not really physics, but an application of probability theory.
This means that the laws of statistical mechanics are laws of probabilistic inference.
Although statistical mechanics refers to the "system", one needs to remember that the real objects on which one conducts stat. mech. are ENSEMBLES. The ensemble is the sample space of all possible states of a system (with a probability measure) consistent with a set of chosen constraints. Since this is your third-or-more time through this, you will know what the constraints usually are on the common ensembles (a numerical sketch follows the list below), e.g.
Microcanonical: fixed energy, fixed number of particles
Canonical: average energy, fixed number of particles
Grand Canonical: average energy, average number of particles
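To make the constraint idea concrete, here is a minimal sketch of a canonical ensemble (the four energy levels and the temperature are made up for illustration): constraining only the average energy fixes the whole Boltzmann distribution p_i = exp(-E_i/kT)/Z.

```python
import math

# Toy canonical ensemble: four made-up energy levels at a made-up temperature.
# Only the *average* energy is constrained; the maximum-entropy distribution
# consistent with that constraint is the Boltzmann distribution.
energies = [0.0, 1.0, 2.0, 3.0]   # hypothetical energy levels
kT = 1.0                          # temperature in the same energy units

weights = [math.exp(-E / kT) for E in energies]
Z = sum(weights)                  # partition function
probs = [w / Z for w in weights]  # p_i = exp(-E_i/kT) / Z

avg_E = sum(p * E for p, E in zip(probs, energies))
print("p_i =", [round(p, 4) for p in probs])
print("<E> =", round(avg_E, 4))
```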
For ensembles where only an average is used to constrain the ensemble, there are fluctuations: the ensemble contains members in which the observable takes values different from the average, precisely because only the average is constrained.
There is a probability distribution associated with the ensemble. To get to thermodynamics, one identifies physically measurable quantities with moments of the distribution (average, variance, etc.).
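Continuing the same made-up four-level example (repeated so the snippet stands alone), the first two moments show both points at once: the mean is the constrained observable, and the variance measures the fluctuations that remain.

```python
import math

# Same toy four-level canonical ensemble as in the previous sketch.
energies = [0.0, 1.0, 2.0, 3.0]
kT = 1.0
weights = [math.exp(-E / kT) for E in energies]
Z = sum(weights)
probs = [w / Z for w in weights]

# Thermodynamic quantities are moments of the ensemble distribution:
mean_E = sum(p * E for p, E in zip(probs, energies))                  # 1st moment
var_E = sum(p * E**2 for p, E in zip(probs, energies)) - mean_E**2    # fluctuations
print(f"<E> = {mean_E:.4f}, Var(E) = {var_E:.4f}")
```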
The entropy is a property of the ensemble, not the system. To see this, just note that the entropy is the sum over elements of the ensemble of -p log p (p being the probability of the element); it is a functional of the whole probability distribution, and no single member of the ensemble has an entropy of its own.
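A short sketch (with k_B set to 1, and two invented four-state distributions) shows that S is computed from the distribution alone:

```python
import math

# Gibbs/Shannon entropy S = -sum_i p_i ln p_i, with k_B = 1.
# Both distributions below are invented purely for illustration.
def entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0.0)

uniform = [0.25, 0.25, 0.25, 0.25]   # ensemble spread evenly over 4 states
peaked = [0.97, 0.01, 0.01, 0.01]    # ensemble concentrated on one state

print("S(uniform) =", round(entropy(uniform), 4))   # ln 4 ~ 1.3863
print("S(peaked)  =", round(entropy(peaked), 4))    # ~ 0.1677, near 0
```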
So maybe this last point clarifies things. You need to be careful to remember that saying the entropy does not decrease is making a statement about the ensemble, not the system. It is a statement about the way the probability distribution changes (or does not change). It is not a statement about what any system will or won't do.
The confusion also arises because the answer to your last question is "yes". When the sample space is large, the distribution of an ensemble constrained as stated above becomes VERY narrowly peaked about the most probable value. I mean REALLY narrowly peaked. Because of this, it is easy to confuse the separate ideas of ensemble and system for macroscopic things.
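To see how narrow, take the standard binomial estimate for N molecules distributed independently between the two halves of a box: the fraction found on the left has mean 1/2 and standard deviation 1/(2*sqrt(N)).

```python
import math

# Relative fluctuations of the left-half fraction vanish like 1/sqrt(N),
# so for macroscopic N the distribution is effectively a spike at 1/2.
for n in (100, 10**6, 10**23):
    sigma = 1.0 / (2.0 * math.sqrt(n))
    print(f"N = {n:.0e}: std dev of left-half fraction ~ {sigma:.2e}")
```

At N ~ 10^23 the fraction deviates from 1/2 by roughly one part in 10^12, which is why macroscopic measurements never see it.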
*E.T. Jaynes, "Information Theory and Statistical Mechanics" Phys. Rev. 106, 620-639 (1957).
I see. So in the two situations I described above, each state I described is a different member of the same ensemble? The entropy of the ensemble didn't change; the system just momentarily took on one of the less likely states in that ensemble?