Tuesday, September 7, 2010

Entropy Phenomena

I learnt a lot from this chapter. I didn’t realise how many of the principles I had learnt in biology and chemistry could be explained in terms of entropy: from osmotic pressure, to molecular crowding, to surface tension, to charge shielding by ions. When I read the molecular crowding section, I was surprised to see that entropy could increase order, but I knew I’d seen something like it before. I later remembered that it was the phenomenon of granular convection. Shaking a jar of mixed nuts results in the largest nuts (Brazil nuts) rising to the surface (depending on the ratios of the sizes of the nuts). This is an example of a process which creates order. It is similar, since the larger objects are separated from each other, allowing the smaller objects more room to move in. However, I can also see the flaws in this comparison. First of all, the Brazil nuts move in a specific direction, namely opposite to gravity. In molecular crowding there is no preferred direction, or at least not on the scale of molecules. Also, the length scales are not quite right: obviously nuts are much too big to be affected by the thermal energy kBT, and the nut size ratio is much smaller than the size ratio of water molecules to proteins. But I can still see the similarities between these two phenomena.
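To make the length-scale objection concrete, here is a back-of-the-envelope comparison of kBT with the gravitational energy needed to lift each kind of object through its own size. All the masses and sizes below are my own rough, order-of-magnitude guesses, not figures from the book:

```python
# Compare kBT at room temperature to the gravitational energy needed to
# lift an object by its own size (all numbers are rough guesses).

k_B = 1.38e-23            # Boltzmann constant, J/K
T = 298.0                 # room temperature, K
g = 9.8                   # gravitational acceleration, m/s^2
kBT = k_B * T             # ~4e-21 J

# Brazil nut: assume ~2 g mass, ~2 cm size.
E_nut = 2e-3 * g * 2e-2

# Typical protein: assume ~50 kDa mass, ~5 nm size (1 Da ~ 1.66e-27 kg).
E_protein = (50e3 * 1.66e-27) * g * 5e-9

print(f"kBT         ~ {kBT:.0e} J")
print(f"Brazil nut  ~ {E_nut:.0e} J, i.e. ~{E_nut / kBT:.0e} kBT")
print(f"protein     ~ {E_protein:.0e} J, i.e. ~{E_protein / kBT:.0e} kBT")
```

With these guesses the nut's lift energy comes out around 10^17 kBT while the protein's is around 10^-9 kBT, which is why thermal motion is irrelevant for nuts and gravity is irrelevant for proteins.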

I have to say though, I’m a little annoyed at this chapter. The explanation of many of these phenomena includes suggesting that the particles “are willing to pay some … energy in order to gain entropy” (pp. 268-269). I think this is a little too simplified. Explaining it like this implies that the particles are aware of their surroundings. But we know that they cannot be. They do not measure the entropy or free energy of their environment and then act according to that. They just act in a random manner, or in a manner as random as they can be given the constraints they are under. The behaviour of any individual particle is unpredictable. It is only the whole distribution of behaviours that we can make inferences about.
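Here is a minimal simulation (my own toy example, not anything from the book) of the point I mean: each individual random walker's endpoint is unpredictable, but the ensemble statistics land exactly where the theory says they should:

```python
import random
import statistics

# Each "particle" takes N_STEPS random +/-1 steps. One trajectory tells
# you nothing; the ensemble obeys <x> = 0 and <x^2> = N_STEPS.

random.seed(0)
N_STEPS = 1000
N_PARTICLES = 10000

endpoints = [
    sum(random.choice((-1, 1)) for _ in range(N_STEPS))
    for _ in range(N_PARTICLES)
]

print("one particle's endpoint:", endpoints[0])                            # anything
print("ensemble mean:", statistics.mean(endpoints))                        # ~0
print("ensemble mean square:", statistics.mean(e * e for e in endpoints))  # ~1000
```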

3 comments:

  1. Good point. Nelson does anthropomorphize somewhat for pedagogy, but you are also alluding to the "subjectivity" problem that sometimes raises its head when the information-theoretic interpretation of entropy is put forward.

    You're right: the entropy is a property of the distribution, and the use of the maximum entropy principle in statistical mechanics is a statement that systems are most likely to be found in the most probable regions of their state space.

    It is also true that equating entropy with "disorder" sometimes backfires. Remember that proteins can be cold-denatured. A messy room has the same entropy as a clean room, as long as the information about where everything is has not been lost in the messing (or the cleaning). In both cases, the entropy is that of a "pure state", or a "point in phase space", i.e. zero (as long as the internal state of each object in the room is ignored, and nothing gets broken in the messing/cleaning). It is really the LABELS "messy" and "clean" which carry the entropy, if we presume there are more possible messy rooms than clean rooms.
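    To make the counting concrete, here is a toy version (the numbers are made up purely for illustration): put 10 books on 10 shelf slots, and say only the alphabetical arrangement counts as "clean".

    ```python
    from math import factorial, log

    # Entropy attaches to a LABEL (a set of microstates), not to any one
    # arrangement. Boltzmann entropy S = k ln W, reported here in units of k.

    W_total = factorial(10)      # all arrangements: 3,628,800
    W_clean = 1                  # only the alphabetical arrangement is "clean"
    W_messy = W_total - W_clean  # every other arrangement is "messy"

    print("S(one specific arrangement):", log(1))        # 0 -- a pure state
    print("S('clean' label):           ", log(W_clean))  # 0
    print("S('messy' label):           ", log(W_messy))  # ~15.1 k
    ```

    Any single arrangement, messy or clean, has zero entropy; the "messy" label is entropic only because it covers vastly more arrangements.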

  2. What is the 'subjectivity' problem of the information-theoretic interpretation of entropy?

  3. Good analogy there, Seth.
    I agree, Mitch, it is quite interesting to see just how much of what we've learnt before can be explained through entropy. And yes, I too am a bit annoyed by the statement you mentioned; it does seem a bit misleading, in the sense that we know that molecules don't do their own "energy taxes". It's randomness and distribution that do the accounting.
