I've watched one of the lectures on statistical mechanics Seth posted earlier and found it interesting. It got me thinking about information and entropy (again). I think I am slowly getting my head around it.
The lectures gave a general overview of many thermodynamic properties and how they relate to systems and probabilities. However, the part I found most interesting was when temperature was introduced. Normally when thinking about thermodynamics, or any chemical or biological system, temperature is one of the first things we think of (along with pressure and density). Yet temperature is not strictly defined for a single particle: it is roughly the average kinetic energy of the molecules in the system, an emergent property. The lecture gave it a new definition, as the derivative of energy with respect to entropy, T = dE/dS (strictly, the partial derivative ∂E/∂S). Energy and entropy, unlike temperature, are both properties that can be measured for single molecules.
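To convince myself the T = dE/dS definition is consistent with the everyday notion of temperature, here is a quick numerical sketch (my own check, not from the lecture). For a monatomic ideal gas, E = (3/2)NkT, and the energy-dependent part of the entropy is S(E) = (3/2)Nk ln E (from the Sackur-Tetrode equation), so dE/dS should recover the temperature we started with:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
N = 1e22          # number of particles (arbitrary choice for the example)

def S(E):
    # Only the energy-dependent part of Sackur-Tetrode matters for dS/dE
    return 1.5 * N * k * math.log(E)

E = 1.5 * N * k * 300.0      # internal energy corresponding to T = 300 K
dE = E * 1e-6                # small energy step for a finite difference
T = dE / (S(E + dE) - S(E))  # T = dE/dS, approximated numerically
print(round(T, 3))           # recovers ~300.0 K
```

The finite difference reproduces the 300 K we put in, which is just the statement that dS/dE = 3Nk/(2E) inverts back to E = (3/2)NkT.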
This led to a discussion of Landauer's principle, which gives the minimum energy released as heat when one bit of information is "destroyed" (erased): E ≥ kT ln 2. I assume the 2 comes from the fact that in information theory a value can (usually) be either 0 or 1. So perhaps, since a base pair has four possible states, the minimum energy that can be released as heat when one base pair is "destroyed" is E ≥ kT ln 4. I wonder if this is the correct way of thinking about this?
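Plugging in numbers (my own back-of-the-envelope, assuming the erasure of one symbol with n equally likely states costs at least kT ln n, and that a base pair counts as a single 4-state symbol):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # roughly room temperature, K

bit_cost = k * T * math.log(2)        # Landauer bound for one bit
base_pair_cost = k * T * math.log(4)  # 4 states; ln 4 = 2 ln 2

print(bit_cost)        # ~2.87e-21 J per bit erased
print(base_pair_cost)  # exactly twice the one-bit cost
```

So on this reading a base pair is just two bits, and kT ln 4 = 2kT ln 2, a few zeptojoules at room temperature, tiny compared to the ~10^-19 J scale of the chemical bonds involved.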