PHYS3410 BIOPHYSICS II
Lecture Notes
Section I: Thermodynamics of Biological Systems
Lecture 5: Statistical Interpretation of Entropy
So far we have treated entropy as a state function in purely macroscopic terms. The validity of the Second Law stems from the reality of irreversible processes.
In stark contrast to these processes, which we see all around us, the laws of both classical and quantum mechanics are time symmetric: a system that can evolve from a state A to a state B can also evolve from state B to state A. For example, the spontaneous flow of gas molecules from a higher-density region to a lower-density region and its reverse (which violates the Second Law) are both in accord with the laws of mechanics.
Yet all irreversible macroscopic processes are consequences of molecular movement, collisions and transfer of energy. How can irreversible macroscopic processes emerge from the reversible motion of molecules?
Ludwig Boltzmann (1844 - 1906) attempted to reconcile the macroscopic and microscopic approaches and to derive a mechanical interpretation of entropy. Boltzmann was deeply influenced by Darwin's idea of evolution.
Entropy, Disorder, Probability
Spontaneous processes in nature are often associated with an increase of disorder. This tendency toward disorder and homogeneity can be related to time's arrow.
What do we mean by order and disorder?
Descriptions of the states of a system can be divided into two classes:

Microscopic descriptions: describe the system in the highest degree of detail (e.g. the positions and velocities of all the atoms in the system). This is usually impossible for us to do.

Macroscopic descriptions: describe the overall state of the system, by averaging over a large number of atoms or molecules (e.g. pressure) or by identifying the internal energy of an ideal gas with the kinetic energy of its molecules. These are measurements and calculations we can make.
To gain a deeper understanding of a particular system, we need to relate our macroscopic measurements to theoretical descriptions of the microscopic entities.
How do microstates relate to any particular macrostate?
Often an observable macrostate can arise from very different arrangements of microstates.
If many macrostates are possible, and these can be constructed from different configurations of microstates, why do we observe some macrostates and not others?
All microstates corresponding to macrostates of a given energy are equally probable. Therefore, the greater the number of microstates which lead to a particular macrostate, the greater the probability of observing that macrostate.
Consider a system of eight objects:

[Figure: eight objects distributed among sixteen squares]
The thermodynamic probability, W, of observing a particular macrostate is proportional to the number of microstates that belong to it. For the above example it can be determined by counting the number of ways the objects can be rearranged within the sixteen squares without altering the macrostate. The most uniform or homogeneous macrostate is the most probable, and this state corresponds to the least ordered state. So: the more ordered states are the least probable!
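
This counting can be checked by brute force. Below is a minimal Python sketch; since the figure is not reproduced here, it assumes the eight objects occupy a 16-square grid and that the macrostate records only how many objects sit in the left half.

    import math
    from itertools import combinations

    SQUARES = 16           # cells in the grid
    OBJECTS = 8            # identical objects to place
    HALF = SQUARES // 2    # cells in each half

    # Macrostate: how many objects sit in the left half of the grid.
    # Microstate: a specific choice of occupied cells.
    counts = {}
    for placement in combinations(range(SQUARES), OBJECTS):
        left = sum(1 for cell in placement if cell < HALF)
        counts[left] = counts.get(left, 0) + 1

    total = math.comb(SQUARES, OBJECTS)    # 12870 microstates in all
    for left, w in sorted(counts.items()):
        print(f"{left} left / {OBJECTS - left} right: W = {w:4d}, p = {w/total:.4f}")

The even 4/4 split accounts for 4900 of the 12870 microstates, while a fully segregated 8/0 split accounts for exactly one, so the homogeneous macrostate is overwhelmingly the most probable.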
In actual thermodynamic systems the numbers of particles (and consequently of microstates) are much larger (NA = 6.02 x 10^23 particles per mole), so the probability distribution is sharply peaked at the homogeneous configuration (a numerical illustration of this peaking follows eqn. 5.2 below).
Boltzmann defined the entropy of an observable macrostate in terms of the number of its microstates W:

S = kB ln W    (5.1)

kB = Boltzmann constant = 1.381 x 10^-23 J K^-1
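
Equation (5.1) is straightforward to evaluate numerically; a small sketch, using the W values from the grid example above:

    import math

    k_B = 1.381e-23    # Boltzmann constant, J/K

    def boltzmann_entropy(W):
        # S = k_B ln W, eqn. (5.1)
        return k_B * math.log(W)

    # W = 4900 for the uniform 4/4 macrostate, W = 1 for the 8/0 macrostate
    print(boltzmann_entropy(4900))    # ~1.17e-22 J/K
    print(boltzmann_entropy(1))       # 0: a unique microstate carries no entropy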
Macrostates whose microstates are not statistically uniform (which show microscopic order) are usually not in equilibrium and will spontaneously evolve toward microscopic disorder (equilibrium).
Consider the macrostate of a box containing a gas with N1 molecules in one half and N2 in the other.
The number of distinct ways in which the (N1 + N2) molecules can be distributed between the two halves, such that N1 are in one half and N2 in the other, gives the number of microstates W of this macrostate:

W = (N1 + N2)! / (N1! N2!)    (5.2)
The macrostates with large W are more probable. The irreversible increase in entropy then corresponds to evolution toward states of higher probability. Equilibrium states are those for which W is a maximum (for eqn. 5.2, W reaches a maximum when N1 = N2).
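
Both claims, the maximum of eqn. (5.2) at N1 = N2 and the sharpening of the peak as N grows, can be checked numerically. A sketch (the 1% window around the even split is an arbitrary illustrative choice):

    import math

    def peak_fraction(N):
        # Fraction of all 2**N microstates whose split lies within 1% of
        # the even split N1 = N2 = N/2, with W given by eqn. (5.2).
        window = N // 100
        center = N // 2
        near = sum(math.comb(N, n1)
                   for n1 in range(center - window, center + window + 1))
        return near / 2**N

    for N in (100, 1_000, 10_000):
        print(f"N = {N:6d}: {peak_fraction(N):.3f} of microstates near N1 = N2")

The fraction grows steadily toward 1 with N; for N of order NA, essentially every microstate belongs to a macrostate indistinguishable from the homogeneous one.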
Planck's point of view:

Define an entropy S which always changes in the same sense in all natural processes and assumes a maximum at equilibrium. Entropy is a function of the thermodynamic probability: S = f(W).
We already know that S is an extensive, additive function of state: S1+2 = S1 + S2. But from probability theory, the number of microstates multiplies: W1+2 = W1 W2. Combining the two relations fixes the functional form: S = k ln W.
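
The logarithm is in fact the only function compatible with both relations. A short derivation (a sketch, assuming f is differentiable):

    f(W_1 W_2) = f(W_1) + f(W_2)

    \left.\frac{\partial}{\partial W_1}\right|_{W_1 = 1}:\quad
    W_2\, f'(W_2) = f'(1) \equiv k
    \quad\Longrightarrow\quad f(W) = k \ln W + c

    f(1 \cdot 1) = 2 f(1) \;\Longrightarrow\; f(1) = 0 \;\Longrightarrow\; c = 0,
    \qquad \text{so } S = k \ln W .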
This equation is also in accord with the Third Law: S = 0 when W = 1 (only one microstate is possible) at the absolute zero of temperature, where U is at its minimum. The connection between entropy and disorder plays an important role in interpreting the thermodynamic behavior of living systems.
Exercise: In a “3 up” game, three coins are tossed simultaneously and a chosen combination of heads and tails wins. Treating each type of combination (e.g. two heads and one tail) as a macrostate, determine the number of microstates for each type of combination.
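
A quick enumeration to check your counting (a sketch, taking the macrostate to be the number of heads in a toss):

    from itertools import product
    from collections import Counter

    # All 2**3 = 8 microstates of three simultaneous coin tosses
    microstates = list(product("HT", repeat=3))

    # Macrostate: the number of heads showing
    W = Counter(toss.count("H") for toss in microstates)
    for heads in sorted(W):
        print(f"{heads} heads: W = {W[heads]}")    # W = 1, 3, 3, 1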