$$-\sum_j w_j \ln w_j + \alpha \sum_j w_j - \beta \sum_j w_j E_j \tag{17.7}$$

(the minus sign before $\beta$ is for later convenience) is maximized by equating

all partial derivatives with respect to $w_i$ to zero:

$$-1 - \ln w_i + \alpha - \beta E_i = 0, \tag{17.8}$$

or

$$w_i \propto e^{-\beta E_i}. \tag{17.9}$$

be a biased choice that is only justified by additional knowledge. Although this principle leads

to exactly the same results as the Gibbs postulate that all realizations of the ensemble are

equally probable, it introduces a subjective flavor into physics that is certainly not universally

embraced.

7 Lagrange undetermined multipliers are used to find the optimum of a function $f(x)$ of $n$

variables $x$ under $s$ constraint conditions of the form $g_k(x) = 0$, $k = 1, \ldots, s$. One constructs

the function $f + \sum_{k=1}^{s} \lambda_k g_k$, where $\lambda_k$ are as yet undetermined multipliers. The optimum

of this function is found by equating all partial derivatives to zero. Then the multipliers are

solved from the constraint equations.
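The footnote's recipe can be illustrated numerically. A minimal sketch, with the entropy of $n$ probabilities as the function to optimize and normalization as the single constraint (the choice of $n$ is arbitrary, for illustration only):

```python
import math

# Maximize f(x) = -sum_i x_i ln x_i subject to g(x) = sum_i x_i - 1 = 0.
# Construct L = f + lam * g; setting dL/dx_i = -1 - ln x_i + lam = 0
# gives x_i = exp(lam - 1), the same for every i.
n = 4
# The constraint then fixes the multiplier: n * exp(lam - 1) = 1.
lam = 1 - math.log(n)
x = [math.exp(lam - 1) for _ in range(n)]

print(sum(x))  # constraint satisfied: 1.0
print(x)       # the uniform distribution, each entry 1/n = 0.25
```

With no energy constraint the optimum is uniform; adding the constraint $\sum_i x_i E_i = U$ with a second multiplier $\beta$ produces the exponential weights of (17.9).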


The proportionality constant (containing the multiplier $\alpha$) is determined by

the normalization condition (17.5), yielding

$$w_i = \frac{1}{Q}\, e^{-\beta E_i}, \tag{17.10}$$

$$Q = \sum_i e^{-\beta E_i}. \tag{17.11}$$

Q is called the canonical partition function. The multiplier β follows from

the implicit relation

$$U = \frac{1}{Q} \sum_i E_i\, e^{-\beta E_i}. \tag{17.12}$$

As we shall see next, $\beta$ is related to the temperature and identified as $1/k_B T$.
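Since (17.12) is an implicit relation, $\beta$ must in general be solved numerically. A minimal sketch, using bisection on a small set of assumed energy levels (the levels and the target $U$ are illustrative values, not from the text):

```python
import math

# Hypothetical energy levels, arbitrary units (an assumption for illustration).
E = [0.0, 1.0, 2.0, 5.0]

def canonical(beta):
    """Weights w_i = exp(-beta E_i)/Q and ensemble average U, per (17.10)-(17.12)."""
    boltz = [math.exp(-beta * e) for e in E]
    Q = sum(boltz)
    w = [b / Q for b in boltz]
    U = sum(wi * e for wi, e in zip(w, E))
    return w, Q, U

# U(beta) decreases monotonically from mean(E) (beta -> 0) toward min(E)
# (beta -> infinity), so bisection on the implicit relation U(beta) = U_target works.
U_target = 0.8
lo, hi = 1e-6, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    _, _, U = canonical(mid)
    if U > U_target:
        lo = mid          # average energy too high: increase beta
    else:
        hi = mid
beta = 0.5 * (lo + hi)
w, Q, U = canonical(beta)
print(beta, U)  # U matches U_target to high precision
```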

17.3 Identification of thermodynamical variables

Consider a canonical ensemble of systems with given number of particles

and fixed volume. The system is completely determined by its microstates

labeled $i$ with energy $E_i$, and its thermodynamic state is determined by the

probability distribution $\{w_i\}$, which in equilibrium is given by the canonical

distribution (17.10). The distribution depends on one (and only one) parameter

$\beta$. We have not introduced the temperature yet, but it must be clear

that the temperature is somehow related to the distribution $\{w_i\}$, and hence

to $\beta$. Supplying heat to the system has the consequence that the distribution

$\{w_i\}$ will change. In the following we shall first identify the relation between

$\beta$ and temperature and between the distribution $\{w_i\}$ and entropy. Then we

shall show how the partition function relates to the Helmholtz free energy,

and, through its derivatives, to all other thermodynamic functions.

17.3.1 Temperature and entropy

Now consider the ensemble-averaged energy, which is equal to the thermo-

dynamic internal energy U :

$$U = \sum_i w_i E_i. \tag{17.13}$$

The ensemble-averaged energy changes by changing the distribution {wi },

corresponding to heat exchange dq:

$$dU = dq = \sum_i E_i\, dw_i. \tag{17.14}$$


Note that, as a result of the normalization of the probabilities,

$$\sum_i dw_i = 0. \tag{17.15}$$

At constant volume, when no work is done on the system, the internal

energy can only change by absorption of heat dq, which in equilibrium equals

T dS:

$$dU = dq = T\,dS \quad \text{or} \quad \frac{1}{T}\, dq = dS. \tag{17.16}$$

So, in thermodynamics the temperature is defined as the inverse of the

integrating factor of $dq$ that produces the differential $dS$ of a state function

$S$ (see the discussion on page 426). Can we find an integrating factor for $dq$

in terms of the probabilities $w_i$?

From (17.10) it follows that

$$E_i = -\beta^{-1} \ln w_i - \beta^{-1} \ln Q, \tag{17.17}$$

which can be inserted in (17.14) to yield

$$dq = -\beta^{-1} \sum_i \ln w_i \, dw_i. \tag{17.18}$$

Here use has been made of the fact that $\sum_i dw_i = 0$. Using this fact again,

it follows that

$$\beta\, dq = d\Big(-\sum_i w_i \ln w_i\Big). \tag{17.19}$$

So we see that $\beta$ is an integrating factor for $dq$, yielding a total differential of

a thermodynamic state function $-\sum_i w_i \ln w_i$. Therefore this state function

can be identified with the entropy $S$ and $\beta$ with the inverse temperature $1/T$.

Both functions can be scaled with an arbitrary constant, which is determined

by the convention about units in the definition of temperature. Including

the proper constant we conclude that

$$\beta = \frac{1}{k_B T}, \tag{17.20}$$

$$S = -k_B \sum_i w_i \ln w_i. \tag{17.21}$$

These are the fundamental relations that couple statistical mechanics and

thermodynamics.8 Note that the entropy is simply equal to the information

function $H$ introduced in (17.4), multiplied by Boltzmann's constant.

8 Several textbooks use these equations as definitions for temperature and entropy, thus ignoring

the beautiful foundations of classical thermodynamics.
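The integrating-factor argument of (17.14)–(17.19) can be checked numerically: perturb a canonical distribution by a small $dw$ with $\sum_i dw_i = 0$ and compare $\beta\,dq$ with the change of $-\sum_i w_i \ln w_i$. A minimal sketch (the energy levels, $\beta$, and the perturbation are assumed values for illustration):

```python
import math

# Assumed three-level system and inverse temperature (illustrative values).
E = [0.0, 1.0, 3.0]
beta = 0.7
boltz = [math.exp(-beta * e) for e in E]
Q = sum(boltz)
w = [b / Q for b in boltz]          # canonical distribution (17.10)

def entropy_sum(w):
    """The candidate state function -sum w ln w from (17.19)."""
    return -sum(wi * math.log(wi) for wi in w)

# A small perturbation dw with sum(dw) = 0, as required by (17.15):
eps = 1e-7
dw = [eps, -2 * eps, eps]
w2 = [wi + d for wi, d in zip(w, dw)]

dq = sum(e * d for e, d in zip(E, dw))     # dq = sum E_i dw_i, per (17.14)
dS = entropy_sum(w2) - entropy_sum(w)      # finite change of -sum w ln w

print(beta * dq, dS)  # agree to first order in eps, verifying (17.19)
```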


Strictly, the entropy is only defined by (17.21) in the case that $\{w_i\}$ represents

a canonical equilibrium distribution. We may, however, extend the

definition of entropy by (17.21) for any distribution; in that case finding

the equilibrium distribution is equivalent to maximizing the entropy under

the constraint that $\sum_i w_i = 1$ and the additional constraints given by the

definition of the ensemble (for the canonical ensemble: constant $N$ and $V$

and given expectation for the energy $U$: $U = \sum_i w_i E_i$).
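This equivalence can be demonstrated directly. For three levels, the two constraints leave one free parameter, so the constrained entropy maximum can be found by a simple scan and compared with the canonical form; a sketch with assumed levels $E = (0, 1, 2)$ and target $U = 0.8$ (illustrative values):

```python
import math

E = [0.0, 1.0, 2.0]   # assumed levels (illustration)
U = 0.8               # assumed energy expectation

def S(w):
    """Entropy per k_B, the extended definition (17.21)."""
    return -sum(wi * math.log(wi) for wi in w if wi > 0)

# Constraints sum(w) = 1 and sum(w E) = U leave w0 free:
# w2 = U - 1 + w0,  w1 = 1 - w0 - w2. Scan w0 over the feasible range.
best = None
w0 = 1e-4
while w0 < 1.0:
    w2 = U - 1.0 + w0
    w1 = 1.0 - w0 - w2
    if w1 > 0 and w2 > 0:
        s = S([w0, w1, w2])
        if best is None or s > best[0]:
            best = (s, [w0, w1, w2])
    w0 += 1e-4
w_max = best[1]

# Canonical form w_i = x**i / Q with x = exp(-beta) chosen so that <E> = U:
# (x + 2x^2)/(1 + x + x^2) = 0.8  =>  1.2 x^2 + 0.2 x - 0.8 = 0.
x = (-0.2 + math.sqrt(0.04 + 4 * 1.2 * 0.8)) / (2 * 1.2)
Q = 1 + x + x * x
w_can = [1 / Q, x / Q, x * x / Q]

print(w_max, w_can)  # agree to the scan resolution
```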

17.3.2 Free energy and other thermodynamic variables

The entropy is proportional to the expectation of $\ln w_i$, i.e., the average of

$\ln w_i$ over the distribution $\{w_i\}$:

$$S = -k_B \langle \ln w_i \rangle. \tag{17.22}$$

From the canonical distribution (17.10), it follows that

$$\ln w_i = -\ln Q - \beta E_i, \tag{17.23}$$

and taking the expectation over both sides, we obtain

$$-\frac{S}{k_B} = -\ln Q - \frac{U}{k_B T}, \tag{17.24}$$

which reshuffles to

$$-k_B T \ln Q = U - TS = A. \tag{17.25}$$

This simple relation between Q and the Helmholtz free energy A is all we

need to connect statistical and thermodynamic quantities: if we know Q as

a function of V and β, we know A as a function of V and T , from which all

other thermodynamic quantities follow.
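The chain $Q \to A \to$ everything else can be made concrete for a small system. A sketch with an assumed two-level system in units where $k_B = 1$ (levels and temperature are illustrative values), checking $A = U - TS$ from (17.25) and recovering $U$ from a derivative of $\ln Q$:

```python
import math

# Assumed two-level system, units with k_B = 1 (illustration).
E = [0.0, 1.5]
T = 2.0
beta = 1.0 / T

boltz = [math.exp(-beta * e) for e in E]
Q = sum(boltz)
w = [b / Q for b in boltz]                       # canonical weights (17.10)

A = -T * math.log(Q)                             # (17.25): A = -k_B T ln Q
U = sum(wi * e for wi, e in zip(w, E))           # (17.13): U = sum w_i E_i
S_gibbs = -sum(wi * math.log(wi) for wi in w)    # (17.21), per k_B

print(A, U - T * S_gibbs)   # thermodynamic consistency: A = U - T S

# U also follows from a derivative of the partition function:
# U = -d(ln Q)/d(beta), here checked by central finite difference.
h = 1e-6
dlnQ = (math.log(sum(math.exp(-(beta + h) * e) for e in E)) -
        math.log(sum(math.exp(-(beta - h) * e) for e in E))) / (2 * h)
print(-dlnQ, U)             # agree to O(h^2)
```

The same pattern generalizes: knowing $Q(V, \beta)$ gives $A(V, T)$, and derivatives of $A$ yield pressure, entropy, and the rest.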