
Thermodynamics

Y Y Shan

(v) Maxwell Speed Distribution: kinetic theory of gases


In terms of energy: the number of gas molecules in the energy range $\varepsilon$ to $\varepsilon + d\varepsilon$ is (Eq. 7-4c on p.102):

$$a(\varepsilon)\,d\varepsilon = N\,\frac{V e^{-\varepsilon/kT}\,d^3p}{h^3 Z} = N\,(2\pi mkT)^{-3/2}\,e^{-\varepsilon/kT}\,d^3p \qquad \text{(Eq. 7-4g)}$$

where the partition function $Z$ of the ideal gas is given by Eq. 7-4d (p.103), $Z = V(2\pi mkT)^{3/2}/h^3$.


Since $\varepsilon = \tfrac{1}{2}mv^2 = \dfrac{p^2}{2m}$ and $d^3p = p^2\sin\theta\,d\theta\,d\varphi\,dp$,

in terms of speed, the number of molecules in the speed range $v$ to $v + dv$ is:

$$a(v)\,dv = N\,(2\pi mkT)^{-3/2}\int_{\theta=0}^{\pi}\sin\theta\,d\theta\int_{\varphi=0}^{2\pi}d\varphi\;e^{-\frac{p^2}{2mkT}}\,p^2\,dp$$
$$= N\,(2\pi mkT)^{-3/2}\,(4\pi)\,e^{-\frac{mv^2}{2kT}}\,(mv)^2\,d(mv)$$
$$= N\,4\pi\left(\frac{m}{2\pi kT}\right)^{3/2} v^2\,e^{-\frac{mv^2}{2kT}}\,dv = N f(v)\,dv,$$

where

$$f(v) = 4\pi\left(\frac{m}{2\pi kT}\right)^{3/2} v^2\,e^{-\frac{mv^2}{2kT}} \qquad \text{(Eq. 7-4h)}$$

This is called the Maxwell speed distribution function for the molecules of an ideal gas.

AP3290


From this Maxwell speed distribution function, several characteristic molecular speeds can be calculated, as can quantities such as the fraction of molecules with speeds above a certain value at a given temperature.

When $\dfrac{df(v)}{dv} = 0$, the most probable speed is obtained: $v_P = \sqrt{\dfrac{2kT}{m}}$, which is temperature dependent.

The average speed of the molecules can be calculated from: $\bar{v} = \displaystyle\int_0^{\infty} v f(v)\,dv = \sqrt{\dfrac{8kT}{\pi m}}$.
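A short numerical check of Eq. 7-4h confirms the normalization and the characteristic speeds derived above. This is a sketch only; the mass and temperature (roughly an N2 molecule at room temperature) are assumed for illustration.

```python
import math

def maxwell_f(v, m, T, k=1.380649e-23):
    """Maxwell speed distribution, Eq. 7-4h: 4*pi*(m/(2*pi*k*T))**1.5 * v**2 * exp(-m*v**2/(2*k*T))."""
    a = m / (2.0 * k * T)
    return 4.0 * math.pi * (a / math.pi) ** 1.5 * v * v * math.exp(-a * v * v)

k = 1.380649e-23
m, T = 4.65e-26, 300.0                       # assumed: roughly an N2 molecule at room temperature
vp = math.sqrt(2 * k * T / m)                # most probable speed, where df/dv = 0
vbar = math.sqrt(8 * k * T / (math.pi * m))  # mean speed

# crude rectangle-rule integration of f(v) and v*f(v) over 0..10*vp
n, vmax = 20000, 10 * vp
dv = vmax / n
norm = sum(maxwell_f(i * dv, m, T) for i in range(1, n)) * dv
mean = sum(i * dv * maxwell_f(i * dv, m, T) for i in range(1, n)) * dv
print(vp, vbar, norm, mean)
```

The integral of $f(v)$ comes out as 1 (the distribution is normalized) and the numerical mean reproduces $\sqrt{8kT/\pi m}$.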

(vi) The theorem of equipartition of energy


The theorem of equipartition of energy states that molecules in thermal equilibrium have the same average energy associated with each degree of freedom of their motion, and that this energy is $\tfrac{1}{2}kT$ per degree of freedom, i.e. $\tfrac{3}{2}kT$ for a molecule with three degrees of freedom.

The equipartition theorem results in:

$$\overline{\tfrac{1}{2}mv_x^2} = \overline{\tfrac{1}{2}mv_y^2} = \overline{\tfrac{1}{2}mv_z^2} = \tfrac{1}{2}kT$$
$$\overline{\tfrac{1}{2}mv^2} = \tfrac{1}{2}m\,\overline{(v_x^2 + v_y^2 + v_z^2)} = \tfrac{3}{2}kT,$$

which can be obtained from the Maxwell speed distribution, itself shown to follow from the Maxwell-Boltzmann distribution:

$$\overline{\tfrac{1}{2}mv^2} = \tfrac{1}{2}m\int_0^{\infty} v^2 f(v)\,dv = \tfrac{1}{2}m\int_0^{\infty} v^2\,4\pi\left(\frac{m}{2\pi kT}\right)^{3/2} v^2\,e^{-\frac{mv^2}{2kT}}\,dv = \cdots = \tfrac{3}{2}kT$$

Since $P = \dfrac{NkT}{V}$ for ideal gases, we can write the expression for the pressure as:

$$P = \frac{N}{V}kT = \frac{2N}{3V}\left(\frac{3}{2}kT\right) = \frac{2N}{3V}\left(\overline{\tfrac{1}{2}mv^2}\right) = \frac{1}{3}\frac{N}{V}m\overline{v^2} \qquad \text{(Eq. 7-4i)}$$
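Both links in this chain can be checked numerically: the Maxwell-distribution average of $\tfrac{1}{2}mv^2$ gives $\tfrac{3}{2}kT$, and Eq. 7-4i then reproduces the ideal-gas law. The mass, temperature, and number density below are illustrative assumptions.

```python
import math

def maxwell_f(v, m, T, k=1.380649e-23):
    # Maxwell speed distribution, Eq. 7-4h
    a = m / (2.0 * k * T)
    return 4.0 * math.pi * (a / math.pi) ** 1.5 * v * v * math.exp(-a * v * v)

k = 1.380649e-23
m, T = 4.65e-26, 300.0            # assumed: ~N2 mass, room temperature
n, vmax = 20000, 10 * math.sqrt(2 * k * T / m)
dv = vmax / n
v2_mean = sum((i * dv) ** 2 * maxwell_f(i * dv, m, T) for i in range(1, n)) * dv
ke_mean = 0.5 * m * v2_mean       # average kinetic energy per molecule, should equal (3/2)kT

# Eq. 7-4i: P = (1/3)(N/V) m <v^2> should reproduce the ideal-gas law P = (N/V) k T
N_over_V = 2.5e25                 # assumed number density (m^-3), order of an ideal gas at 1 atm
P_kinetic = N_over_V * m * v2_mean / 3.0
P_ideal = N_over_V * k * T
print(ke_mean / (1.5 * k * T), P_kinetic / P_ideal)
```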


7.5 Statistical entropy

In Chapter 4 (p.56), the principle of entropy increase in classical thermodynamics, which is an alternative statement of the second law, says: the entropy of an isolated system can never decrease,

$$\Delta S = S_F - S_I = \Delta S(\text{universe}) \ge 0,$$

where a system and its surroundings together (``the universe'') form an isolated system, and $F$, $I$ denote the final and initial macrostates. Note that the entropy $S$ should not be confused with its change $\Delta S$.
Statistical thermodynamics can explain microscopically the spontaneous increase
of entropy.

7.5.1 The most probable Macrostate, its total number of microstates, and entropy
Let's consider the example of the system of 6×6 chips again (indistinguishable).
Imagine starting with a perfectly ordered all-blue microstate, then choosing a chip at random and tossing it. After repeating this kind of tossing a few times, it is highly likely that some chips will show their green side, and it is nearly impossible for the system to still be in its all-blue state (the chance of the system remaining all-blue is only about $(1/2)^n$ after $n$ tosses). As time goes on with more tossing, the number of greens will almost certainly increase. Here are some snapshots of the system, taken after every 10 tosses. The number of greens, $n_G$, is 0, 3 (10 tosses), 5 (20 tosses), 9 (30 tosses), 12 (40 tosses), 15 (50 tosses), 15 (60 tosses), 17 (70 tosses), 18 (80 tosses).
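The tossing experiment is easy to simulate; a minimal sketch (a random-flip model of the 36 chips, with an assumed fixed seed for reproducibility) shows the green count rising from zero and then fluctuating around 18:

```python
import random

random.seed(1)                  # assumed seed, so the run is reproducible
N = 36                          # the 6x6 board of two-sided chips
chips = [0] * N                 # 0 = blue, 1 = green; start all-blue
history = []
for toss in range(1000):
    i = random.randrange(N)     # pick a chip at random and flip it over
    chips[i] ^= 1
    history.append(sum(chips))

# after an initial transient, the green count fluctuates around N/2 = 18
late = history[200:]
avg_greens = sum(late) / len(late)
print(avg_greens)
```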


Here is a graph of the number of greens over more than 1000 tosses. We see that, after the first many tosses, the system saturates at $18 \pm 6$ almost all of the time. These fluctuations are quite large in percentage terms, about $\pm 33\%$, but then it is a very small system. If we now look at a larger system of $30 \times 30$ chips, we see that fluctuations are still visible, but they are much smaller in percentage terms: the number of greens $n_G$ saturates at $450 \pm 30$, or about $\pm 7\%$.
It can be calculated and proved that, for the 6×6 system, when the number of greens $n_G = 18$, the number of microstates of the system reaches its maximum:

$$\Omega_{n_G=18} = C_{36}^{18} = 36!/(18!\,18!) \approx 9.075\times 10^9 > \Omega_{n_G\neq 18}$$

For a 30×30 system,

$$\Omega_{n_G=450} = C_{900}^{450} > \Omega_{n_G\neq 450}$$
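These microstate counts are easy to verify directly with binomial coefficients:

```python
from math import comb

# Omega(n_G) = C(36, n_G): microstates of the 6x6 board with n_G green chips
omega = [comb(36, nG) for nG in range(37)]
omega_18 = comb(36, 18)   # the maximum, 9_075_135_300, i.e. about 9.075e9
total = sum(omega)        # every chip independently blue/green: 2**36 microstates in all
print(omega_18, total)
```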

The above example tells us the following:


Statistically, the system evolves from macrostates with fewer microstates (those with $n_{\text{Green}} \neq 18$ in this example, having $\Omega < 9.075\times 10^9$) toward the most probable macrostate (saturating at 18 green, 18 blue), which has the largest number of microstates ($9.075\times 10^9$). Or it can be said that when a system evolves from one macrostate to another, the corresponding number of microstates will never decrease.

What has this statistical conclusion to do with entropy?

Compare this with the principle of entropy increase in classical thermodynamics, which says that a system evolves from a macrostate having lower entropy to one having higher entropy, or that the entropy of an isolated system can never decrease. The increase of entropy from one macrostate to another in classical thermodynamics can therefore be understood as, and correlated with, the increase in the total number of microstates corresponding to each macrostate.


7.5.2 The Boltzmann statistical entropy


So what is the relationship between the entropy $S$ and the number of microstates $\Omega$? Are they equal to each other? No, because if we double the size of a system ($N \to 2N$), the number of microstates does not increase from $\Omega$ to $2\Omega$, but to $\Omega^2$. Obviously, in this example $\ln\Omega_{2\,\text{particles}} = 2\ln\Omega_{1\,\text{particle}}$. So $\Omega$ is not an extensive quantity (see page 5), but $\ln\Omega$ is.
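A minimal check of this extensivity argument, using the total microstate count $2^N$ of $N$ independent two-state chips: doubling the system squares $\Omega$, so it is $\ln\Omega$ that doubles.

```python
import math

def omega(N):
    # total number of microstates of N independent two-state chips
    return 2 ** N

N = 20
ratio = math.log(omega(2 * N)) / math.log(omega(N))
print(omega(2 * N) == omega(N) ** 2, ratio)
```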

Deriving the Boltzmann entropy formula: referring to the discussion on page 94, for a Boltzmann system of $N$ particles $\{a_i\}$, the total number of microstates of the Boltzmann distribution (i.e. the most probable distribution, having maximum $\Omega$) is:

$$\Omega = N!\prod_i \frac{g_i^{a_i}}{a_i!}, \qquad N = \sum_i a_i, \qquad U = \sum_i a_i\varepsilon_i$$


Step 1:

$$\ln\Omega = \ln\!\left[N!\prod_i \frac{g_i^{a_i}}{a_i!}\right] = \ln N! - \sum_i \ln(a_i!) + \sum_i \ln\!\left(g_i^{a_i}\right)$$
$$= N(\ln N - 1) - \sum_i a_i(\ln a_i - 1) + \sum_i a_i\ln g_i$$
$$= N\ln N - N + \sum_i a_i - \sum_i a_i\ln a_i + \sum_i a_i\ln g_i$$
$$= N\ln N - \sum_i a_i\ln a_i + \sum_i a_i\ln g_i,$$

where Stirling's formula, $\ln x! \approx x(\ln x - 1)$, has been applied and $\sum_i a_i = N$. So we obtain:

$$\ln\Omega = N\ln N - \sum_i a_i\ln a_i + \sum_i a_i\ln g_i \qquad \text{(Eq. 7-5a)}$$
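Step 1 relies on Stirling's formula; a quick check against the exact $\ln x!$ (computed via the gamma function) shows the approximation improving as $x$ grows, which is why it is safe for macroscopic particle numbers:

```python
import math

def ln_factorial(x):
    return math.lgamma(x + 1)        # exact ln(x!) via the gamma function

def stirling(x):
    return x * (math.log(x) - 1.0)   # the approximation used in Step 1

# the relative error of Stirling's formula shrinks as x grows
errs = {x: abs(stirling(x) - ln_factorial(x)) / ln_factorial(x) for x in (10, 100, 10000)}
print(errs)
```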

Step 2: from the first law, $dU = dQ + dW = T\,dS - P\,dV$, i.e. $T\,dS = dU + P\,dV$.

The statistical internal energy and pressure can be expressed (Eqs. 7-3a,d, p.98, p.101) as $U = -N\dfrac{\partial \ln Z}{\partial \beta}$ and $P = \dfrac{N}{\beta}\dfrac{\partial \ln Z}{\partial V}$, where $\beta = 1/kT$. Then

$$T\,dS = dU + \frac{N}{\beta}\,d(\ln Z)$$
$$dS = \frac{1}{T}\left[dU + \frac{N}{\beta}\,d(\ln Z)\right] = k\left[\beta\,dU + N\,d(\ln Z)\right] = k\,d[\beta U + N\ln Z] = d[k(N\ln Z + \beta U)]$$

Thus we obtain the statistical entropy in terms of the partition function $Z$:

$$S = k(N\ln Z + \beta U) \qquad \text{(Eq. 7-5b)}$$

Replacing $\ln Z$ by $\ln Z = \ln Z + \ln N - \ln N = \ln N + \ln\dfrac{Z}{N}$:

$$S = k(N\ln Z + \beta U) = k\left[N\ln N + \left(\ln\frac{Z}{N}\right)N + \beta U\right]$$

Replacing $N = \sum_i a_i$ and $U = \sum_i a_i\varepsilon_i$:

$$S = k\left[N\ln N + \left(\ln\frac{Z}{N}\right)\sum_i a_i + \beta\sum_i a_i\varepsilon_i\right] = k\left[N\ln N + \sum_i\left(\ln\frac{Z}{N} + \beta\varepsilon_i\right)a_i\right]$$

From the Boltzmann distribution function (p.96),

$$a_i = N\,\frac{g_i e^{-\beta\varepsilon_i}}{Z}, \qquad \ln\frac{Z}{N} + \beta\varepsilon_i = \ln g_i - \ln a_i,$$

so we get:

$$S = k\left[N\ln N + \sum_i(\ln g_i - \ln a_i)a_i\right] = k\left(N\ln N - \sum_i a_i\ln a_i + \sum_i a_i\ln g_i\right)$$
Comparing this with Eq. 7-5a, we prove:

$$S = k\ln\Omega \qquad \text{(Eq. 7-5c)}$$

This is the famous Boltzmann statistical entropy formula, where the Boltzmann constant $k$ is the bridge connecting the microscopic and the macroscopic worlds. It must have the dimensions of entropy, joules per kelvin, and the correct numerical correspondence is given by the gas constant $R$ divided by Avogadro's number:

$$k = \frac{R}{N_0} = \frac{8.314\ \mathrm{J\,K^{-1}\,mol^{-1}}}{6.023\times 10^{23}\ \mathrm{particles/mol}} = 1.381\times 10^{-23}\ \mathrm{J\,K^{-1}}$$
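The equivalence of Eq. 7-5b and Eq. 7-5c can also be verified numerically for a small model system: build Boltzmann populations $a_i = N g_i e^{-\beta\varepsilon_i}/Z$, then compute $S$ both from the partition function and from the Stirling form of $\ln\Omega$. The level energies, degeneracies, particle number, and temperature below are assumed purely for illustration.

```python
import math

k = 1.380649e-23
T = 300.0
beta = 1.0 / (k * T)
N = 1e22                           # assumed particle number
g = [1, 3, 5]                      # assumed level degeneracies
eps = [0.0, 1e-21, 2e-21]          # assumed level energies (J)

Z = sum(gi * math.exp(-beta * ei) for gi, ei in zip(g, eps))
a = [N * gi * math.exp(-beta * ei) / Z for gi, ei in zip(g, eps)]  # Boltzmann populations
U = sum(ai * ei for ai, ei in zip(a, eps))

S_partition = k * (N * math.log(Z) + beta * U)        # Eq. 7-5b
lnOmega = (N * math.log(N)                            # Eq. 7-5a: Stirling form of ln(Omega)
           - sum(ai * math.log(ai) for ai in a)
           + sum(ai * math.log(gi) for ai, gi in zip(a, g)))
S_boltzmann = k * lnOmega                             # Eq. 7-5c
print(S_partition, S_boltzmann)
```

The two values agree to floating-point precision, which is exactly the algebraic identity proved in the text.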


Chapter 8  Fermi-Dirac distribution and Bose-Einstein distribution

8.1 The Fermi-Dirac distribution

The Fermi-Dirac distribution applies to fermions. Fermions are particles which have half-integer spin and are therefore constrained by the Pauli exclusion principle. Fermions include electrons, protons, and neutrons.

$$\Omega_{FD} = \prod_i \frac{g_i!}{a_i!\,(g_i - a_i)!}, \qquad N = \sum_i a_i, \qquad U = \sum_i a_i\varepsilon_i$$

For F-D statistics, the expected number of particles in states with energy $\varepsilon_i$ is $a_i = N f(\varepsilon_i)$, where

$$f(\varepsilon_i) = \frac{g_i}{e^{(\varepsilon_i - \mu)/kT} + 1} \qquad \text{(Eq. 8-a)}$$

is called the Fermi-Dirac distribution, giving the probability that a particle occupies energy level $\varepsilon_i$; $\mu$ is the chemical potential.
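The limiting behaviour of Eq. 8-a can be checked numerically: far below the chemical potential $\mu$ a level is essentially full, exactly at $\mu$ it is half filled, and far above it is essentially empty. The value of $\mu$ below (roughly metal-like, ~5 eV) is an assumption for illustration, with $g_i = 1$.

```python
import math

def fermi_dirac(eps, mu, T, g=1.0, k=1.380649e-23):
    # Eq. 8-a with degeneracy g: expected occupation of a level at energy eps
    return g / (math.exp((eps - mu) / (k * T)) + 1.0)

k = 1.380649e-23
T = 300.0
mu = 8.0e-19                                 # assumed chemical potential (~5 eV)
low = fermi_dirac(mu - 20 * k * T, mu, T)    # far below mu: essentially full
mid = fermi_dirac(mu, mu, T)                 # exactly at mu: half filled
high = fermi_dirac(mu + 20 * k * T, mu, T)   # far above mu: essentially empty
print(low, mid, high)
```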


8.2 The Bose-Einstein distribution

Bosons are particles which have integer spin, such as photons, and which therefore are not constrained by the Pauli exclusion principle. The energy distribution of bosons is described by Bose-Einstein statistics.

$$\Omega_{BE} = \prod_i \frac{(a_i + g_i - 1)!}{a_i!\,(g_i - 1)!}, \qquad N = \sum_i a_i, \qquad U = \sum_i a_i\varepsilon_i$$

For B-E statistics, the expected number of particles in states with energy $\varepsilon_i$ is $a_i = N f(\varepsilon_i)$, where

$$f(\varepsilon_i) = \frac{g_i}{A e^{\varepsilon_i/kT} - 1} \qquad \text{(Eq. 8-b)}$$

is called the Bose-Einstein distribution, giving the probability that a particle occupies energy level $\varepsilon_i$.
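The limits of Eq. 8-b can likewise be checked: at low energy the occupation grows very large (bosons pile into low-lying states), while at high energy it approaches the classical Boltzmann factor. Taking $A = 1$ (the photon case, where particle number is unconstrained) and $g_i = 1$ is an assumption made here for illustration.

```python
import math

def bose_einstein(eps, A, T, g=1.0, k=1.380649e-23):
    # Eq. 8-b with degeneracy g: expected occupation of a level at energy eps
    return g / (A * math.exp(eps / (k * T)) - 1.0)

k = 1.380649e-23
T = 300.0
A = 1.0                                     # assumed: the photon case

be_low = bose_einstein(0.01 * k * T, A, T)  # low energy: occupation becomes very large
eps_high = 30 * k * T
be_high = bose_einstein(eps_high, A, T)
boltz = math.exp(-eps_high / (k * T))       # classical Boltzmann factor for comparison
print(be_low, be_high / boltz)
```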
