
September 17, 2001

Reading: Chapter Four


Homework: 4.1, 4.2, 4.3, 4.4
The constant β:
We are considering a closed system, i.e., $N = \sum_i n_i$ is fixed, with a constant volume, so that the energy levels $\varepsilon_i$ are also fixed. For the most probable distribution,
\[ n_i = \frac{N}{Z} \exp(-\beta \varepsilon_i) . \]
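As a quick numerical sketch of this distribution (the particle number, β, and energy levels below are made up purely for illustration):

```python
import math

# Illustrative closed system: N particles over four fixed energy levels
# (all numbers assumed for the example; energies in arbitrary units).
N = 1000.0
beta = 1.0
eps = [0.0, 1.0, 2.0, 3.0]

Z = sum(math.exp(-beta * e) for e in eps)         # partition function
n = [N / Z * math.exp(-beta * e) for e in eps]    # most probable n_i

print([round(ni, 1) for ni in n])
print(sum(n))   # recovers N: the factor N/Z normalizes the distribution
```

Note that the occupation numbers decrease with energy, and the factor $N/Z$ guarantees $\sum_i n_i = N$.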
Now, suppose we add some heat into the system so that the total energy of the system increases. The only way to achieve this is by moving some particles from the lower energy levels to the higher levels. So, we must change the occupation numbers $n_i$ of the energy levels. On the other hand, the only way to change $n_i$ is to change the parameter β. Thus, β must be related to temperature, which is expected to be affected by heat input.

Consider the ratio of the numbers of particles in the (i+1)-th and i-th levels,
\[ \frac{n_{i+1}}{n_i} . \]
If the temperature is increased from $T_1$ to $T_2$, then particles are promoted from the i-th to the (i+1)-th level. Thus, we have
\[ \frac{n_{i+1}(T_2)}{n_i(T_2)} > \frac{n_{i+1}(T_1)}{n_i(T_1)} . \]
Alternatively, we can promote the same number of particles from the i-th to the (i+1)-th level by changing the parameter β from $\beta_1$ to $\beta_2$, i.e.,
\[ \frac{n_{i+1}(\beta_2)}{n_i(\beta_2)} > \frac{n_{i+1}(\beta_1)}{n_i(\beta_1)} . \]
On the other hand,
\[ \frac{n_{i+1}(\beta_2)}{n_i(\beta_2)} = \frac{\exp(-\beta_2 \varepsilon_{i+1})}{\exp(-\beta_2 \varepsilon_i)} = \exp[-\beta_2(\varepsilon_{i+1} - \varepsilon_i)] \]
\[ \frac{n_{i+1}(\beta_1)}{n_i(\beta_1)} = \frac{\exp(-\beta_1 \varepsilon_{i+1})}{\exp(-\beta_1 \varepsilon_i)} = \exp[-\beta_1(\varepsilon_{i+1} - \varepsilon_i)] . \]
Thus, in order for
\[ \frac{n_{i+1}(\beta_2)}{n_i(\beta_2)} > \frac{n_{i+1}(\beta_1)}{n_i(\beta_1)} , \]
we must have
\[ \exp[-\beta_2(\varepsilon_{i+1} - \varepsilon_i)] > \exp[-\beta_1(\varepsilon_{i+1} - \varepsilon_i)] , \]
and hence
\[ \beta_2 < \beta_1 , \quad \text{as } (\varepsilon_{i+1} - \varepsilon_i) > 0 . \]

So, to promote the same number of particles from the i-th to the (i+1)-th level, we can either increase the temperature from $T_1$ to $T_2$, or decrease the parameter β from $\beta_1$ to $\beta_2$. This indicates that β must be an inverse function of $T$.
Boltzmann postulated that $\beta = 1 / k_B T$, where $k_B$ is the Boltzmann constant.
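A small numerical check of this inverse relation (the level spacing below is an assumed, illustrative value; the temperatures are arbitrary):

```python
import math

kB = 1.380649e-23      # Boltzmann constant, J/K
d_eps = 1.0e-21        # illustrative level spacing eps_{i+1} - eps_i, in J (assumed)

def occupation_ratio(T):
    """n_{i+1}/n_i = exp(-beta * d_eps), with beta = 1/(kB*T)."""
    beta = 1.0 / (kB * T)
    return math.exp(-beta * d_eps)

# Raising T lowers beta and raises the ratio: particles are promoted upward.
print(occupation_ratio(200.0))
print(occupation_ratio(400.0))
```

Doubling the temperature halves β, so the occupation ratio moves closer to 1, exactly the promotion of particles described above.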

[Sketch: occupation numbers $n_i$ vs. energy level. Lower β corresponds to higher $T$ (flatter distribution); higher β corresponds to lower $T$ (steeper distribution).]
More about the most probable distribution:

The most probable distribution is given by
\[ n_i = \frac{N}{Z} \exp(-\beta \varepsilon_i) , \quad \text{where } \beta = 1 / k_B T , \]
with the maximum number of microstates:
\[ \Omega_{\max} = \frac{N!}{n_0! \, n_1! \, n_2! \cdots n_{i-1}! \, n_i! \, n_{i+1}! \cdots} . \]
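For a tiny illustrative system (particle numbers made up for the example), this multinomial count can be evaluated directly:

```python
import math

# N = 10 particles distributed over three levels with occupations (5, 3, 2).
n = [5, 3, 2]
N = sum(n)

omega = math.factorial(N)
for ni in n:
    omega //= math.factorial(ni)   # divide out each n_i! (exact integer division)

print(omega)   # -> 2520 microstates for this distribution
```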
Now consider three adjacent energy levels, i−1, i, and i+1:
\[ n_{i-1} = \frac{N}{Z} \exp(-\beta \varepsilon_{i-1}) \]
\[ n_i = \frac{N}{Z} \exp(-\beta \varepsilon_i) \]
\[ n_{i+1} = \frac{N}{Z} \exp(-\beta \varepsilon_{i+1}) . \]
Assume that the energy levels are equally spaced, i.e., $\varepsilon_i - \varepsilon_{i-1} = \varepsilon_{i+1} - \varepsilon_i = \Delta\varepsilon$.
Now, if we generate a new distribution by removing two particles from level i and putting one into level i−1 and one into level i+1, then the total energy of the system remains unchanged. For this new distribution,
\[ n'_{i-1} = n_{i-1} + 1 , \quad n'_i = n_i - 2 , \quad n'_{i+1} = n_{i+1} + 1 , \]
and the number of microstates is
\[ \Omega' = \frac{N!}{n_0! \, n_1! \, n_2! \cdots n'_{i-1}! \, n'_i! \, n'_{i+1}! \cdots} . \]
So,
\[ \frac{\Omega'}{\Omega} = \frac{n_{i-1}! \, n_i! \, n_{i+1}!}{n'_{i-1}! \, n'_i! \, n'_{i+1}!} = \frac{n_i (n_i - 1)}{(n_{i-1} + 1)(n_{i+1} + 1)} \approx \frac{n_i^2}{n_{i-1} \, n_{i+1}} ; \quad (n_i \gg 1)
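The factorial cancellation and the large-$n$ approximation can be checked numerically (the occupation numbers below are assumed, chosen roughly in Boltzmann proportion):

```python
import math

# Illustrative occupation numbers for levels i-1, i, i+1, all >> 1.
n_im1, n_i, n_ip1 = 3000, 2000, 1333

# Exact ratio Omega'/Omega: every factorial except the three altered ones cancels.
exact = (math.factorial(n_im1) * math.factorial(n_i) * math.factorial(n_ip1)) / (
         math.factorial(n_im1 + 1) * math.factorial(n_i - 2) * math.factorial(n_ip1 + 1))

# After cancelling common factors inside the factorials:
simplified = n_i * (n_i - 1) / ((n_im1 + 1) * (n_ip1 + 1))

# Large-n approximation:
approx = n_i**2 / (n_im1 * n_ip1)

print(exact, simplified, approx)   # all three agree to a few parts in 10^3
```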
Since
\[ n_{i-1} = \frac{N}{Z} \exp(-\beta \varepsilon_{i-1}) , \]
\[ n_i = \frac{N}{Z} \exp(-\beta \varepsilon_i) = \frac{N}{Z} \exp[-\beta(\varepsilon_{i-1} + \Delta\varepsilon)] = n_{i-1} \exp(-\beta \Delta\varepsilon) , \]
\[ n_{i+1} = \frac{N}{Z} \exp(-\beta \varepsilon_{i+1}) = \frac{N}{Z} \exp[-\beta(\varepsilon_{i-1} + 2\Delta\varepsilon)] = n_{i-1} \exp(-2\beta \Delta\varepsilon) , \]
we have
\[ \frac{\Omega'}{\Omega} \approx \frac{n_i^2}{n_{i-1} \, n_{i+1}} = \frac{\exp(-2\beta \Delta\varepsilon)}{\exp(-2\beta \Delta\varepsilon)} = 1 . \]
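This exact cancellation is easy to confirm numerically: Boltzmann occupations of equally spaced levels form a geometric sequence, for which $n_i^2 = n_{i-1} n_{i+1}$ identically (the numbers below are illustrative):

```python
import math

# Boltzmann occupations of three equally spaced levels form a geometric
# sequence: n_i = n_{i-1} * exp(-beta * d_eps). Values are assumed for the example.
beta_d_eps = 0.4                          # beta * d_eps (dimensionless)
n_im1 = 5.0e4
n_i = n_im1 * math.exp(-beta_d_eps)
n_ip1 = n_im1 * math.exp(-2.0 * beta_d_eps)

ratio = n_i**2 / (n_im1 * n_ip1)          # Omega'/Omega in the large-n limit
print(ratio)   # -> 1.0, up to floating-point rounding
```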

Thus, we have shown that when making a slight change away from the most probable distribution, the number of microstates in the new distribution is about the same as in the most probable distribution. Consequently, the total number of microstates of all distributions must be much larger than that of the single most probable distribution. That is,
\[ \Omega_{\text{total}} = \sum_j \Omega_j = \Omega_{\max} + \Omega' + \Omega_{\text{others}} \ggg \Omega_{\max} . \]
The fact that $\Omega_{\text{total}} \ggg \Omega_{\max}$ indicates that there are very many distributions that are close to the most probable distribution. The system is constantly in a state of flux, changing continuously from one distribution to another, but (nearly) always staying close to the most probable distribution at equilibrium.
Replacing $\ln \Omega_{\text{total}}$ with $\ln \Omega_{\max}$ despite $\Omega_{\text{total}} \ggg \Omega_{\max}$:
\[ \Omega_{\text{total}} = \sum_j \Omega_j = \Omega_{\max} + \Omega_{\text{others}} , \]
where $\Omega_j$ denotes the number of microstates in the j-th distribution, and $\Omega_{\max}$ denotes the maximum number of microstates, in the most probable distribution. Our interest is $\ln \Omega_{\text{total}}$, which is used to calculate the entropy $S$. However, we only know how to calculate $\Omega_{\max}$, corresponding to the most probable distribution. Fortunately, despite the fact that $\Omega_{\text{total}} \ggg \Omega_{\max}$, it is still true that $\ln \Omega_{\text{total}} \approx \ln \Omega_{\max}$. Therefore, we can use $\ln \Omega_{\max}$ to replace $\ln \Omega_{\text{total}}$ in calculating the entropy.
For example, suppose $\Omega_{\max} = \exp[\exp(30)]$ and $\Omega_{\text{total}} = \exp(30) \exp[\exp(30)]$. So $\Omega_{\text{total}} \ggg \Omega_{\max}$ (also, $\Omega_{\max} \lll \Omega_{\text{others}} = \Omega_{\text{total}} - \Omega_{\max}$). However, $\ln \Omega_{\max} = \exp(30)$ and $\ln \Omega_{\text{total}} = 30 + \exp(30)$. Clearly, $\ln \Omega_{\text{total}} \approx \ln \Omega_{\max}$.
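The arithmetic of this example can be checked directly, working with logarithms since $\Omega_{\max}$ itself is far too large to represent:

```python
import math

# The example Omega_max = exp[exp(30)] cannot be stored directly,
# so compare the logarithms instead.
ln_omega_max = math.exp(30)              # ln Omega_max = e^30, about 1.07e13
ln_omega_total = 30.0 + math.exp(30)     # ln(e^30 * exp[e^30]) = 30 + e^30

# Omega_total / Omega_max = e^30 ~ 10^13, yet the logs differ only
# by a relative amount of 30 * exp(-30):
rel_diff = ln_omega_total / ln_omega_max - 1.0
print(rel_diff)
```

So even though the microstate counts differ by a factor of about $10^{13}$, their logarithms agree to roughly twelve significant digits, which is why the replacement is harmless for the entropy.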

Note: There are several wrong statements in the textbook on page 77, which indicate $\Omega_{\max} \gg \Omega_{\text{others}}$ so that $\Omega_{\text{total}} \approx \Omega_{\max}$. Both of these statements are wrong! In fact, from the above analysis we learn that $\Omega_{\max}$ is not much larger than $\Omega_{\text{others}}$, but much smaller than $\Omega_{\text{others}}$ or $\Omega_{\text{total}}$, because there are many other distributions very close to the most probable distribution with about the same number of microstates. However, $\ln \Omega_{\text{total}} \approx \ln \Omega_{\max}$, which is all that we need.
So,
\[ \ln \Omega_{\text{total}} \approx \ln \Omega_{\max} = N \ln N - N - \sum_i n_i \ln n_i + \sum_i n_i , \]
or, since $\sum_i n_i = N$,
\[ \ln \Omega = N \ln N - \sum_i n_i \ln n_i \quad \text{(for simplicity, drop the subscript max)} . \]
Also,
\[ \ln n_i = \ln N - \ln Z - \beta \varepsilon_i , \]
so
\[ \ln \Omega = N \ln N - \sum_i n_i (\ln N - \ln Z - \beta \varepsilon_i) = N \ln N - N \ln N + N \ln Z + \beta \sum_i n_i \varepsilon_i . \]
So,
\[ \ln \Omega = N \ln Z + \beta U = N \ln Z + U / k_B T . \]
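This identity can be verified numerically for a small model system (the particle number, β, and energy levels below are assumed for illustration):

```python
import math

# Check ln(Omega) = N ln Z + beta*U against the Stirling form
# ln(Omega) = N ln N - sum_i n_i ln n_i, for an illustrative system.
N, beta = 1.0e6, 0.7
eps = [0.0, 1.0, 2.0, 3.0]     # energy levels, arbitrary units

Z = sum(math.exp(-beta * e) for e in eps)
n = [N / Z * math.exp(-beta * e) for e in eps]   # most probable occupations
U = sum(ni * e for ni, e in zip(n, eps))         # total energy

lhs = N * math.log(N) - sum(ni * math.log(ni) for ni in n)
rhs = N * math.log(Z) + beta * U

print(lhs, rhs)   # the two expressions agree
```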
