Lecture Note 20S, Supplement to Chapter 20 (bingweb.binghamton.edu/~suzuki/ThermoStatFIles/1.6..., 2019. 11. 9.)

Lecture Note 20S

Supplement to Chapter 20

Entropy

Introduction to Statistical Mechanics

Micro-canonical ensembles

Canonical ensembles

1. Definition of entropy in statistical mechanics

In statistical thermodynamics the entropy is defined as being proportional to the

logarithm of the number of microscopic configurations that result in the observed

macroscopic description of the thermodynamic system:

S = kB lnW

where kB is the Boltzmann constant and W is the number of microstates corresponding

to the observed thermodynamic macrostate. This definition is considered to be the

fundamental definition of entropy (as all other definitions can be mathematically derived

from it, but not vice versa). In Boltzmann's 1896 Lectures on Gas Theory, he showed that

this expression gives a measure of entropy for systems of atoms and molecules in the gas

phase, thus providing a measure for the entropy of classical thermodynamics.

In 1877, Boltzmann visualized a probabilistic way to measure the entropy of an

ensemble of ideal gas particles, in which he defined entropy to be proportional to the

logarithm of the number of microstates such a gas could occupy. Henceforth, the

essential problem in statistical thermodynamics, according to Erwin Schrödinger, has

been to determine the distribution of a given amount of energy E over N identical systems.

Statistical mechanics explains entropy as the amount of uncertainty (or

"mixedupness" in the phrase of Gibbs) which remains about a system, after its observable

macroscopic properties have been taken into account. For a given set of macroscopic

variables, like temperature and volume, the entropy measures the degree to which the

probability of the system is spread out over different possible quantum states. The more

states available to the system with higher probability, the greater the entropy. More

specifically, entropy is a logarithmic measure of the density of states. In essence, the

most general interpretation of entropy is as a measure of our uncertainty about a system.

The equilibrium state of a system maximizes the entropy because we have lost all

information about the initial conditions except for the conserved variables; maximizing

the entropy maximizes our ignorance about the details of the system. This uncertainty is

not of the everyday subjective kind, but rather the uncertainty inherent to the

experimental method and interpretative model.

2. Boltzmann’s principle

Microcanonical ensemble


We consider a system in the energy interval between E and E + δE, where δE is the order of the uncertainty. Each state in this range is assumed to have the same probability. Thus we have an ensemble characterized by the equal probability w_n of each state n, or

w_n = w = \text{const} \quad \text{for } E < E_n < E + \delta E,

which expresses the principle of equal probability. The number of microscopic states accessible to a macroscopic system is called the statistical weight (thermodynamic weight). The number of states for the system with energy between E and E + δE (the statistical weight) may be written as

W(E, \delta E) = \rho(E)\,\delta E,

where ρ(E) is the density of states. This particular ensemble is known as the microcanonical ensemble.

Here we show that the entropy S in the equilibrium state of the system can be

expressed by

S = k_B \ln W,

where W is the number of states.

Suppose that the entropy S is expressed as a function of W:

S = f(W).

We assume that there are two independent systems (A and B). These systems are in the macroscopic equilibrium states a and b, respectively. The numbers of states for these states are given by W_a and W_b. We now consider the combined system (C = A + B). The number of states for the combined state c = (a, b) is given by


W_c = W_a W_b.

The entropy of the system C is

S_C = S_A + S_B

from the additivity of the entropy. Then we have the relation

f(W_a W_b) = f(W_a) + f(W_b).

For simplicity, we put

\alpha = W_a, \qquad \beta = W_b.

Then the relation is rewritten as

f(\alpha\beta) = f(\alpha) + f(\beta).

Taking the derivative of both sides with respect to α,

\beta f'(\alpha\beta) = f'(\alpha).

Taking the derivative of both sides with respect to β,

\alpha f'(\alpha\beta) = f'(\beta).

Multiplying the first relation by α and the second by β, we get

\alpha f'(\alpha) = \beta f'(\beta) = k,

where k is independent of α and β. In other words, k should be a constant. The solution of this first-order differential equation is

f(\alpha) = k \ln \alpha.

When k = k_B (the Boltzmann constant), the entropy is given by

S = k_B \ln W.
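The functional-equation argument above can be checked numerically. The following Python sketch (illustrative, with k = 1 and arbitrary statistical weights as assumptions) verifies that f(W) = k ln W satisfies the additivity condition f(W_a W_b) = f(W_a) + f(W_b):

```python
import math

def f(W, k=1.0):
    # Candidate solution of the functional equation f(Wa*Wb) = f(Wa) + f(Wb)
    return k * math.log(W)

# Statistical weights of two independent subsystems (arbitrary illustrative values)
Wa, Wb = 3.0e5, 7.0e9
lhs = f(Wa * Wb)        # entropy of the combined system C = A + B
rhs = f(Wa) + f(Wb)     # sum of the subsystem entropies
assert math.isclose(lhs, rhs, rel_tol=1e-12)
print(lhs)
```

Any other choice of Wa and Wb gives the same agreement, since the logarithm is the unique continuous solution of the functional equation.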

3. Microcanonical ensemble for an ideal gas

The basic assumption of statistical mechanics is

all microstates are equally probable.


We consider a system of N free particles in a container with volume V. Each state of the system is specified by a point in a 6N-dimensional space (3N coordinates, 3N momenta), the so-called phase space:

(r1, r2,…,rN, p1, p2, … , pN) = (q1, q2, q3,…., q3N, p1, p2,…, p3N)

These states form a continuum, not a discrete set. For convenience, we assume that the 6N-dimensional phase space consists of unit cells with volume (Δq Δp)^{3N}, where the sizes of Δp and Δq are arbitrary. There is one state per volume (Δq Δp)^{3N}. Note that Δq Δp = h (Planck's constant) from Heisenberg's uncertainty principle. We define the number of states in a volume element dq_1 dq_2 dq_3 … dq_{3N} dp_1 dp_2 … dp_{3N} as

\frac{1}{(\Delta q\,\Delta p)^{3N}}\, dq_1 dq_2 dq_3 \cdots dq_{3N}\, dp_1 dp_2 \cdots dp_{3N}.

The total energy E is a sum of the kinetic energies of the particles. It is independent of the 3N coordinates {r_i}. We calculate the number of states when the total energy is between E and E + δE:

W(E, \delta E) = \frac{1}{(\Delta q\,\Delta p)^{3N}} \int dq_1 \cdots dq_{3N} \int dp_1 \cdots dp_{3N} = \frac{V^N}{(\Delta q\,\Delta p)^{3N}} \int dp_1 dp_2 \cdots dp_{3N},

where the momentum integral runs over the region

E \le (p_1^2 + p_2^2 + \cdots + p_{3N}^2)/2m \le E + \delta E.

The volume of the momentum space is the spherical shell between the sphere with radius R = \sqrt{2m(E+\delta E)} and the sphere of radius R = \sqrt{2mE}.

We calculate the number of states between E and E + δE as follows. First we calculate the volume of the 3N-dimensional (momentum) sphere with radius

R = \sqrt{2mE}.

The volume of the sphere is obtained as

\Omega_{3N}(R) = \frac{\pi^{3N/2}}{\Gamma(\frac{3N}{2}+1)} R^{3N} = \frac{(2\pi m E)^{3N/2}}{\Gamma(\frac{3N}{2}+1)},

where Γ(x) is the Gamma function of x. The total number of states N(E) for the system with energy between 0 and E is given by


N(E) = \frac{1}{N!}\,\frac{V^N}{(\Delta q\,\Delta p)^{3N}}\,\frac{(2\pi m E)^{3N/2}}{\Gamma(\frac{3N}{2}+1)},

where the factor N! is included because the N particles are indistinguishable. Note that N(E) is introduced for mathematical convenience, since we consider a system having energy only between E and E + δE. The number of states between E and E + δE is obtained as

W(E, \delta E) = \frac{dN(E)}{dE}\,\delta E = \frac{1}{N!}\,\frac{V^N}{(\Delta q\,\Delta p)^{3N}}\,\frac{(2\pi m E)^{3N/2}}{\Gamma(\frac{3N}{2})}\,\frac{\delta E}{E},

where ρ(E) = dN(E)/dE is the density of states. We use Heisenberg's uncertainty principle, Δp Δq = h (Planck's constant), which comes from the quantum mechanical requirement. Then we have

W(E, \delta E) = \frac{dN(E)}{dE}\,\delta E = \frac{V^N}{N!\,\Gamma(\frac{3N}{2})} \left(\frac{2\pi m E}{h^2}\right)^{3N/2} \frac{\delta E}{E} = \frac{V^N}{N!\,\Gamma(\frac{3N}{2})} \left(\frac{m E}{2\pi \hbar^2}\right)^{3N/2} \frac{\delta E}{E},

where ħ = h/(2π) (ħ: the Dirac constant).

Using Stirling's formula,

\ln \Gamma\!\left(\frac{3N}{2}\right) \simeq \frac{3N}{2}\ln\!\left(\frac{3N}{2}\right) - \frac{3N}{2} \quad \text{for } N \gg 1,

\ln N! \simeq N(\ln N - 1) \quad \text{for } N \gg 1.

Then we have

\ln W(E, \delta E) = N \ln V + \frac{3N}{2}\ln\!\left(\frac{mE}{2\pi\hbar^2}\right) - N(\ln N - 1) - \left[\frac{3N}{2}\ln\!\left(\frac{3N}{2}\right) - \frac{3N}{2}\right] + \ln\frac{\delta E}{E}
= N\left[\ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{mE}{3\pi\hbar^2 N}\right) + \frac{5}{2}\right] + \ln\frac{\delta E}{E}.

4. Entropy S

The entropy S is defined by

S = k_B \ln W(E, \delta E) = N k_B \left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{mE}{3\pi\hbar^2 N}\right)\right] + k_B \ln\frac{\delta E}{E}.


In the limit of N → ∞ the last term is negligible, and

S = N k_B \left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{mE}{3\pi\hbar^2 N}\right)\right].

So the entropy S is found to be an extensive parameter. The entropy S depends on N, E, and V:

S = S(N, E, V),

dS = \frac{\partial S(N,E,V)}{\partial E}\,dE + \frac{\partial S(N,E,V)}{\partial V}\,dV = \frac{1}{T}\,dE + \frac{P}{T}\,dV.

Then we have

\frac{\partial S(N,E,V)}{\partial E} = k_B \frac{\partial \ln W(E,\delta E)}{\partial E} = \frac{1}{T}

and

\frac{\partial S(N,E,V)}{\partial V} = \frac{P}{T}.

From this relation, E can be derived as a function of N, V, and T:

E = E(N, V, T).

The heat capacity C_V is given by

C_V = \left(\frac{\partial E}{\partial T}\right)_V.

In the above example, we have

\frac{\partial S(N,E,V)}{\partial E} = \frac{3 N k_B}{2}\,\frac{1}{E} = \frac{1}{T},

or

E = \frac{3}{2} N k_B T \quad \text{(internal energy)}.


and

\frac{\partial S(N,E,V)}{\partial V} = \frac{N k_B}{V} = \frac{P}{T},

we have

PV = N k_B T \quad \text{(Boyle's law)}.
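Both thermodynamic relations can be recovered numerically from S(N, E, V). A short Python sketch (in illustrative units with kB = ħ = m = 1, an assumption not in the original) differentiates the entropy and checks E = (3/2) N kB T and PV = N kB T:

```python
import math

kB = 1.0; hbar = 1.0; m = 1.0     # illustrative units (an assumption)
N = 100.0

def S(E, V):
    # Large-N microcanonical entropy of the ideal gas derived above
    return N * kB * (2.5 + math.log(V / N)
                     + 1.5 * math.log(m * E / (3 * math.pi * hbar ** 2 * N)))

E, V = 300.0, 50.0
h = 1e-6
dSdE = (S(E + h, V) - S(E - h, V)) / (2 * h)   # = 1/T
dSdV = (S(E, V + h) - S(E, V - h)) / (2 * h)   # = P/T
T = 1.0 / dSdE
P = T * dSdV
assert math.isclose(E, 1.5 * N * kB * T, rel_tol=1e-6)   # E = (3/2) N kB T
assert math.isclose(P * V, N * kB * T, rel_tol=1e-6)     # PV = N kB T
print(T, P)
```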

5. Typical Examples

5.1 Adiabatic free expansion (sudden expansion into a vacuum)

The free expansion is an irreversible process. Nevertheless, we can evaluate the entropy change along a reversible process connecting the same initial and final states. The entropy is given by

S = N k_B \left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{mE}{3\pi\hbar^2 N}\right)\right]
= N k_B \left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{m k_B T}{2\pi\hbar^2}\right)\right]
= N k_B \ln V + \frac{3}{2} N k_B \ln T + \text{const},

since E = (3/2) N k_B T. This expression is exactly the same as that derived previously

(Chapter 20). When the volume changes from V = V1 to V2 at constant T, the change in entropy (ΔS) is obtained as

\Delta S = N k_B \ln\frac{V_2}{V_1}.
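As a quick numerical illustration (not in the original), doubling the volume of one mole of gas gives ΔS = R ln 2:

```python
import math

kB = 1.380649e-23     # J/K
N = 6.02214076e23     # one mole of atoms
V1, V2 = 1.0, 2.0     # any units; only the ratio V2/V1 matters
dS = N * kB * math.log(V2 / V1)   # = R ln 2
print(dS)   # ≈ 5.76 J/K
```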

5.2 Isentropic process

The entropy remains constant if

\ln(V T^{3/2}) = \text{const}.

In an expansion at constant entropy from V1 to V2, we have

V_1 T_1^{3/2} = V_2 T_2^{3/2}

for an ideal monatomic gas. Since


\frac{P_1 V_1}{T_1} = \frac{P_2 V_2}{T_2},

we have

V_1 (P_1 V_1)^{3/2} = V_2 (P_2 V_2)^{3/2}, \quad \text{or} \quad P_1 V_1^{5/3} = P_2 V_2^{5/3}.

5.3 Mixing entropy of ideal gas

We consider two samples of the same ideal gas, separated by a barrier AB. The temperature T and the pressure P are the same on both sides:

P V_1 = N_1 k_B T, \qquad P V_2 = N_2 k_B T,

or

\frac{V_1}{N_1} = \frac{V_2}{N_2} = \frac{k_B T}{P}.

Adding these,

P(V_1 + V_2) = (N_1 + N_2) k_B T, \quad \text{or} \quad \frac{V_1 + V_2}{N_1 + N_2} = \frac{k_B T}{P}.

Suppose that the barrier is opened suddenly. We discuss the change of entropy of this system, using the expression for the entropy

S = N k_B \left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\frac{T}{T_0}\right],

where T_0 collects the constant factors appearing above.

Before opening the barrier AB, we have

S_i = S_1 + S_2,

where

S_1 = N_1 k_B \left[\frac{5}{2} + \ln\frac{V_1}{N_1} + \frac{3}{2}\ln\frac{T}{T_0}\right] = N_1 k_B \left[\frac{5}{2} + \ln\frac{k_B T}{P} + \frac{3}{2}\ln\frac{T}{T_0}\right],


S_2 = N_2 k_B \left[\frac{5}{2} + \ln\frac{V_2}{N_2} + \frac{3}{2}\ln\frac{T}{T_0}\right] = N_2 k_B \left[\frac{5}{2} + \ln\frac{k_B T}{P} + \frac{3}{2}\ln\frac{T}{T_0}\right].

After opening the barrier AB, the total number of atoms is N1 + N2 and the volume is V1 +

V2. Then we have the final entropy,

S_f = (N_1+N_2)\, k_B \left[\frac{5}{2} + \ln\frac{V_1+V_2}{N_1+N_2} + \frac{3}{2}\ln\frac{T}{T_0}\right] = (N_1+N_2)\, k_B \left[\frac{5}{2} + \ln\frac{k_B T}{P} + \frac{3}{2}\ln\frac{T}{T_0}\right].

Since

\Delta S = S_f - (S_1 + S_2) = 0,

as is expected (a reversible process), we have no change of entropy in this event.
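The cancellation can be verified numerically with the entropy form used above. The Python sketch below (units with kB = 1 and arbitrary particle numbers, an assumption) confirms that mixing two samples of the same gas at equal T and P gives ΔS = 0:

```python
import math

def S(Np, V, T, T0=1.0):
    # Entropy form used above, in units kB = 1 (an assumption):
    # S = N [5/2 + ln(V/N) + (3/2) ln(T/T0)]
    return Np * (2.5 + math.log(V / Np) + 1.5 * math.log(T / T0))

T = 2.0
N1, N2 = 4.0, 6.0
v = 3.0                          # common V/N = kB*T/P on both sides
V1, V2 = N1 * v, N2 * v
S_i = S(N1, V1, T) + S(N2, V2, T)    # before opening the barrier
S_f = S(N1 + N2, V1 + V2, T)         # after opening the barrier
assert math.isclose(S_i, S_f, rel_tol=1e-12)
print(S_f - S_i)   # ≈ 0: no entropy of mixing for identical gases
```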

((Gibbs paradox))

This success is partly due to the N! term in the denominator of W(E, δE). This is closely related to the fact that the particles are not distinguishable in our system.

((Micro-canonical ensemble))

So far we have considered the micro-canonical ensemble, where the system is isolated from its surroundings. The energy of the system is conserved. It is assumed that the energy of the system lies between E and E + δE, where δE is a small width. The concept of the thermal reservoir does not appear. Since the energy of the system is constant, the possible microstates have the same probability. For the energy E, the volume V, and the number of particles N, one can obtain the number of possible microstates. Then the entropy S(E, N, V) can be defined by

S = k_B \ln W(E, \delta E).

6. Canonical ensemble (system with constant temperature)

The theory of the micro-canonical ensemble is useful when the system is described by N, E, and V. In principle, this method is correct. In real calculations, however, it is not easy to calculate the number of states W(E, δE) in the general case. We have an alternative method, which is much more useful for calculations on real systems. The formulation of the canonical ensemble is a little different from that of the micro-canonical ensemble. Both ensembles lead to the same results for the same macroscopic system.

Canonical ensemble: (N, T, V constant)


Suppose that the system depends on N, T, and V. A practical method of keeping the

temperature of a system constant is to immerse it in a very large material with a large

heat capacity. If the material is very large, its temperature is not changed even if some

energy is given or taken by the system in contact. Such a heat reservoir serves as a

thermostat.

We consider the case of a small system S(I) in thermal contact with a heat reservoir W(II). The system S(I) is in thermal equilibrium with the reservoir W(II). S(I) and W(II) have a common temperature T. The system S(I) is a relatively small macroscopic system. The energy of S(I) is not fixed; only the total energy of the combined system is fixed:

E_T = E_{II} + E_i.

We assume that W_II(E_II) is the number of states in which the thermal bath has the energy E_II. If S(I) is in one definite state i, the probability p_i of finding the system (I) in the state i is proportional to W_II(E_T - E_i), since the thermal bath is then in one of the many states with the energy E_T - E_i:

p_i \propto W_{II}(E_{II}) = W_{II}(E_T - E_i),

or

\ln p_i = \ln[W_{II}(E_T - E_i)] + \text{const}.

Since E_T \gg E_i,


\ln[W_{II}(E_T - E_i)] can be expanded as

\ln[W_{II}(E_T - E_i)] \simeq \ln W_{II}(E_T) - \left.\frac{d \ln W_{II}(E_{II})}{dE_{II}}\right|_{E_T} E_i.

Then we obtain

p_i \propto \exp\!\left(-\left.\frac{d \ln W_{II}(E_{II})}{dE_{II}}\right|_{E_T} E_i\right). \tag{1}

Here we recall the definitions of entropy and temperature for the reservoir, as in the micro-canonical ensemble:

S_{II} = k_B \ln W_{II}(E_{II})

and

\frac{\partial S_{II}}{\partial E_{II}} = \frac{1}{T_{II}},

or

\frac{d \ln W_{II}(E_{II})}{dE_{II}} = \frac{1}{k_B}\frac{\partial S_{II}}{\partial E_{II}} = \frac{1}{k_B T_{II}}.

In thermal equilibrium, we have

T_{II} = T_I = T.

Then Eq. (1) can be rewritten as

p_i \propto \exp\!\left(-\frac{E_i}{k_B T}\right) = \exp(-\beta E_i),

where β = 1/(k_B T). The factor exp(-βE_i) is called the Boltzmann factor. We define the partition function Z as

Z = \sum_i e^{-\beta E_i}.

The probability is expressed by


p_i = \frac{1}{Z} e^{-\beta E_i}.

The summation in Z is over all states i of the system. We note that

\sum_i p_i = 1.

The average energy of the system is given by

\bar{E} = \frac{1}{Z}\sum_i E_i e^{-\beta E_i} = -\frac{\partial \ln Z}{\partial \beta},

since

-\frac{\partial \ln Z}{\partial \beta} = -\frac{1}{Z}\frac{\partial Z}{\partial \beta} = \frac{1}{Z}\sum_i E_i e^{-\beta E_i}.

Note that

\bar{E} = -\frac{\partial \ln Z}{\partial \beta} = k_B T^2 \frac{\partial \ln Z}{\partial T},

since ∂/∂β = -k_B T² ∂/∂T.
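The identity Ē = -∂ ln Z/∂β is easy to check on a toy spectrum. The Python sketch below (the four levels are an illustrative assumption, not from the text) compares the direct average Σ p_i E_i with a finite-difference derivative of ln Z:

```python
import math

E_levels = [0.0, 1.0, 2.0, 5.0]   # toy spectrum (an assumption)
beta = 0.7

def Z(b):
    # Partition function Z = sum_i exp(-b E_i)
    return sum(math.exp(-b * Ei) for Ei in E_levels)

p = [math.exp(-beta * Ei) / Z(beta) for Ei in E_levels]
assert math.isclose(sum(p), 1.0, rel_tol=1e-12)   # probabilities normalized

E_avg = sum(pi * Ei for pi, Ei in zip(p, E_levels))
h = 1e-6
dlnZ = (math.log(Z(beta + h)) - math.log(Z(beta - h))) / (2 * h)
assert math.isclose(E_avg, -dlnZ, rel_tol=1e-6)   # E = -d ln Z / d beta
print(E_avg)
```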

In summary, the representative points of the system I are distributed with a probability density proportional to exp(-βE_i). This is called the canonical ensemble, and this distribution of representative points is called the canonical distribution. The factor exp(-βE_i) is often referred to as the Boltzmann factor.

7. Pressure

The pressure P is defined as

P = \sum_i p_i P_i = \frac{1}{Z}\sum_i \left(-\frac{\partial E_i}{\partial V}\right) e^{-\beta E_i} = \frac{1}{\beta}\frac{1}{Z}\frac{\partial Z}{\partial V} = \frac{1}{\beta}\frac{\partial \ln Z}{\partial V}.

Here we define the Helmholtz free energy F as

F = \bar{E} - TS.

Then

dF = d\bar{E} - T\,dS - S\,dT = (T\,dS - P\,dV) - T\,dS - S\,dT = -P\,dV - S\,dT.

F is a function of T and V. From this equation, we have


P = -\left(\frac{\partial F}{\partial V}\right)_T, \qquad S = -\left(\frac{\partial F}{\partial T}\right)_V.

8. Helmholtz free energy and entropy

The Helmholtz free energy F is given by

F = -k_B T \ln Z.

((Proof))

We note that

\bar{E} = F + TS = F - T\left(\frac{\partial F}{\partial T}\right)_V = -T^2 \frac{\partial}{\partial T}\!\left(\frac{F}{T}\right) = k_B T^2 \frac{\partial \ln Z}{\partial T},

which leads to

F = -k_B T \ln Z.

What is the expression for the entropy in the canonical ensemble? The entropy is given by

S = \frac{U - F}{T},

where U is the average energy of the system,

U = \bar{E} = -\frac{\partial \ln Z}{\partial \beta}.

Then the entropy S is rewritten as

S = \frac{U}{T} + k_B \ln Z
= k_B \beta\,\frac{1}{Z}\sum_i E_i e^{-\beta E_i} + k_B \ln Z
= \sum_i p_i (k_B \beta E_i + k_B \ln Z)
= -k_B \sum_i p_i \ln p_i,

or

S = -k_B \sum_i p_i \ln p_i,

where p_i is the probability of the state i, given by

p_i = \frac{1}{Z} e^{-\beta E_i}.

The logarithm of p_i is

\ln p_i = -\beta E_i - \ln Z.

((Note))

We finally get a useful expression for the entropy, which is also applicable in information theory:

S = -k_B \sum_i p_i \ln p_i.

Suppose that there are 50 boxes, and there is one jewel in one of the 50 boxes. p_i is the probability of finding the jewel in the i-th box in one trial.

(a) There is no hint where the jewel is.

p_1 = p_2 = \cdots = p_{50} = \frac{1}{50},

S = -k_B \sum_s p_s \ln p_s = -k_B \ln\frac{1}{50} = k_B \ln 50 = 3.91\, k_B.

(b) There is a hint that the jewel is in one of the boxes with an even number.

p_1 = p_3 = \cdots = p_{49} = 0, \qquad p_2 = p_4 = \cdots = p_{50} = \frac{1}{25},

S = -k_B \sum_{s\ \mathrm{even}} p_s \ln p_s = -k_B \ln\frac{1}{25} = k_B \ln 25 = 3.219\, k_B.

(c) If you know that the jewel is in the 10-th box,

p10=1

ps = 0 (s ≠ 10)

S = -k_B\, p_{10} \ln p_{10} = 0.

The more information you have, the smaller the information entropy becomes.
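The three cases can be reproduced with a few lines of Python (entropy in units of kB):

```python
import math

def shannon_entropy(ps):
    # S/kB = -sum p ln p, with 0 ln 0 taken as 0
    return sum(-p * math.log(p) for p in ps if 0.0 < p < 1.0)

S_a = shannon_entropy([1 / 50] * 50)    # (a) no hint:    ln 50
S_b = shannon_entropy([1 / 25] * 25)    # (b) even boxes: ln 25
S_c = shannon_entropy([1.0])            # (c) box known:  0
print(round(S_a, 2), round(S_b, 3), S_c)   # 3.91 3.219 0
```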

9. Application

9.1. Partition function Z

The partition function Z for the ideal gas can be calculated as

Z = \frac{1}{N!}\left[\frac{V}{h^3}\int \exp\!\left(-\frac{p^2}{2m k_B T}\right) d^3p\right]^N = \frac{V^N}{N!}\left(\frac{2\pi m k_B T}{h^2}\right)^{3N/2}.

Using this expression for Z, the Helmholtz free energy F can be calculated as

F = -N k_B T\left[\ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{2\pi m k_B T}{h^2}\right) + 1\right].

The internal energy U is

U = -\frac{\partial \ln Z}{\partial \beta} = \frac{3}{2} N k_B T.

The heat capacity C_V at constant volume is given by

C_V = \left(\frac{\partial U}{\partial T}\right)_V = \frac{3}{2} N k_B.


The entropy S is

S = \frac{U - F}{T} = N k_B\left[\ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{m k_B T}{2\pi\hbar^2}\right) + \frac{5}{2}\right].

The pressure P is

P = -\left(\frac{\partial F}{\partial V}\right)_T = \frac{N k_B T}{V}.
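These canonical-ensemble results can be cross-checked numerically from ln Z. The Python sketch below (illustrative units with m = h = kB = 1, an assumption) differentiates ln Z with respect to β and recovers U = (3/2) N kB T:

```python
import math

m = 1.0; h_pl = 1.0; kB = 1.0; N = 10; V = 5.0   # illustrative units (assumption)

def lnZ(beta):
    # ln Z = N ln[V (2 pi m kB T / h^2)^(3/2)] - ln N!, with kB T = 1/beta
    return N * (math.log(V) + 1.5 * math.log(2 * math.pi * m / (beta * h_pl ** 2))) \
           - math.lgamma(N + 1)

beta = 0.5
T = 1.0 / (kB * beta)
eps = 1e-6
U = -(lnZ(beta + eps) - lnZ(beta - eps)) / (2 * eps)   # U = -d ln Z / d beta
assert math.isclose(U, 1.5 * N * kB * T, rel_tol=1e-6)
print(U)   # equals (3/2) N kB T = 30.0 here
```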

9.2 Maxwell’s distribution function

The Maxwell distribution function can be derived as follows:

dn(v) = f(v)\,dv = A\,\exp\!\left(-\frac{m v^2}{2 k_B T}\right) 4\pi v^2\,dv.

The normalization condition is

\int_0^\infty f(v)\,dv = 1.

The constant A is calculated as

A = \left(\frac{m}{2\pi k_B T}\right)^{3/2}.

Then we have

f(v) = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2 \exp\!\left(-\frac{m v^2}{2 k_B T}\right).

Since M = m N_A and R = N_A k_B, we have

f(v) = 4\pi \left(\frac{M}{2\pi R T}\right)^{3/2} v^2 \exp\!\left(-\frac{M v^2}{2 R T}\right),

which agrees with the expression of f(v) in Chapter 19.

((Mathematica))

f1 = Integrate[Exp[-m v^2/(2 kB T)] 4 Pi v^2, {v, 0, Infinity},
  GenerateConditions -> False]

  (2 Sqrt[2] Pi^(3/2))/(m/(kB T))^(3/2)

eq1 = A f1 == 1;
Solve[eq1, A]

  {{A -> (m/(kB T))^(3/2)/(2 Sqrt[2] Pi^(3/2))}}
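The same normalization can be checked without Mathematica. A crude trapezoidal quadrature in Python (units with m = kB T = 1, an assumption) confirms that f(v) integrates to 1:

```python
import math

def f(v, m=1.0, kBT=1.0):
    # Maxwell speed distribution in units m = kB*T = 1 (an assumption)
    a = m / (2 * math.pi * kBT)
    return 4 * math.pi * a ** 1.5 * v * v * math.exp(-m * v * v / (2 * kBT))

# crude trapezoidal quadrature of the normalization integral
n, vmax = 200_000, 12.0     # the tail beyond vmax is negligible here
dv = vmax / n
total = 0.5 * (f(0.0) + f(vmax)) * dv + sum(f(i * dv) for i in range(1, n)) * dv
assert abs(total - 1.0) < 1e-6
print(total)
```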

10. Comparison of the expression of S in the canonical ensemble with the

original definition of S in the microcanonical ensemble

The partition function Z can be written as

Z = \sum_i e^{-\beta E_i} = \int e^{-\beta\varepsilon} \rho(\varepsilon)\,d\varepsilon,

where we use ε instead of E (or E_j) in the expression. The partition function Z is the Laplace transform of the density of states ρ(ε). Here we show that

Z(\beta) = \int e^{-\beta\varepsilon}\rho(\varepsilon)\,d\varepsilon \simeq \sqrt{2\pi}\,\Delta\,\rho(\varepsilon^*)\,\exp(-\beta\varepsilon^*),

if the function Φ(ε), defined by

\Phi(\varepsilon) = \rho(\varepsilon)\,e^{-\beta\varepsilon},

can be approximated by a Gaussian function

\Phi(\varepsilon) \simeq \Phi(\varepsilon^*)\exp\!\left[-\frac{(\varepsilon-\varepsilon^*)^2}{2\Delta^2}\right],

where

\Phi(\varepsilon^*) = \rho(\varepsilon^*)\exp(-\beta\varepsilon^*).


Fig. Φ(ε) vs ε. Φ(ε) has a Gaussian distribution with width Δ around ε = ε* = Ē.

The function Φ(ε) has a local maximum at ε* = Ē. The logarithm of Φ(ε) is rewritten as

\ln\Phi(\varepsilon) = \ln[\rho(\varepsilon)\exp(-\beta\varepsilon)] = \ln\rho(\varepsilon) - \beta\varepsilon.

We take the derivative of ln Φ(ε) with respect to ε,

\frac{d\ln\Phi(\varepsilon)}{d\varepsilon} = \frac{\rho'(\varepsilon)}{\rho(\varepsilon)} - \beta,

where

\beta = \left.\frac{d\ln\rho(\varepsilon)}{d\varepsilon}\right|_{\varepsilon^*},

since

\Phi'(\varepsilon^*) = 0.

Here we define the number of states W(ε, Δ) by

W(\varepsilon^*, \Delta) = \sqrt{2\pi}\,\Delta\,\rho(\varepsilon^*).

Then we have

\left.\frac{d\ln W(\varepsilon,\Delta)}{d\varepsilon}\right|_{\varepsilon^*} = \beta, \tag{1}

since

\ln W(\varepsilon,\Delta) = \ln\rho(\varepsilon) + \ln(\sqrt{2\pi}\,\Delta), \qquad \left.\frac{d\ln W(\varepsilon,\Delta)}{d\varepsilon}\right|_{\varepsilon^*} = \left.\frac{d\ln\rho(\varepsilon)}{d\varepsilon}\right|_{\varepsilon^*} = \beta,

with Δ fixed. We note that

Z = \int_0^\infty \Phi(\varepsilon)\,d\varepsilon \simeq \Phi(\varepsilon^*)\int \exp\!\left[-\frac{(\varepsilon-\varepsilon^*)^2}{2\Delta^2}\right] d\varepsilon = \sqrt{2\pi}\,\Delta\,\rho(\varepsilon^*)\exp(-\beta\varepsilon^*).

Then we have

\ln Z = \ln[\sqrt{2\pi}\,\Delta\,\rho(\varepsilon^*)] - \beta\varepsilon^*.

The entropy S is calculated as

S = \frac{U}{T} + k_B \ln Z = \frac{\varepsilon^*}{T} + k_B \ln[\sqrt{2\pi}\,\Delta\,\rho(\varepsilon^*)] - k_B\beta\varepsilon^* = k_B \ln W(\varepsilon^*, \Delta), \tag{2}

since k_B β ε* = ε*/T.

Using Eqs. (1) and (2), we get

\frac{\partial S}{\partial E} = k_B\beta = \frac{1}{T},

or

k_B\,\frac{\partial \ln W(\varepsilon,\Delta)}{\partial \varepsilon} = \frac{1}{T},

in agreement with the micro-canonical definition.


11. Boltzmann-Planck’s method

Finally we show the standard method of derivation, which characterizes well the theory of canonical ensembles.

We consider the ways of distributing M total ensembles among states with energies E_j. Let M_j be the number of ensembles in the energy level E_j: M_1 ensembles for the energy E_1, M_2 ensembles for the energy E_2, and so on. The number of ways of distributing the M ensembles is given by

W = \frac{M!}{M_1!\,M_2!\cdots},

where

\sum_j M_j = M,

and the average energy Ē is given by

\bar{E} = \sum_j \frac{M_j}{M} E_j.

The entropy S is proportional to ln W,


\ln W = \ln M! - \sum_j \ln(M_j!).

Using Stirling's formula,

\ln W \simeq M(\ln M - 1) - \sum_j M_j(\ln M_j - 1) = M \ln M - \sum_j M_j \ln M_j,

in the limit of large M and M_j. We note that the probability of finding the state j is simply given by

P(E_j) = \frac{M_j}{M}.

Then we have

\frac{1}{M}\ln W = \ln M - \sum_j P(E_j)\ln[M P(E_j)]
= \ln M - \sum_j P(E_j)[\ln M + \ln P(E_j)]
= -\sum_j P(E_j)\ln[P(E_j)],

which is subject to the conditions

\sum_j P(E_j) = 1, \qquad \sum_j E_j P(E_j) = \bar{E}.

Treating the P(E_j) as continuous variables, we have the variational equation

\delta\left[-\sum_j P(E_j)\ln P(E_j) - \alpha \sum_j P(E_j) - \beta \sum_j E_j P(E_j)\right] = \sum_j \left[-\ln P(E_j) - (1+\alpha) - \beta E_j\right]\delta P(E_j) = 0,


which gives P(E_j) for the maximum W. Here α and β are Lagrange's indeterminate multipliers. Thus we obtain

\ln P(E_j) + (1+\alpha) + \beta E_j = 0,

or

P(E_j) = C \exp(-\beta E_j),

or

P(E_j) = \frac{1}{Z(\beta)} \exp(-\beta E_j),

where

Z(\beta) = \sum_j \exp(-\beta E_j)

and

\beta = 1/(k_B T).

With the above P(E_j), the entropy S is expressed by

S = k_B \ln W = -M k_B \sum_j P(E_j)\ln[P(E_j)]

for the total system composed of M ensembles. Therefore, the entropy of each ensemble is

S = -k_B \sum_j P(E_j)\ln[P(E_j)].
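That the Boltzmann form is indeed the constrained maximum can be probed numerically: perturbing P(E_j) in any direction that preserves Σ P = 1 and Σ E P = Ē must lower -Σ P ln P. A small Python sketch with a toy three-level spectrum (an assumption, not from the text):

```python
import math

E = [0.0, 1.0, 3.0]       # toy spectrum (an assumption)
beta = 0.8
Z = sum(math.exp(-beta * e) for e in E)
p = [math.exp(-beta * e) / Z for e in E]

def entropy(q):
    # S/kB = -sum q ln q
    return -sum(x * math.log(x) for x in q)

# A direction preserving both sum(q) = 1 and sum(q*E) = <E> must be
# orthogonal to (1,1,1) and (E0,E1,E2): use their cross product.
d = (E[1] - E[2], E[2] - E[0], E[0] - E[1])
S0 = entropy(p)
for t in (1e-3, -1e-3):
    q = [pi + t * di for pi, di in zip(p, d)]
    assert math.isclose(sum(q), 1.0, abs_tol=1e-12)
    assert math.isclose(sum(qi * e for qi, e in zip(q, E)),
                        sum(pi * e for pi, e in zip(p, E)), abs_tol=1e-12)
    assert entropy(q) < S0   # any feasible perturbation lowers the entropy
print(S0)
```

Because the entropy is strictly concave, the inequality is strict for every nonzero feasible perturbation, matching the variational result above.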

12. Density of states for quantum box (ideal gas)

(a) Energy levels in 1D system

We consider a free electron gas in a 1D system. The Schrödinger equation is given by

H\psi_k(x) = \frac{p^2}{2m}\psi_k(x) = -\frac{\hbar^2}{2m}\frac{d^2\psi_k(x)}{dx^2} = \varepsilon_k\,\psi_k(x), \tag{1}


where

p = \frac{\hbar}{i}\frac{d}{dx},

and ε_k is the energy of the particle in the orbital. The orbital is defined as a solution of the wave equation for a system of only one electron: the one-electron problem. Using a periodic boundary condition, ψ_k(x + L) = ψ_k(x), we have

\psi_k(x) \sim e^{ikx}, \tag{2}

with

\varepsilon_k = \frac{\hbar^2}{2m}k^2 = \frac{\hbar^2}{2m}\left(\frac{2\pi}{L}\right)^2 n^2,

e^{ikL} = 1 \quad \text{or} \quad k = \frac{2\pi}{L}n,

where n = 0, ±1, ±2,…, and L is the size of the system.

(b) Energy levels in 3D system

We consider the Schrödinger equation of an electron confined to a cube of edge L:

H\psi_{\mathbf{k}}(\mathbf{r}) = \frac{\mathbf{p}^2}{2m}\psi_{\mathbf{k}}(\mathbf{r}) = -\frac{\hbar^2}{2m}\nabla^2\psi_{\mathbf{k}}(\mathbf{r}) = \varepsilon_{\mathbf{k}}\,\psi_{\mathbf{k}}(\mathbf{r}). \tag{3}

It is convenient to introduce wavefunctions that satisfy periodic boundary conditions.

Boundary conditions (Born-von Karman boundary conditions):

\psi_{\mathbf{k}}(x+L, y, z) = \psi_{\mathbf{k}}(x, y, z),
\psi_{\mathbf{k}}(x, y+L, z) = \psi_{\mathbf{k}}(x, y, z),
\psi_{\mathbf{k}}(x, y, z+L) = \psi_{\mathbf{k}}(x, y, z).

The wavefunctions have the form of a traveling plane wave,

\psi_{\mathbf{k}}(\mathbf{r}) = e^{i\mathbf{k}\cdot\mathbf{r}}, \tag{4}

with


k_x = (2π/L) n_x, (n_x = 0, ±1, ±2, ±3, …),

k_y = (2π/L) n_y, (n_y = 0, ±1, ±2, ±3, …),

k_z = (2π/L) n_z, (n_z = 0, ±1, ±2, ±3, …).

The components of the wavevector k are the quantum numbers, along with the quantum

number ms of the spin direction. The energy eigenvalue is

\varepsilon(\mathbf{k}) = \frac{\hbar^2}{2m}(k_x^2 + k_y^2 + k_z^2) = \frac{\hbar^2}{2m}k^2. \tag{5}

Here

\mathbf{p}\,\psi_{\mathbf{k}}(\mathbf{r}) = \frac{\hbar}{i}\nabla\psi_{\mathbf{k}}(\mathbf{r}) = \hbar\mathbf{k}\,\psi_{\mathbf{k}}(\mathbf{r}), \tag{6}

so that the plane wave function ψ_k(r) is an eigenfunction of p with the eigenvalue ħk.

In the ground state of a system of N electrons, the occupied orbitals are represented as points inside a sphere in k-space.

(c) Density of states

Because we assume that the electrons are non-interacting, we can build up the N-

electron ground state by placing electrons into the allowed one-electron levels we have

just found. The one-electron levels are specified by the wave-vectors k and by the

projection of the electron’s spin along an arbitrary axis, which can take either of the two

values ±ħ/2. Therefore associated with each allowed wave vector k are two levels:

|k, ↑⟩, |k, ↓⟩.


Fig. Density of states in the 3D k-space. There is one state per volume (2π/L)³.

There is one state per volume (2π/L)³ of k-space. We consider the number of one-electron levels in the energy range from ε to ε + dε, D(ε)dε:

D(\varepsilon)\,d\varepsilon = 2\left(\frac{L}{2\pi}\right)^3 4\pi k^2\,dk, \tag{13}

where D(ε) is called the density of states.
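The counting behind Eq. (13) can be checked directly: the number of k-lattice points (times 2 for spin) inside a sphere of radius K should approach 2 (L/2π)³ (4π/3) K³, whose derivative with respect to ε reproduces D(ε). A rough Python check, with L = 2π chosen so the k-lattice spacing is 1 (an assumption):

```python
import math

# Count plane-wave states k = (2*pi/L)(nx,ny,nz), x2 for spin, with |k| <= K,
# and compare with N(K) = 2 (L/2pi)^3 (4pi/3) K^3.
L = 2 * math.pi            # so the k-lattice spacing is 1 (an assumption)
K = 20.0
nmax = int(K) + 1
count = 2 * sum(1 for nx in range(-nmax, nmax + 1)
                  for ny in range(-nmax, nmax + 1)
                  for nz in range(-nmax, nmax + 1)
                  if nx * nx + ny * ny + nz * nz <= K * K)
formula = 2 * (L / (2 * math.pi)) ** 3 * (4 * math.pi / 3) * K ** 3
assert abs(count - formula) / formula < 0.02   # agreement up to surface effects
print(count, formula)
```

The residual discrepancy is a surface (shell) effect that vanishes relative to the total as K grows.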

13. Application of canonical ensemble for ideal gas

(a) Partition function for the system with one atom, Z1

The partition function Z1 is given by


Z_1 = \sum_{\mathbf{k}} \exp\!\left(-\frac{\beta\hbar^2 k^2}{2m}\right)
= \frac{V}{(2\pi)^3}\int \exp\!\left(-\frac{\beta\hbar^2 k^2}{2m}\right) d^3k
= \frac{V}{(2\pi)^3}\,4\pi\int_0^\infty k^2 \exp(-C k^2)\,dk
= \frac{V}{8\pi^{3/2} C^{3/2}},

where V = L³, C = βħ²/(2m), and β = 1/(k_B T).

((Mathematica))

Clear["Global`*"];
f1 = V/(2 Pi)^3 4 Pi k^2 Exp[-C1 k^2];
Integrate[f1, {k, 0, Infinity}, Assumptions -> C1 > 0] // Simplify

  V/(8 C1^(3/2) Pi^(3/2))

Then the partition function Z1 can be rewritten as

Z_1 = \frac{V}{8\pi^{3/2}}\left(\frac{2m k_B T}{\hbar^2}\right)^{3/2} = V\left(\frac{m k_B T}{2\pi\hbar^2}\right)^{3/2} = n_Q V,

where n_Q is the quantum concentration, defined by

n_Q = \left(\frac{m k_B T}{2\pi\hbar^2}\right)^{3/2}.


nQ is the concentration associated with one atom in a cube of side equal to the thermal

average de Broglie wavelength.

Fig. Definition of quantum concentration. The de Broglie wavelength is on the order

of interatomic distance.

\lambda = \frac{h}{p} = \frac{2\pi\hbar}{m\bar{v}},

where v̄ is the average thermal velocity of the atoms. Using the equipartition law, we get the relation

\frac{1}{2}m\bar{v}^2 = \frac{3}{2}k_B T, \quad \text{or} \quad \bar{v} = \sqrt{\frac{3 k_B T}{m}}.

Then we have

\lambda = \frac{2\pi\hbar}{m\bar{v}} = \frac{2\pi\hbar}{\sqrt{3 m k_B T}} = \sqrt{\frac{2\pi}{3}}\left(\frac{2\pi\hbar^2}{m k_B T}\right)^{1/2} = 1.4472\; n_Q^{-1/3},

where

n_Q = \left(\frac{m k_B T}{2\pi\hbar^2}\right)^{3/2}.


It follows that

\lambda^3 \approx \frac{1}{n_Q}.

((Definition))

n/n_Q \ll 1 \quad \Rightarrow \quad \text{classical regime}.

An ideal gas is defined as a gas of non-interacting atoms in the classical regime.

((Example)) For 4He gas at P = 1 atm and T = 300 K, the concentration n is evaluated as

n = \frac{N}{V} = \frac{P}{k_B T} = 2.446\times 10^{19}\ \text{cm}^{-3}.

The quantum concentration n_Q is calculated as

n_Q = 7.8122\times 10^{24}\ \text{cm}^{-3},

which means that n ≪ n_Q: the gas is in the classical regime. Note that the mass of 4He is given by

m = 4u = 6.6422\times 10^{-24}\ \text{g},

where u is the atomic mass unit.

((Mathematica))

Clear["Global`*"];
rule1 = {kB -> 1.3806504 10^-16, NA -> 6.02214179 10^23,
   ℏ -> 1.054571628 10^-27, amu -> 1.660538782 10^-24,
   atm -> 1.01325 10^6};
T1 = 300; P1 = 1 atm /. rule1;
m1 = 4 amu /. rule1

  6.64216×10^-24

nQ = (m1 kB T1/(2 π ℏ^2))^(3/2) /. rule1

  7.81219×10^24

n1 = P1/(kB T1) /. rule1

  2.44631×10^19
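The same numbers follow in Python with the CGS constants used in the Mathematica block above:

```python
import math

# CGS constants, matching the Mathematica rule1 above
kB = 1.3806504e-16       # erg/K
hbar = 1.054571628e-27   # erg s
amu = 1.660538782e-24    # g
atm = 1.01325e6          # dyn/cm^2

T = 300.0
P = 1 * atm
m = 4 * amu              # 4He
nQ = (m * kB * T / (2 * math.pi * hbar ** 2)) ** 1.5
n = P / (kB * T)
print(f"{nQ:.4e} {n:.4e}")   # 7.8122e+24 2.4463e+19  (per cm^3)
assert n / nQ < 1e-4         # deep in the classical regime
```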

(b) Partition function of the system with N atoms

Suppose that the gas contains N atoms in a volume V. The partition function Z_N, which takes into account the indistinguishability of the atoms (through division by the factor N!), is given by

Z_N = \frac{Z_1^N}{N!}.

Using Z_1 = n_Q V, we get

\ln Z_N = N\ln(n_Q V) - \ln N! = N[\ln(n_Q V) - \ln N + 1],

where we use Stirling's formula,

\ln N! \simeq N\ln N - N,


in the limit of large N. The Helmholtz free energy is given by

F = -k_B T\ln Z_N = -N k_B T[\ln(n_Q V) - \ln N + 1] = -N k_B T\left[\ln\!\left(\frac{n_Q V}{N}\right) + 1\right]
= -N k_B T\left[\ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{m k_B T}{2\pi\hbar^2}\right) + 1\right],

since

\ln\!\left(\frac{n_Q V}{N}\right) = \ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{m k_B T}{2\pi\hbar^2}\right).

The entropy S is obtained as

$$S = -\left(\frac{\partial F}{\partial T}\right)_V = N k_B\left[\ln\frac{V}{N} + \frac{3}{2}\ln T + \frac{3}{2}\ln\left(\frac{m k_B}{2\pi\hbar^2}\right) + \frac{5}{2}\right] = N k_B\left(\ln\frac{V}{N} + \frac{3}{2}\ln T + \sigma_0\right),$$

where

$$\sigma_0 = \frac{5}{2} + \frac{3}{2}\ln\left(\frac{m k_B}{2\pi\hbar^2}\right).$$

Note that S can be rewritten as

$$S = N k_B\left[\ln\left(\frac{n_Q}{n}\right) + \frac{5}{2}\right] \qquad \text{(Sackur–Tetrode equation)}$$

((Sackur–Tetrode equation))

The Sackur–Tetrode equation is named for Hugo Martin Tetrode (1895–1931) and Otto Sackur (1880–1914), who developed it independently, at about the same time in 1912, as a solution of Boltzmann's gas statistics and entropy equations.

https://en.wikipedia.org/wiki/Sackur%E2%80%93Tetrode_equation


In the classical regime ($n/n_Q \ll 1$, or $n_Q/n \gg 1$), we have

$$\ln\frac{n_Q}{n} > 0.$$

The internal energy E is given by

$$E = F + TS = -N k_B T\left[\ln\left(\frac{n_Q}{n}\right) + 1\right] + N k_B T\left[\ln\left(\frac{n_Q}{n}\right) + \frac{5}{2}\right] = \frac{3}{2} N k_B T.$$

Note that E depends only on T for the ideal gas (Joule's law). The factor 3/2 arises from the exponent of T in $n_Q$ because the gas is in 3D. If the gas were in 1D or 2D, the factor would be 1/2 or 1, respectively.

(c) Pressure P

The pressure P is defined by

$$P = -\left(\frac{\partial F}{\partial V}\right)_T = \frac{N k_B T}{V},$$

leading to Boyle's law. Then PV is

$$PV = N k_B T = \frac{2}{3} E \qquad \text{(Bernoulli's equation)}$$

(d) Heat capacity

The heat capacity at fixed volume V is given by

$$C_V = T\left(\frac{\partial S}{\partial T}\right)_V = \frac{3}{2} N k_B.$$

When $N = N_A$ (one mole), we have

$$C_V = \frac{3}{2} R.$$

$C_p$ is the heat capacity at constant P. Since

$$dU = T\,dS - P\,dV, \quad \text{or} \quad T\,dS = dU + P\,dV,$$

we get

$$C_p = T\left(\frac{\partial S}{\partial T}\right)_P = \left(\frac{\partial U}{\partial T}\right)_P + P\left(\frac{\partial V}{\partial T}\right)_P,$$

with

$$P\left(\frac{\partial V}{\partial T}\right)_P = N_A k_B = R.$$

We note that

$$E = \frac{3}{2} N_A k_B T.$$

E is independent of P and V, and depends only on T (Joule's law). Therefore

$$\left(\frac{\partial U}{\partial T}\right)_P = \left(\frac{\partial U}{\partial T}\right)_V = \frac{3}{2} N_A k_B = C_V.$$

Thus we have

$$C_P = C_V + R = \frac{3}{2} R + R = \frac{5}{2} R.$$

((Mayer's relation))

$$C_P - C_V = R \quad \text{for an ideal gas with 1 mole.}$$

The ratio $\gamma$ is defined by

$$\gamma = \frac{C_P}{C_V} = \frac{5}{3}.$$

(e) Isentropic process (constant entropy)

The entropy S is given by

$$S = N k_B\left(\ln\frac{V}{N} + \frac{3}{2}\ln T + \sigma_0\right) = N k_B\left[\ln\left(\frac{V T^{3/2}}{N}\right) + \sigma_0\right].$$

The isentropic process is described by

$$V T^{3/2} = \text{const}, \quad \text{or} \quad T V^{2/3} = \text{const}.$$

Using the ideal gas law ($PV = RT$), we get

$$\frac{P V^{5/3}}{R} = \text{const}, \quad \text{or} \quad P V^{5/3} = \text{const}.$$

Since $\gamma = 5/3$, we get the relation

$$P V^{\gamma} = \text{constant}.$$
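The isentropic relations can be verified numerically. The sketch below (Python, SI units, 1 mole of a monatomic ideal gas assumed) doubles the volume isentropically and checks that $T V^{2/3}$ and $P V^{5/3}$ stay constant while PV does not:

```python
R = 8.314  # gas constant, J/(mol K); 1 mole assumed

V1, T1 = 1.0e-3, 300.0            # initial volume (m^3) and temperature (K)
P1 = R * T1 / V1                  # ideal gas law

V2 = 2.0 * V1                     # isentropic doubling of the volume
T2 = T1 * (V1 / V2)**(2.0 / 3.0)  # from T V^(2/3) = const
P2 = R * T2 / V2

print(abs(T1 * V1**(2/3) - T2 * V2**(2/3)) < 1e-9)  # True: T V^(2/3) invariant
print(abs(P1 * V1**(5/3) - P2 * V2**(5/3)) < 1e-6)  # True: P V^(5/3) invariant
print(abs(P1 * V1 - P2 * V2) > 1.0)                 # True: PV is NOT invariant
```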

14. The expression of entropy: $S = k_B \ln W(E)$

The entropy is closely related to the number of states, in particular to $\ln W$. In order to find such a relation, we start with the partition function

$$Z = \int W(E) \exp(-\beta E)\, dE = \int \exp[-N f(E)]\, dE,$$

where $W(E)$ is the number of states with the energy E. The function $f(E)$ is defined by

$$f(E) = \frac{1}{N}[\beta E - \ln W(E)] = \frac{1}{N}\left[\frac{E}{k_B T} - \ln W(E)\right].$$

In the limit of large N, $f(E)$ is expanded using the Taylor expansion, as

$$f(E) = f(E^*) + \left.\frac{\partial f(E)}{\partial E}\right|_{E=E^*}(E - E^*) + \cdots,$$

where

$$\frac{\partial f(E)}{\partial E} = \frac{1}{N}\left[\frac{1}{k_B T} - \frac{\partial \ln W(E)}{\partial E}\right] = 0 \quad \text{at } E = E^*,$$

or

$$\frac{1}{T} = k_B \left.\frac{\partial \ln W(E)}{\partial E}\right|_{E=E^*}.$$

At $E = E^*$,

$$Z \simeq \exp[-N f(E^*)].$$


For simplicity, we use E instead of $E^*$. The Helmholtz free energy F is defined by

$$F = -k_B T \ln Z = k_B T N f(E) = E - k_B T \ln W(E),$$

or

$$F = E - TS,$$

leading to the expression of the entropy S as

$$S = k_B \ln W(E),$$

and

$$\frac{1}{T} = \frac{\partial S}{\partial E}.$$

15. The expression of entropy: $S = -k_B \sum_{\varepsilon} p_{\varepsilon} \ln p_{\varepsilon}$

We consider the probability given by

$$p_{\varepsilon} = \frac{1}{Z} e^{-\beta\varepsilon},$$

where

$$Z = \sum_{\varepsilon} e^{-\beta\varepsilon}, \qquad \ln p_{\varepsilon} = -\beta\varepsilon - \ln Z.$$

The energy E is given by

$$U = \langle E \rangle = \sum_{\varepsilon} p_{\varepsilon}\, \varepsilon = \frac{1}{Z}\sum_{\varepsilon} \varepsilon\, e^{-\beta\varepsilon} = -\frac{1}{Z}\frac{\partial Z}{\partial \beta} = -\frac{\partial \ln Z}{\partial \beta}.$$


The entropy is a logarithmic measure of the number of states with significant probability of being occupied. The Helmholtz free energy F is defined by

$$F = U - TS = -k_B T \ln Z.$$

The entropy S is obtained as

$$S = \frac{U - F}{T} = \frac{U}{T} + k_B \ln Z.$$

We note that

$$\begin{aligned}
-k_B \sum_{\varepsilon} p_{\varepsilon} \ln p_{\varepsilon} &= -k_B \sum_{\varepsilon} p_{\varepsilon}(-\beta\varepsilon - \ln Z) \\
&= k_B \beta U + k_B \ln Z \\
&= \frac{U}{T} + k_B \ln Z = \frac{U - F}{T} = S.
\end{aligned}$$

Thus it follows that the entropy S is given by

$$S = -k_B \sum_{\varepsilon} p_{\varepsilon} \ln p_{\varepsilon}.$$
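The identity $-k_B \sum_{\varepsilon} p_{\varepsilon} \ln p_{\varepsilon} = (U - F)/T$ can be verified numerically for any finite level scheme. The sketch below uses units with $k_B = 1$ and an arbitrary example spectrum:

```python
import math

kB = 1.0
T = 2.0
beta = 1.0 / (kB * T)
energies = [0.0, 1.0, 3.0, 7.0]  # arbitrary level scheme (illustrative only)

Z = sum(math.exp(-beta * e) for e in energies)
p = [math.exp(-beta * e) / Z for e in energies]

U = sum(pe * e for pe, e in zip(p, energies))   # U = sum p_eps * eps
F = -kB * T * math.log(Z)                       # F = -kB T ln Z
S_thermo = (U - F) / T
S_gibbs = -kB * sum(pe * math.log(pe) for pe in p)

print(abs(S_thermo - S_gibbs) < 1e-12)  # True: the two expressions agree
```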

16. Thermal average of energy fluctuation

The mean square fluctuation of the energy is

$$\langle E^2 \rangle - \langle E \rangle^2 = \frac{1}{Z}\sum_E E^2 e^{-\beta E} - \left(\frac{1}{Z}\sum_E E\, e^{-\beta E}\right)^2 = \langle (E - \langle E \rangle)^2 \rangle.$$

We evaluate

$$\begin{aligned}
\frac{d}{dT}\langle E \rangle &= \frac{d\beta}{dT}\,\frac{d}{d\beta}\langle E \rangle = -\frac{1}{k_B T^2}\,\frac{d}{d\beta}\left[\frac{1}{Z}\sum_E E\, e^{-\beta E}\right] \\
&= -\frac{1}{k_B T^2}\left[-\frac{1}{Z}\sum_E E^2 e^{-\beta E} - \frac{1}{Z^2}\frac{dZ}{d\beta}\sum_E E\, e^{-\beta E}\right] \\
&= \frac{1}{k_B T^2}\left[\langle E^2 \rangle - \langle E \rangle^2\right],
\end{aligned}$$

where

$$\frac{dZ}{d\beta} = -\sum_E E\, e^{-\beta E} = -Z \langle E \rangle.$$

Then we have

$$\frac{d}{dT}\langle E \rangle = \frac{1}{k_B T^2}\left[\langle E^2 \rangle - \langle E \rangle^2\right].$$

Since $\frac{d}{dT}\langle E \rangle = C_V$, we get the relation

$$C_V = \frac{1}{k_B T^2}\left[\langle E^2 \rangle - \langle E \rangle^2\right].$$
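The fluctuation relation can be tested for a discrete spectrum by computing $C_V$ from a finite-difference derivative of $\langle E \rangle(T)$ and comparing $k_B T^2 C_V$ with $\langle E^2 \rangle - \langle E \rangle^2$. A sketch in units with $k_B = 1$; the levels are an arbitrary example:

```python
import math

kB = 1.0
energies = [0.0, 1.0, 2.0, 5.0]  # arbitrary level scheme (illustrative only)

def averages(T):
    """Return (<E>, <E^2>) in the canonical ensemble at temperature T."""
    beta = 1.0 / (kB * T)
    w = [math.exp(-beta * e) for e in energies]
    Z = sum(w)
    E1 = sum(wi * e for wi, e in zip(w, energies)) / Z
    E2 = sum(wi * e * e for wi, e in zip(w, energies)) / Z
    return E1, E2

T, dT = 1.5, 1e-5
E1, E2 = averages(T)
Cv = (averages(T + dT)[0] - averages(T - dT)[0]) / (2 * dT)  # d<E>/dT

print(abs(kB * T**2 * Cv - (E2 - E1**2)) < 1e-6)  # True
```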

17. Example: 4He atom as ideal gas

We consider the 4He atom with mass

$$m = 4u = 6.64216 \times 10^{-24}\ \text{g}.$$

The number density n at P = 1 atm and T = 300 K is

$$n = 2.44631 \times 10^{19}\ \text{/cm}^3.$$

The number of atoms in the volume V = 10^3 cm^3 is

$$N = nV = 2.44631 \times 10^{22}.$$

The internal energy is

$$E = \frac{3}{2} N k_B T = 151.987\ \text{J}.$$

The entropy S is

$$S = N k_B\left[\ln\frac{V}{N} + \frac{3}{2}\ln T + \frac{3}{2}\ln\left(\frac{m k_B}{2\pi\hbar^2}\right) + \frac{5}{2}\right] = 5.125\ \text{J/K}.$$

((Mathematica))


Clear["Global`*"];
rule1 = {kB -> 1.3806504*10^-16, NA -> 6.02214179*10^23,
   ℏ -> 1.054571628*10^-27, amu -> 1.660538782*10^-24,
   atm -> 1.01325*10^6, bar -> 10^6, J -> 10^7};
n1 = 2.44631*10^19; V1 = 10^3; T1 = 300;
N1 = n1 V1
2.44631*10^22
m1 = 4 amu /. rule1
6.64216*10^-24
P1 = kB N1 T1/V1 /. rule1
1.01325*10^6
P1/bar /. rule1
1.01325
E1 = (3/2) kB N1 T1 /. rule1
1.51987*10^9
E1/J /. rule1
151.987
S1 = kB N1 (3/2 Log[T1] + Log[V1/N1] + 3/2 Log[m1 kB/(2 π ℏ^2)] + 5/2) /. rule1
5.12503*10^7
S1/J /. rule1
5.12503
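The same numbers can be reproduced outside Mathematica. A minimal Python sketch using the note's CGS constants (1 erg = 10^-7 J):

```python
import math

# CGS constants from the note's rule1
kB = 1.3806504e-16; hbar = 1.054571628e-27; amu = 1.660538782e-24

T = 300.0          # K
n = 2.44631e19     # concentration, 1/cm^3
V = 1.0e3          # volume, cm^3
N = n * V          # number of atoms
m = 4.0 * amu      # mass of 4He, g

E = 1.5 * N * kB * T   # internal energy, erg
S = N * kB * (1.5 * math.log(T) + math.log(V / N)
              + 1.5 * math.log(m * kB / (2 * math.pi * hbar**2)) + 2.5)  # erg/K

print(f"E = {E * 1e-7:.3f} J")    # ~151.987 J
print(f"S = {S * 1e-7:.3f} J/K")  # ~5.125 J/K
```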

18. Link

Entropy (Wikipedia)

http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)


________________________________________________________________________

APPENDIX-I

Density of states for N particles for micro-canonical ensemble

Using the density of states for the 1-particle system, $D_1(\varepsilon) = C_1 \varepsilon^{1/2}$, the density of states for the 2-particle system is estimated as

$$D_2(E_2) = \int_0^{E_2} D_1(\varepsilon_1)\, D_1(E_2 - \varepsilon_1)\, d\varepsilon_1 = C_1^{\,2} \int_0^{E_2} \varepsilon_1^{1/2}(E_2 - \varepsilon_1)^{1/2}\, d\varepsilon_1 = C_1^{\,2}\, E_2^{\,2}\, f(2),$$

where $E_2$ is the total energy of the two-particle system, and

$$f(n) = \int_0^1 x^{1/2}(1-x)^{3(n-1)/2 - 1}\, dx = \frac{\Gamma\!\left(\frac{3}{2}\right)\Gamma\!\left(\frac{3(n-1)}{2}\right)}{\Gamma\!\left(\frac{3n}{2}\right)}.$$

Similarly, the density of states for the 3-particle system is

$$D_3(E_3) = \int_0^{E_3} D_2(\varepsilon_2)\, D_1(E_3 - \varepsilon_2)\, d\varepsilon_2 = C_1^{\,3} f(2) \int_0^{E_3} \varepsilon_2^{\,2}(E_3 - \varepsilon_2)^{1/2}\, d\varepsilon_2 = C_1^{\,3}\, E_3^{\,7/2}\, f(2) f(3),$$

where $E_3$ is the total energy of the three particles.

Suppose that $D_n$ has the form

$$D_n(E_n) = C_1^{\,n}\, f(2) f(3) \cdots f(n)\, E_n^{\,3(n-1)/2 + 1/2};$$

then $D_{n+1}$ can be described as

$$\begin{aligned}
D_{n+1}(E_{n+1}) &= C_1^{\,n+1}\, f(2) f(3) \cdots f(n) \int_0^{E_{n+1}} \varepsilon_n^{\,3(n-1)/2 + 1/2}(E_{n+1} - \varepsilon_n)^{1/2}\, d\varepsilon_n \\
&= C_1^{\,n+1}\, f(2) f(3) \cdots f(n) f(n+1)\, E_{n+1}^{\,3n/2 + 1/2}.
\end{aligned}$$

Hence the density of states for the N-particle system is given by

$$D_N(E) = C_1^{\,N}\, f(2) f(3) \cdots f(N)\, E^{\,3(N-1)/2 + 1/2},$$

where $E = E_N$.

The product of the factors $f(n)$ telescopes:

$$\begin{aligned}
f(2) f(3) \cdots f(N) &= \frac{\Gamma\!\left(\frac{3}{2}\right)\Gamma\!\left(\frac{3}{2}\right)}{\Gamma(3)} \cdot \frac{\Gamma\!\left(\frac{3}{2}\right)\Gamma(3)}{\Gamma\!\left(\frac{9}{2}\right)} \cdot \frac{\Gamma\!\left(\frac{3}{2}\right)\Gamma\!\left(\frac{9}{2}\right)}{\Gamma(6)} \cdots \frac{\Gamma\!\left(\frac{3}{2}\right)\Gamma\!\left(\frac{3(N-1)}{2}\right)}{\Gamma\!\left(\frac{3N}{2}\right)} \\
&= \frac{\left[\Gamma\!\left(\frac{3}{2}\right)\right]^N}{\Gamma\!\left(\frac{3N}{2}\right)}.
\end{aligned}$$

One can get

$$D_N(E) = C_1^{\,N}\, \frac{\left[\Gamma\!\left(\frac{3}{2}\right)\right]^N}{\Gamma\!\left(\frac{3N}{2}\right)}\, E^{\,3N/2 - 1} = \frac{1}{\Gamma\!\left(\frac{3N}{2}\right)}\left[\frac{V}{4\pi^2}\left(\frac{2m}{\hbar^2}\right)^{3/2}\right]^N \left(\frac{\sqrt{\pi}}{2}\right)^N E^{\,3N/2 - 1} = \frac{V^N}{\Gamma\!\left(\frac{3N}{2}\right)}\left(\frac{m}{2\pi\hbar^2}\right)^{3N/2} E^{\,3N/2 - 1},$$

where $C_1 = \frac{V}{4\pi^2}\left(\frac{2m}{\hbar^2}\right)^{3/2}$ and

$$\Gamma\!\left(\frac{3}{2}\right) = \frac{\sqrt{\pi}}{2}.$$


The number of states whose energy lies between E and E + ΔE is given by

$$W(E, \Delta E) = \frac{1}{N!} D_N(E)\, \Delta E = \frac{1}{N!\,\Gamma\!\left(\frac{3N}{2}\right)}\left[\frac{V}{4\pi^2}\left(\frac{2m}{\hbar^2}\right)^{3/2}\right]^N \left(\frac{\sqrt{\pi}}{2}\right)^N E^{\,3N/2 - 1}\, \Delta E,$$

where N! is included because the N particles are non-distinguishable.

This can be rewritten as

$$W(E, \Delta E) = \frac{V^N}{N!\,\Gamma\!\left(\frac{3N}{2}\right)}\left(\frac{m E}{2\pi\hbar^2}\right)^{3N/2} \frac{\Delta E}{E},$$

which is exactly the same as the expression derived from the classical approach.
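The two-particle convolution step in the derivation above can be checked numerically: with $C_1 = 1$, the integral $\int_0^{E} x^{1/2}(E-x)^{1/2}\,dx$ should equal $f(2)\,E^2 = (\pi/8)E^2$. A Python sketch using the midpoint rule, which handles the integrable square-root endpoints:

```python
import math

E = 2.0        # total two-particle energy (arbitrary)
M = 100_000    # number of midpoint-rule panels
h = E / M

# midpoint rule for the convolution of D1(e) = sqrt(e) with itself (C1 = 1)
integral = sum(math.sqrt((i + 0.5) * h) * math.sqrt(E - (i + 0.5) * h)
               for i in range(M)) * h

exact = (math.pi / 8.0) * E**2   # C1^2 * f(2) * E^2 with f(2) = pi/8
print(abs(integral / exact - 1) < 1e-4)  # True
```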

APPENDIX-II: Surface and volume of n-dimensional sphere

We use the formula of integration,

$$\int_{-\infty}^{\infty} \exp(-x^2)\, dx = \sqrt{\pi},$$

so that

$$\int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} \exp[-(x_1^2 + x_2^2 + \cdots + x_n^2)]\, dx_1 dx_2 \cdots dx_n = \pi^{n/2}. \tag{1}$$

Suppose that

$$dx_1 dx_2 \cdots dx_n = S_n\, r^{n-1}\, dr,$$

where $S_n$ is the surface area of the n-dimensional unit sphere (which will be determined later). Thus Eq.(1) can be rewritten as

$$S_n \int_0^{\infty} \exp(-r^2)\, r^{n-1}\, dr = \pi^{n/2}.$$

We now calculate the integral

$$I_n = \int_0^{\infty} \exp(-r^2)\, r^{n-1}\, dr.$$


We introduce a new variable u such that $r = u^{1/2}$. Since $dr = \frac{1}{2} u^{-1/2}\, du$, we get

$$I_n = \frac{1}{2}\int_0^{\infty} \exp(-u)\, u^{n/2 - 1}\, du = \frac{1}{2}\,\Gamma\!\left(\frac{n}{2}\right)$$

from the definition of the Gamma function,

$$\Gamma(s) = \int_0^{\infty} e^{-x} x^{s-1}\, dx.$$

Using this relation, we have

$$S_n = \frac{2\pi^{n/2}}{\Gamma\!\left(\frac{n}{2}\right)}.$$

The volume of the n-dimensional hyper-sphere with radius R is obtained as

$$V_n = \int_0^R S_n\, r^{n-1}\, dr = \frac{S_n}{n} R^n = \frac{2\pi^{n/2}}{n\,\Gamma\!\left(\frac{n}{2}\right)} R^n,$$

or

$$V_n = \frac{\pi^{n/2}}{\Gamma\!\left(\frac{n}{2} + 1\right)} R^n.$$

((Mathematica))


Clear["Global`*"];
S[n_] := 2 π^(n/2)/Gamma[n/2];
V[n_] := π^(n/2)/Gamma[n/2 + 1] R^n;
Prepend[Table[{n, S[n], V[n]}, {n, 1, 10}],
  {"n", "S[n]", "V[n]"}] // TableForm

n    S[n]          V[n]
1    2             2 R
2    2 π           π R^2
3    4 π           (4 π R^3)/3
4    2 π^2         (π^2 R^4)/2
5    (8 π^2)/3     (8 π^2 R^5)/15
6    π^3           (π^3 R^6)/6
7    (16 π^3)/15   (16 π^3 R^7)/105
8    π^4/3         (π^4 R^8)/24
9    (32 π^4)/105  (32 π^4 R^9)/945
10   π^5/12        (π^5 R^10)/120
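The closed form $V_n = \pi^{n/2} R^n / \Gamma(n/2 + 1)$ can be checked against the tabulated low-n entries (a Python sketch using math.gamma):

```python
import math

def V(n, R=1.0):
    """Volume of the n-dimensional hyper-sphere of radius R."""
    return math.pi**(n / 2) / math.gamma(n / 2 + 1) * R**n

print(abs(V(1) - 2.0) < 1e-12)                  # True: 2R
print(abs(V(2) - math.pi) < 1e-12)              # True: pi R^2
print(abs(V(3) - 4 * math.pi / 3) < 1e-12)      # True: (4/3) pi R^3
print(abs(V(5) - 8 * math.pi**2 / 15) < 1e-12)  # True: 8 pi^2 R^5/15
```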