
Thermal and Statistical Mechanics

August 15, 2023

This post gives a gentle introduction to thermal and statistical mechanics, including mysterious concepts like entropy, Boltzmann and quantum statistics, and Bose-Einstein condensation.

Entropy

The journey of thermal and statistical mechanics starts with entropy, one of the most well-known and most misunderstood concepts in physics. Starting from the basic definitions, we will see that it is actually very intuitive and can be explained with simple combinatorics.

Thermal and statistical mechanics studies complex systems of particles. Let's first imagine that our system consists of coin-like particles, each of which is in one of two states, heads or tails. If you find that too ridiculous, just think of electrons with spin up and spin down. Suppose that our system has $N$ such particles. The particles have identical properties, but if we virtually give each particle a label, there are $2^N$ possible states, which we call microstates. Since the particles are identical, however, many such states are indistinguishable. In fact, there are only $N+1$ macrostates, each determined by the number of particles with heads, or equivalently by the number of particles with tails. Let's denote the number of particles with heads as $n$, and denote the number of microstates in a specific macrostate as $\Omega(N, n)$, the multiplicity of the macrostate. In this case, $\Omega(N, n) = {N \choose n}$. The sum of the multiplicities should always equal the total number of microstates; in this case, $\sum_{n=0}^N \Omega(N, n) = \sum_{n=0}^N {N \choose n} = 2^N$.
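As a quick sanity check of this counting, here is a small Python sketch (the choice $N = 4$ is just for illustration):

```python
from math import comb

N = 4  # a small system of N coin-like particles

# Each labeled particle is heads or tails, so there are 2^N microstates.
# A macrostate is the number n of heads; its multiplicity is C(N, n).
for n in range(N + 1):
    print(f"n = {n} heads: multiplicity = {comb(N, n)}")

# The multiplicities summed over all macrostates recover 2^N.
assert sum(comb(N, n) for n in range(N + 1)) == 2**N
```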

Now here comes the fundamental assumption of statistical mechanics:

In an isolated system in thermal equilibrium, all accessible microstates are equally probable.

In classical mechanics, physical interactions are deterministic, and we have a clockwork world. Quantum mechanics subverts this worldview and theorizes that many properties of particles are uncertain before measurement. Statistical mechanics forgoes determining the interactions of individual particles, because the systems of interest are too complex. Even though the fundamental assumption is probabilistic in nature, it is different from the probability wave of quantum mechanics (which historically came after statistical mechanics).

The implication of this simple and intuitive assumption is profound. Since each microstate is equally probable, the macrostate with the greatest multiplicity is the most likely to occur, or "the multiplicity tends to increase". Now we introduce the definition of entropy $S$,

$$S = k \log \Omega$$

where $k$ is the Boltzmann constant.

The "law of the increase of multiplicity" now becomes the "law of the increase of entropy", one version of the well-known second law of thermodynamics, which is actually just a strong statement about probabilities.

The second law of thermodynamics: Any large system in equilibrium will be found in the state with the largest multiplicity (largest entropy).
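To see why this probabilistic statement is so strong, here is a rough numerical sketch (assuming the coin system above, with a large $N$ chosen purely for illustration): even a slightly off-center macrostate is astronomically less likely than the $n = N/2$ peak.

```python
from math import lgamma, exp

def log_omega(N, n):
    # log of the multiplicity C(N, n), via log-gamma to avoid overflow
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

N = 10_000
peak = log_omega(N, N // 2)
for n in (N // 2, int(0.51 * N), int(0.55 * N)):
    # multiplicity relative to the n = N/2 macrostate
    print(f"n = {n}: Omega/Omega_peak = {exp(log_omega(N, n) - peak):.3e}")
```

For $N$ on the order of Avogadro's number, even tiny fractional deviations are effectively impossible, which is why the second law looks deterministic in practice.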

From here, we will consider different types of interactions between systems and derive familiar concepts like temperature and pressure.

Temperature

The concept of temperature arises when we consider thermal interactions, or the flow of heat between objects. We find that heat flows from objects with high temperature to objects with low temperature, and we will introduce a statistical-mechanical definition in accordance with this behavior.

Suppose we have two systems $A$ and $B$, with entropies $S_A$ and $S_B$ and energies $U_A$ and $U_B$. Let the total entropy be $S = S_A + S_B$. We will assume that only thermal interactions occur; specifically, the total number of particles in each system and the volumes of the systems are constant. According to the second law, at thermal equilibrium the multiplicity, or entropy, is maximized, so

$$\frac{\partial S}{\partial U_A} = \frac{\partial(S_A + S_B)}{\partial U_A} = 0 \implies \frac{\partial S_A}{\partial U_A} = -\frac{\partial S_B}{\partial U_A}$$

Finally, since energy is conserved, $U = U_A + U_B$ is constant, and $dU_A = -dU_B$, so

$$\frac{\partial S_A}{\partial U_A} = \frac{\partial S_B}{\partial U_B}$$

This suggests that temperature should somehow relate to $\frac{\partial S}{\partial U}$. In fact, the Boltzmann constant has units of $J/K$, so $\frac{\partial S}{\partial U}$ has units of $1/K$. This suggests the definition of temperature $T$ as

$$\frac{1}{T} = \Big(\frac{\partial S}{\partial U}\Big)_{N, V}$$

where the subscripts $N, V$ indicate that the number of particles and the volume are held constant. In other words, heat tends to flow to objects with larger $\frac{\partial S}{\partial U}$, that is, with lower temperature.
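To make this definition concrete, here is a sketch that estimates $T$ numerically from $S(U)$ for a standard toy model, the Einstein solid, whose multiplicity for $N$ oscillators holding $q$ energy quanta is $\Omega(N, q) = {q + N - 1 \choose q}$ (the model and the units $\epsilon = k = 1$ are assumptions for illustration, not something derived in this post):

```python
from math import lgamma

def entropy(N, q):
    # Einstein solid: Omega(N, q) = C(q + N - 1, q); entropy in units of k
    return lgamma(q + N) - lgamma(q + 1) - lgamma(N)

N = 100  # number of oscillators
for q in (10, 100, 1000):
    # 1/T = dS/dU; with U = q * epsilon, estimate it by a centered difference
    dS_dU = (entropy(N, q + 1) - entropy(N, q - 1)) / 2
    print(f"q = {q}: T ~ {1 / dS_dU:.2f} (in units of epsilon/k)")
```

As expected, pumping more energy into the solid raises its temperature.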

Pressure

The concept of pressure arises when we consider mechanical interactions, or changes in the volumes of two systems. We would expect systems with high pressure to expand into, or "receive volume" from, systems with lower pressure. Suppose again we have two systems $A$ and $B$, with entropies $S_A$ and $S_B$ and volumes $V_A$ and $V_B$. Similarly to thermal interactions, we have that at equilibrium

$$\frac{\partial S_A}{\partial V_A} = \frac{\partial S_B}{\partial V_B}$$

We can again play with the dimensions of different quantities to realize that a reasonable combination producing pressure is

$$P = T \Big(\frac{\partial S}{\partial V}\Big)_{U, N}$$

We can actually show that this is consistent with the usual definition of pressure, as the example below illustrates.
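For instance, for a monatomic ideal gas the Sackur-Tetrode equation (quoted here without derivation) gives

$$S = Nk\left[\ln\left(\frac{V}{N}\Big(\frac{4\pi m U}{3Nh^2}\Big)^{3/2}\right) + \frac{5}{2}\right]$$

The only dependence on $V$ is through the $Nk\ln V$ term, so

$$P = T\Big(\frac{\partial S}{\partial V}\Big)_{U,N} = \frac{NkT}{V}$$

which is exactly the ideal gas law $PV = NkT$.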

Chemical Potential

Chemical potential is a less well-known concept; it arises when we consider diffusion, or the movement of particles between systems. We define the chemical potential $\mu$ such that particles flow from systems with larger chemical potential to systems with smaller chemical potential:

$$\mu = -T \Big(\frac{\partial S}{\partial N}\Big)_{U, V}$$

The name chemical potential can be somewhat confusing, since $\mu$ has units of energy.
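As a concrete example (a standard result, stated here without derivation), the chemical potential of a monatomic ideal gas is

$$\mu = -kT \ln\left[\frac{V}{N}\Big(\frac{2\pi m kT}{h^2}\Big)^{3/2}\right]$$

A denser gas (smaller $V/N$) has a larger $\mu$, so particles diffuse from dense regions to dilute ones, exactly as we would expect.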

The Thermodynamic Identity

Now that we've introduced all the quantities relevant to entropy, we have the following identity:

$$dS = \frac{1}{T}dU + \frac{P}{T}dV - \frac{\mu}{T}dN$$

and we can rearrange it to derive the thermodynamic identity:

$$dU = TdS - PdV + \mu dN$$

which summarizes all the physics we've introduced so far.
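As a quick check, holding two of the three variables fixed in the identity lets us read off each quantity directly:

$$T = \Big(\frac{\partial U}{\partial S}\Big)_{V,N}, \qquad P = -\Big(\frac{\partial U}{\partial V}\Big)_{S,N}, \qquad \mu = \Big(\frac{\partial U}{\partial N}\Big)_{S,V}$$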

Boltzmann Statistics

We started with entropy and connected it with familiar physical properties. We will now use entropy, or multiplicity, to derive the probabilities of microstates, and then connect these probabilities with the usual physical properties.

For now, let's consider a system interacting with a reservoir. We assume that the changes in the temperature and volume of the reservoir are negligible, and that the system does not exchange particles with the reservoir. We use $\Omega_r(s_i)$ to denote the multiplicity of the reservoir when the system is in state $s_i$. Then we have that for two states $s_i, s_j$,

$$\frac{P(s_i)}{P(s_j)} = \frac{\Omega_r(s_i)}{\Omega_r(s_j)} = \frac{e^{S_r(s_i)/k}}{e^{S_r(s_j)/k}} = e^{[S_r(s_i) - S_r(s_j)] / k}$$

Now we will use the thermodynamic identity to connect the entropy of the reservoir to the energy of the system. Since the reservoir's volume and particle number are fixed, $dV_r = dN_r = 0$, and

$$dS_r = \frac{1}{T}\big(dU_r + P\,\cancel{dV_r} - \mu\,\cancel{dN_r}\big) = \frac{1}{T}dU_r$$

so we have

$$S_r(s_i) - S_r(s_j) = \frac{1}{T}\big(U_r(s_i)-U_r(s_j)\big) = -\frac{1}{T}\big(E(s_i)-E(s_j)\big)$$

where $E(s_i)$ is the energy of the system in state $s_i$, and the last equality is due to energy conservation. Thus, we have that

$$\frac{P(s_i)}{P(s_j)} = e^{-[E(s_i)-E(s_j)]/kT}$$

We have finally derived Boltzmann statistics:

$$P(s) = \frac{1}{Z} e^{-E(s)/kT}$$

where $Z = \sum_s e^{-E(s)/kT}$ is the normalizing factor known as the **partition function** and $e^{-E(s)/kT}$ is known as the **Boltzmann factor**.

We can also find average values for the system, like the expected energy:

$$\overline{E} = \frac{1}{Z} \sum_s E(s)\, e^{-E(s)/kT}$$

Define $\beta \equiv \frac{1}{kT}$; then we have that

$$\frac{\partial Z}{\partial \beta} = \sum_s - E(s)\, e^{-\beta E(s)} = -Z \cdot \overline{E}$$

so we have

$$\overline{E} = - \frac{1}{Z} \frac{\partial Z}{\partial \beta} = - \frac{\partial(\log Z)}{\partial \beta}$$
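Here is a minimal numerical check of this identity, sketched with NumPy for a hypothetical two-level system (the energies $0$ and $\epsilon = 1$ and the units $k = 1$ are assumptions for illustration):

```python
import numpy as np

E = np.array([0.0, 1.0])  # energies of the two states, in units k = 1
beta = 2.0                # inverse temperature 1/kT

def log_Z(beta):
    # log of the partition function Z = sum_s exp(-beta * E(s))
    return np.log(np.sum(np.exp(-beta * E)))

# Direct Boltzmann average of the energy
p = np.exp(-beta * E - log_Z(beta))
E_direct = np.sum(p * E)

# The same average via E = -d(log Z)/d(beta), by centered finite difference
h = 1e-6
E_from_Z = -(log_Z(beta + h) - log_Z(beta - h)) / (2 * h)

print(E_direct, E_from_Z)  # the two values should agree closely
```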

Quantum Statistics

In deriving Boltzmann statistics, we assumed that the system does not exchange particles with the reservoir. If we instead allow the exchange of particles with the reservoir, we can use the thermodynamic identity to derive that

$$S_r(s_i) - S_r(s_j) = -\frac{1}{T}\big(E(s_i)-E(s_j)-\mu N(s_i) + \mu N(s_j)\big)$$

This leads to quantum statistics,

$$P(s) = \frac{1}{Z} e^{-[E(s)-\mu N(s)]/kT}$$

where $Z = \sum_s e^{-[E(s)-\mu N(s)]/kT}$ is the normalizing factor known as the **grand partition function** or **Gibbs sum**, and $e^{-[E(s)-\mu N(s)]/kT}$ is known as the **Gibbs factor**.

Now here is the interesting part: from quantum mechanics (quantum field theory), we know that for fermions (e.g., electrons), a state can be occupied by at most one particle, while for bosons there is no such restriction. We will now discuss these two scenarios.

Fermi-Dirac Distribution

For fermions, $N(s)$ is either $0$ or $1$, so

$$P(n) = \frac{1}{Z} e^{-n(\epsilon - \mu)/kT}, \qquad Z = 1 + e^{-(\epsilon - \mu)/kT}$$

where $\epsilon$ is the energy of a single particle. Thus, the average number of particles in the system is

$$\overline{n}_{FD} = 0\cdot P(0) + 1 \cdot P(1) = \frac{e^{-(\epsilon - \mu)/kT}}{Z} = \frac{1}{e^{(\epsilon - \mu)/kT}+1}$$

Bose-Einstein Distribution

For bosons, $N(s)$ can take any nonnegative integer value, so

$$P(n) = \frac{1}{Z} e^{-n(\epsilon - \mu)/kT}, \qquad Z = \sum_{n=0}^{\infty} e^{-n(\epsilon - \mu)/kT} = \frac{1}{1- e^{-(\epsilon - \mu)/kT}}$$

And with some algebra (the sum is a geometric series in $e^{-(\epsilon - \mu)/kT}$ and its derivative), we can derive that

$$\overline{n}_{BE} = \sum_{n=0}^\infty n \cdot P(n) = \frac{1}{e^{(\epsilon - \mu)/kT} - 1}$$
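The two occupancies are easy to compare side by side; here is a small sketch, with $x = (\epsilon - \mu)/kT$ as the only parameter:

```python
import numpy as np

def n_fd(x):
    # Fermi-Dirac occupancy, x = (eps - mu)/kT
    return 1.0 / (np.exp(x) + 1.0)

def n_be(x):
    # Bose-Einstein occupancy, x = (eps - mu)/kT (requires x > 0)
    return 1.0 / (np.exp(x) - 1.0)

for x in (0.1, 1.0, 3.0):
    print(f"x = {x}: n_FD = {n_fd(x):.3f}, n_BE = {n_be(x):.3f}")
```

For $x \gg 1$ both occupancies approach the classical Boltzmann limit $e^{-x}$, while for $x \to 0^+$ the Bose-Einstein occupancy diverges, a first hint of Bose-Einstein condensation.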

Blackbody Radiation

Now we have enough tools to explain blackbody radiation, which was one of the dark clouds over classical physics. In classical physics, we treat electromagnetic radiation as a continuous field that permeates all space, and we would expect a blackbody cavity to hold an infinite amount of thermal energy, a problem known as the "ultraviolet catastrophe". To solve this problem, Planck posited that electromagnetic energy has to come in quanta of energy $hf$, where $h$ is the Planck constant and $f$ is the frequency of the electromagnetic wave. He derived the Planck distribution

$$\overline{n}_{Pl} = \frac{1}{e^{hf/kT} - 1}$$

If you realize that $\epsilon = hf$ and let $\mu = 0$ for photons, this is exactly the Bose-Einstein distribution! This is in fact correct: photons have zero chemical potential, which means that they can be emitted or absorbed in any quantity; the number of photons is not conserved.
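As a final numerical aside: summing the Planck distribution over all modes reduces the total energy density of blackbody radiation to the dimensionless integral $\int_0^\infty x^3/(e^x - 1)\,dx$ with $x = hf/kT$, whose value $\pi^4/15$ is what produces the $T^4$ of the Stefan-Boltzmann law. A quick check (using SciPy for the quadrature):

```python
import numpy as np
from scipy.integrate import quad

# Integrand x^3 / (e^x - 1); expm1 keeps the small-x behavior accurate
I, _ = quad(lambda x: x**3 / np.expm1(x) if x > 0 else 0.0, 0, np.inf)
print(I, np.pi**4 / 15)  # both should be about 6.4939
```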


