Thermal and Statistical Mechanics
August 15, 2023
This post gives a gentle introduction to thermal and statistical mechanics, including mysterious concepts like entropy, Boltzmann and quantum statistics, and Bose-Einstein condensation.
Entropy
The journey of thermal and statistical mechanics starts with entropy, one of the most well-known and misunderstood concepts in physics. Starting from the basic definitions, we will see that it is actually very intuitive and can be explained with simple combinatorics.
Thermal and statistical mechanics studies complex systems of particles. Let's first imagine that our system consists of coin-like particles, which have states of either heads or tails. If you find it too ridiculous, just think about electrons with spin up and down. Suppose that our system has $N = 100$ such particles. The particles have identical properties, but if we virtually give each particle a label, there are $2^{100}$ possible states, which we call microstates. But since the particles are identical, many such states are indistinguishable. Actually, there are only 101 macrostates, each determined by the number of particles with heads, or equivalently by the number of particles with tails. Let's denote the number of particles with heads as $N_H$, and denote the number of microstates in a specific macrostate as $\Omega$, the multiplicity of the macrostate. In this case, $\Omega(N_H) = \binom{100}{N_H}$. The sum of the multiplicities should always equal the total number of microstates, in this case, $\sum_{N_H=0}^{100} \binom{100}{N_H} = 2^{100}$.
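To make this concrete, here is a minimal Python sketch of the 100-coin counting argument (the code and printed checks are my addition, not part of the original derivation):

```python
from math import comb

N = 100  # number of coin-like particles

# Multiplicity of the macrostate with n_heads heads: Omega(n_heads) = C(100, n_heads)
multiplicity = [comb(N, n_heads) for n_heads in range(N + 1)]

print(len(multiplicity))                                 # 101 macrostates
print(sum(multiplicity) == 2**N)                         # True: multiplicities sum to 2^100
print(max(range(N + 1), key=lambda n: multiplicity[n]))  # 50: the macrostate with the most microstates
```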
Now here comes the fundamental assumption of statistical mechanics:
In an isolated system in thermal equilibrium, all accessible microstates are equally probable.
In classical mechanics, physical interactions are deterministic, and we have a clockwork world. Quantum mechanics subverts this worldview and theorizes that many properties of particles are uncertain before measurement. Statistical mechanics forgoes determining the interactions of individual particles, because the systems of interest are too complex. Even though the fundamental assumption is probabilistic in nature, it is different from the probability wave in quantum mechanics (which historically came after statistical mechanics).
The implication of this simple and intuitive assumption is profound. Since each microstate is equally probable, the macrostate with the greatest multiplicity is the most likely to occur, or "the multiplicity tends to increase". Now we introduce the definition of entropy $S$,
$$S \equiv k_B \ln \Omega,$$
where $k_B$ is the Boltzmann constant.
The "law of the increase of multiplicity" now becomes the "law of the increase of entropy", one version of the well-known second law of thermodynamics, which is actually just a strong statement about probabilities.
The second law of thermodynamics: Any large system in equilibrium will be found in the state with the largest multiplicity (largest entropy).
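To get a feel for how strong this probabilistic statement already is for just 100 coins, we can compute what fraction of all microstates sits near the most likely macrostate (again, the code is illustrative, not from the original post):

```python
from math import comb

N = 100
total = 2**N

# Fraction of all microstates whose macrostate has between 45 and 55 heads
near_peak = sum(comb(N, n) for n in range(45, 56)) / total
print(f"{near_peak:.3f}")  # ~0.729: most microstates cluster around N_H = 50
```

With a macroscopic number of particles (say $10^{23}$), this concentration becomes so extreme that deviations from maximum entropy are, in practice, never observed.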
From here, we will consider different types of interactions between systems and derive familiar concepts like temperature and pressure.
Temperature
The concept of temperature arises when we consider thermal interactions, or the flow of heat between objects. We observe that heat flows from objects with high temperature to objects with low temperature, and we will introduce a statistical mechanics definition in accordance with this behavior.
Suppose we have two systems $A$ and $B$, with entropies $S_A$ and $S_B$ and energies $U_A$ and $U_B$. Let the total entropy be $S = S_A + S_B$. We will assume that only thermal interactions occur; specifically, the total number of particles in each system and the volumes of the systems are constant. According to the second law, at thermal equilibrium, the multiplicity or entropy is maximized, or
$$\frac{\partial S}{\partial U_A} = \frac{\partial S_A}{\partial U_A} + \frac{\partial S_B}{\partial U_A} = 0.$$
Finally, since energy is conserved, $U_A + U_B$ is constant, and $dU_B = -dU_A$, so
$$\frac{\partial S_A}{\partial U_A} = \frac{\partial S_B}{\partial U_B}.$$
This suggests that temperature should somehow relate to $\partial S / \partial U$. In fact, the Boltzmann constant has the unit $\mathrm{J/K}$, so $\partial U / \partial S$ has the unit $\mathrm{K}$. This suggests the definition of temperature as
$$\frac{1}{T} \equiv \left(\frac{\partial S}{\partial U}\right)_{N, V},$$
where the subscripts indicate that the number of particles and the volume are held constant. In other words, heat tends to transfer to objects with larger $\partial S / \partial U$, or lower temperature.
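We can see this definition at work numerically. The sketch below uses an Einstein solid ($N$ oscillators sharing $q$ energy quanta of size $\epsilon$, with $\Omega(N, q) = \binom{q + N - 1}{q}$) as the system; this toy model and the numbers are my assumptions, not something introduced above:

```python
from math import comb, log

k_B = 1.381e-23  # J/K
eps = 1.0e-21    # J, size of one energy quantum (illustrative value)
N = 50           # oscillators in an Einstein solid (toy model, assumed here)

def entropy(q):
    """S = k_B ln Omega, with Omega(N, q) = C(q + N - 1, q)."""
    return k_B * log(comb(q + N - 1, q))

def temperature(q):
    """T = (dS/dU)^(-1), estimated with a centered finite difference (dU = eps per quantum)."""
    dS_per_quantum = (entropy(q + 1) - entropy(q - 1)) / 2
    return eps / dS_per_quantum

for q in (10, 50, 200):
    print(q, temperature(q))  # temperature rises as the energy U = q * eps grows
```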
Pressure
The concept of pressure arises when we consider mechanical interactions, or the change of volumes of two systems. We would expect systems with high pressure to expand, or "receive volume" from systems with lower pressure. Suppose again we have two systems $A$ and $B$, with entropies $S_A$ and $S_B$ and volumes $V_A$ and $V_B$. Similar to thermal interactions, we have that at equilibrium
$$\frac{\partial S_A}{\partial V_A} = \frac{\partial S_B}{\partial V_B}.$$
We can again play with the dimensions of different quantities to realize that a reasonable combination producing pressure is
$$P \equiv T \left(\frac{\partial S}{\partial V}\right)_{U, N}.$$
We can actually show that this is consistent with the usual definition of pressure.
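As a quick symbolic sanity check (assuming sympy is available, and taking the Sackur-Tetrode entropy of a monatomic ideal gas as given, since this post does not derive it), the definition above reproduces the ideal gas law:

```python
import sympy as sp

N, k, T, V, U, m, h = sp.symbols('N k T V U m h', positive=True)

# Sackur-Tetrode entropy of a monatomic ideal gas (standard result, assumed here)
S = N * k * (sp.log(V / N * (4 * sp.pi * m * U / (3 * N * h**2)) ** sp.Rational(3, 2))
             + sp.Rational(5, 2))

P = T * sp.diff(S, V)  # P = T * (dS/dV) at fixed U, N
print(sp.simplify(P))  # N*k*T/V, i.e. the ideal gas law P*V = N*k*T
```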
Chemical Potential
Chemical potential is a less well-known concept and arises when we consider diffusion, or the movement of particles between systems. We define the chemical potential as
$$\mu \equiv -T \left(\frac{\partial S}{\partial N}\right)_{U, V},$$
such that particles will flow from systems with larger chemical potential to systems with smaller chemical potential.
The name chemical potential can be somewhat confusing, since $\mu$ has the unit of energy.
The Thermodynamic Identity
Now that we've introduced all the quantities relevant to entropy, we have the following identity:
$$dS = \left(\frac{\partial S}{\partial U}\right) dU + \left(\frac{\partial S}{\partial V}\right) dV + \left(\frac{\partial S}{\partial N}\right) dN = \frac{1}{T}\, dU + \frac{P}{T}\, dV - \frac{\mu}{T}\, dN,$$
and we can rearrange to derive the thermodynamic identity:
$$dU = T\, dS - P\, dV + \mu\, dN,$$
which summarizes all the physics we've introduced so far.
Boltzmann Statistics
We started with entropy and connected it with familiar physical properties. We will now use entropy, or multiplicity, to derive the probabilities of microstates, and then connect these probabilities with the usual physical properties.
For now, let's consider a system interacting with a reservoir. We assume that the change in temperature and volume of the reservoir is negligible, and the system does not exchange particles with the reservoir. We use $\Omega_R(s)$ to denote the multiplicity of the reservoir when the system is at state $s$. Then we have that for two states $s_1, s_2$,
$$\frac{P(s_1)}{P(s_2)} = \frac{\Omega_R(s_1)}{\Omega_R(s_2)} = e^{[S_R(s_1) - S_R(s_2)]/k_B}.$$
Now we will use the thermodynamic identity to connect the entropy of the reservoir to the energy of the system. Since the reservoir exchanges no particles with the system and its volume change is negligible,
$$dS_R = \frac{1}{T}\left(dU_R + P\, dV_R - \mu\, dN_R\right) = \frac{dU_R}{T},$$
so we have
$$S_R(s_1) - S_R(s_2) = \frac{1}{T}\left[U_R(s_1) - U_R(s_2)\right] = -\frac{1}{T}\left[E(s_1) - E(s_2)\right],$$
where $E(s)$ is the energy of the system at state $s$, and the last equality is due to energy conservation. Thus, we have that
$$\frac{P(s_1)}{P(s_2)} = \frac{e^{-E(s_1)/k_B T}}{e^{-E(s_2)/k_B T}}.$$
We finally derived the Boltzmann statistics:
$$P(s) = \frac{1}{Z} e^{-E(s)/k_B T},$$
where $Z = \sum_s e^{-E(s)/k_B T}$ is the normalizing factor known as the **partition function** and $e^{-E(s)/k_B T}$ is known as the **Boltzmann factor**.
We can also find the average values of properties of the system, like the expected energy:
$$\bar{E} = \sum_s E(s)\, P(s) = \frac{1}{Z} \sum_s E(s)\, e^{-E(s)/k_B T}.$$
Define $\beta \equiv 1/k_B T$; then we have that
$$\frac{\partial Z}{\partial \beta} = -\sum_s E(s)\, e^{-\beta E(s)},$$
so we have
$$\bar{E} = -\frac{1}{Z} \frac{\partial Z}{\partial \beta} = -\frac{\partial \ln Z}{\partial \beta}.$$
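Here is a minimal numerical check of this last identity, assuming an arbitrary three-level system (the energy values are illustrative, not from the original post):

```python
import numpy as np

E = np.array([0.0, 1.0, 2.5])  # energies of three arbitrary states, in units where k_B = 1

def avg_energy(beta):
    w = np.exp(-beta * E)      # Boltzmann factors
    Z = w.sum()                # partition function
    return (w / Z * E).sum()   # sum_s E(s) P(s)

def ln_Z(beta):
    return np.log(np.exp(-beta * E).sum())

beta = 1.0 / 0.8               # T = 0.8 in these units
d = 1e-6

print(avg_energy(beta))                              # direct average
print(-(ln_Z(beta + d) - ln_Z(beta - d)) / (2 * d))  # -d(ln Z)/d(beta): agrees numerically
```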
Quantum Statistics
For the Boltzmann statistics, we assumed that the system does not exchange particles with the reservoir. If we instead allow exchange of particles with the reservoir, we can use the thermodynamic identity to derive that
$$\frac{P(s_1)}{P(s_2)} = \frac{e^{-[E(s_1) - \mu N(s_1)]/k_B T}}{e^{-[E(s_2) - \mu N(s_2)]/k_B T}}.$$
This leads to the quantum statistics,
$$P(s) = \frac{1}{\mathcal{Z}} e^{-[E(s) - \mu N(s)]/k_B T},$$
where $\mathcal{Z} = \sum_s e^{-[E(s) - \mu N(s)]/k_B T}$ is the normalizing factor known as the **grand partition function** or **Gibbs sum**, and $e^{-[E(s) - \mu N(s)]/k_B T}$ is known as the **Gibbs factor**.
Now here is the interesting part: from quantum mechanics (quantum field theory), we know that for fermions (e.g., electrons), a state can be occupied by at most one particle, while for bosons there is no such restriction. We will now discuss these two scenarios.
Fermi-Dirac Distribution
For fermions, the occupancy $n$ of a single-particle state is either $0$ or $1$, so
$$\mathcal{Z} = 1 + e^{-(\epsilon - \mu)/k_B T},$$
where $\epsilon$ is the energy of a single particle. Thus, the average number of particles in the system is
$$\bar{n}_{\mathrm{FD}} = \frac{0 \cdot 1 + 1 \cdot e^{-(\epsilon - \mu)/k_B T}}{\mathcal{Z}} = \frac{1}{e^{(\epsilon - \mu)/k_B T} + 1}.$$
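A quick sketch of this occupancy function (the numbers are arbitrary illustrative values):

```python
import numpy as np

def n_FD(eps, mu, kT):
    """Average occupancy of a fermion state: 1 / (exp((eps - mu)/kT) + 1)."""
    return 1.0 / (np.exp((eps - mu) / kT) + 1.0)

# Occupancy always lies between 0 and 1, and crosses 1/2 exactly at eps = mu
print(n_FD(eps=0.9, mu=1.0, kT=0.1))  # ~0.73: below mu, mostly occupied
print(n_FD(eps=1.0, mu=1.0, kT=0.1))  # 0.5
print(n_FD(eps=1.1, mu=1.0, kT=0.1))  # ~0.27: above mu, mostly empty
```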
Bose-Einstein Distribution
For bosons, $n$ can take arbitrary non-negative integer values, so
$$\mathcal{Z} = \sum_{n=0}^{\infty} e^{-n(\epsilon - \mu)/k_B T} = \frac{1}{1 - e^{-(\epsilon - \mu)/k_B T}}.$$
And with some algebra, we can derive that
$$\bar{n}_{\mathrm{BE}} = \frac{1}{e^{(\epsilon - \mu)/k_B T} - 1}.$$
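Unlike the Fermi-Dirac case, this occupancy is unbounded; a small sketch (with arbitrary illustrative values) shows it diverging as $\epsilon$ approaches $\mu$ from above:

```python
import numpy as np

def n_BE(eps, mu, kT):
    """Average occupancy of a boson state: 1 / (exp((eps - mu)/kT) - 1), valid for eps > mu."""
    return 1.0 / (np.exp((eps - mu) / kT) - 1.0)

# Bosons pile into low-energy states as eps -> mu
for eps in (1.5, 1.1, 1.01, 1.001):
    print(eps, n_BE(eps, mu=1.0, kT=0.1))
```

This pile-up of bosons into the lowest-energy states is the seed of the Bose-Einstein condensation mentioned at the start of the post; and when $\epsilon - \mu \gg k_B T$, both quantum distributions reduce to the Boltzmann factor $e^{-(\epsilon - \mu)/k_B T}$.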
Blackbody Radiation
Now we have enough tools to explain blackbody radiation, which was one of the dark clouds over classical physics. In classical physics, we treat electromagnetic radiation as a continuous field that permeates all space, and we would expect a black box to contain infinite thermal energy, a problem known as the "ultraviolet catastrophe". To solve this problem, Planck postulated that light has to come in quanta with energy unit $hf$, where $h$ is the Planck constant and $f$ is the frequency of the electromagnetic wave. He derived the Planck distribution
$$\bar{n} = \frac{1}{e^{hf/k_B T} - 1}.$$
If you realize that $\epsilon = hf$ and let $\mu = 0$ for photons, this is exactly the Bose-Einstein distribution! This is in fact right: photons have zero chemical potential, which means that they can be emitted or absorbed in any quantity, and the number of photons is not conserved.
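To see how this resolves the ultraviolet catastrophe, here is a sketch comparing the classical and Planck spectral energy densities. It assumes the standard mode-counting result $u(f) = (8\pi h f^3 / c^3)\,\bar{n}$, which this post does not derive, and uses SI constants:

```python
import numpy as np

h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23  # SI units
T = 300.0                                  # temperature in K

def u_planck(f):
    """Planck spectral energy density (J s / m^3)."""
    return (8 * np.pi * h * f**3 / c**3) / np.expm1(h * f / (k_B * T))

def u_rayleigh_jeans(f):
    """Classical prediction: grows as f^2 without bound -- the ultraviolet catastrophe."""
    return 8 * np.pi * f**2 * k_B * T / c**3

# The two agree at low frequency, but Planck's curve is exponentially suppressed at high f
for f in (1e12, 1e13, 1e14):
    print(f"{f:.0e}  RJ: {u_rayleigh_jeans(f):.2e}  Planck: {u_planck(f):.2e}")

# Total energy density is now finite (~6.1e-6 J/m^3 at 300 K), unlike the classical result
f = np.linspace(1e10, 2e15, 200_000)
print((u_planck(f) * (f[1] - f[0])).sum())
```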