

From Statistics to Thermal Properties

Prerequisites

Before approaching this topic, students should:

  • Know the first and second thermodynamic laws.
  • Be familiar with the fairness function and how to manipulate it.
  • Have the ability to perform derivatives and manipulate mixed partials.

In-class Content

Lecture: Weighted Averages

FIXME

Lecture notes from Dr. Roundy's 2014 course website:

Most thermodynamic quantities can be expressed as weighted averages over all possible eigenstates (or microstates). For instance, the internal energy is given by: $$U = \sum_i P_i E_i$$ Note that this will probably not be an eigenvalue of the energy, but that's okay. The energy eigenvalues are so close together for the total energy of a macroscopic object that we couldn't distinguish them anyhow. Any thermodynamic quantity that is defined for a microstate will be computed using precisely this sort of average: this also covers magnetization and pressure, for instance, or the intensity of electromagnetic radiation at a given frequency.
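The weighted average above can be illustrated with a short numerical sketch. The energies and probabilities here are made-up values for a two-state system, chosen only to show the mechanics of the sum:

```python
import numpy as np

# Hypothetical two-state system (illustrative numbers, not from the lecture)
E = np.array([0.0, 1.0])   # energy eigenvalues E_i
P = np.array([0.7, 0.3])   # probabilities P_i, which must sum to 1

# Internal energy as a weighted average over microstates: U = sum_i P_i E_i
U = np.sum(P * E)
```

The same pattern computes any microstate-defined quantity: replace `E` with the magnetization or pressure of each microstate.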

Lecture: Fairness and Entropy

  • This lecture should give a brief description of how the Fairness Function relates to entropy
  • This lecture may be moved around if students ask about it early on in the discussion of Fairness
Lecture notes from Dr. Roundy's 2014 course website:

Given two uncorrelated systems, $A$ and $B$, we can show that the fairness of the combined system is equal to the sum of the fairnesses of the two separate systems. This means that $\mathcal{F}$ is extensive. $$\mathcal{F}_A = -k \sum_i P_i \ln P_i$$ $$\mathcal{F}_B = -k \sum_j P_j \ln P_j$$ $$\mathcal{F}_{AB} = -k \sum_{ij} P_{ij} \ln\left( P_{ij} \right)$$ $$= -k \sum_{ij} P_iP_j \ln\left( P_iP_j \right)$$ $$= -k \sum_{ij} P_i P_j \left(\ln P_i + \ln P_j\right) $$ $$= -k \left(\sum_{ij} P_i P_j \ln P_i\right) -k \left(\sum_{ij} P_i P_j \ln P_j\right)$$ $$= -k \left(\sum_i P_i \ln P_i\right)\left(\sum_j P_j \right) -k \left(\sum_j P_j \ln P_j\right)\left(\sum_i P_i\right)$$ $$= -k \left(\sum_i P_i \ln P_i\right) -k \left(\sum_j P_j \ln P_j\right)$$ $$= \mathcal{F}_A + \mathcal{F}_B$$
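The additivity argument can be checked numerically. The sketch below uses made-up probability distributions for the two systems and $k=1$ in reduced units; for uncorrelated systems the joint distribution is the outer product $P_{ij} = P_i P_j$:

```python
import numpy as np

k = 1.0  # Boltzmann's constant in reduced units (assumption for illustration)

def fairness(P):
    """Fairness F = -k sum_i P_i ln P_i for a normalized distribution P."""
    return -k * np.sum(P * np.log(P))

# Two uncorrelated systems with illustrative probability distributions
PA = np.array([0.5, 0.3, 0.2])
PB = np.array([0.6, 0.4])

# Joint distribution of the combined system: P_ij = P_i * P_j
PAB = np.outer(PA, PB)

# Additivity: F_AB = F_A + F_B
assert np.isclose(fairness(PAB.ravel()), fairness(PA) + fairness(PB))
```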

Lecture: Relating Internal Energy and Fairness (15 minutes)

  • The appropriate time for this lecture may vary. If a student questions the relationship, then it may be wise to skip to the answer.
  • Students will often be confused by the relationship
  • Different students have different methods of understanding the relation, so it may be good to describe it in several ways
Lecture notes from Dr. Roundy's 2014 course website:

$\newcommand\myderiv[3]{\left(\frac{\partial #1}{\partial #2}\right)_{#3}}$ Let's talk a bit about fairness. We used the fairness to find the probabilities of being in the various eigenstates, by assuming that the “fairest” distribution would prevail. If you bring two separate systems together and allow them to equilibrate, then you would expect that the net fairness would either remain the same or would increase. This sounds a little like entropy in the second law, in that the net entropy of system plus surroundings can increase or stay the same, but cannot decrease. The maximum value of the fairness for a given system (which is the value it will have in equilibrium) is its entropy.

Solving for maximum fairness

Let's look at the maximum value of the fairness (a.k.a. entropy), which is $$\mathcal{F} = -k \sum_i P_i\ln P_i$$ $$U = \sum_i P_i E_i $$ $$P_i = \frac{e^{-\beta E_i}}{Z}$$ $$\mathcal{F}_\text{max} = -k_B\sum_i P_i \ln P_i$$ $$= -k_B \sum_i \frac{e^{-\beta E_i}}{Z} \ln \left(\frac{e^{-\beta E_i}}{Z}\right)$$ $$= -k_B\sum_i \frac{e^{-\beta E_i}}{Z} \left( -\beta E_i - \ln Z\right)$$ $$= k_B\beta \sum_i \frac{E_i e^{-\beta E_i}}{Z} + k_B \sum_i \frac{\ln Z e^{-\beta E_i}}{Z}$$ $$\mathcal{F}_\text{max} = k_B\beta U + k_B\ln Z$$ At this point, we may want to solve for $U$ again, to get yet another relationship for $U$: $$U = \frac{\mathcal{F}_\text{max}}{k_B\beta} - \frac{1}{\beta}\ln Z \qquad\qquad (9)$$ We saw before that $\ln Z$ was extensive, so we can now conclude that $\beta$ is intensive. It is also clear from this that entropy is extensive (which we already knew). Since we believe that $$S = \mathcal{F}_\text{max}$$ let's see what else we can extract from Equation 9 for $U$. We also know that $$dU = TdS - pdV$$ $$T = \myderiv{U}{S}{V}$$ Since we have an equation for $U$ in terms of $S$, we just need to figure out how to hold $V$ constant, and we'll know what $T$ is!
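The identity $\mathcal{F}_\text{max} = k_B\beta U + k_B\ln Z$ derived above can be verified for a concrete spectrum. The energy levels and the value of $\beta$ below are arbitrary illustrative choices in reduced units ($k_B = 1$):

```python
import numpy as np

# Illustrative energy spectrum and inverse temperature (assumed values)
E = np.array([0.0, 1.0, 2.0, 3.0])
beta, kB = 0.7, 1.0

Z = np.sum(np.exp(-beta * E))         # partition function
P = np.exp(-beta * E) / Z             # Boltzmann probabilities P_i
U = np.sum(P * E)                     # internal energy U = sum_i P_i E_i
Fmax = -kB * np.sum(P * np.log(P))    # maximum fairness (entropy)

# Identity from the derivation: F_max = kB*beta*U + kB*ln(Z)
assert np.isclose(Fmax, kB * beta * U + kB * np.log(Z))
```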

What does it mean to hold $V$ constant? It hasn't shown up in any of our statistical equations! If we change the volume, we will change the energy eigenvalues, so if we hold $V$ constant (and in general, do no work) then the energy eigenvalues are fixed. So until we explicitly add states with different volumes, $V$ is held constant in our calculations, and thus we should be able to evaluate the derivative of $U$ with respect to $S$ at fixed $V$ to find the temperature. $$U = \frac{S}{k_B\beta} - \frac{1}{\beta}\ln Z$$ $$T = \myderiv{U}{S}{V}$$ $$= \frac{1}{k\beta} - \frac{S}{k\beta^2}\myderiv{\beta}{S}{V} + \frac{\ln Z}{\beta^2}\myderiv{\beta}{S}{V} - \frac1{Z\beta}\myderiv{Z}{S}{V}$$ $$= \frac{1}{k\beta} - \frac{S}{k\beta^2}\myderiv{\beta}{S}{V} + \frac{\ln Z}{\beta^2}\myderiv{\beta}{S}{V} - \frac1{Z\beta}\myderiv{Z}{\beta}{V} \myderiv{\beta}{S}{V}$$ $$= \frac{1}{k\beta} + \frac1{\beta}\left(- \frac{S}{k\beta} + \frac{\ln Z}{\beta} - \frac1{Z}\myderiv{Z}{\beta}{V} \right) \myderiv{\beta}{S}{V}$$ $$= \frac{1}{k\beta} + \frac1{\beta}\left(- \frac{S}{k\beta} + \frac{\ln Z}{\beta} + U \right) \myderiv{\beta}{S}{V} \qquad\qquad (18)$$ $$= \frac1{k\beta}$$ $$\beta = \frac1{kT}$$ where in step 18, we used the equation for $U$, Equation 9, which saved us the tedium of evaluating $\myderiv{\beta}{S}{V}$. By inserting this definition for $\beta$ into Equation 9, we can see that $$U = TS - kT\ln Z$$ $$-kT\ln Z = U - TS$$ $$F = -k_BT \ln Z$$ So it turns out that the log of the partition function just about gives us the Helmholtz free energy! $\ddot\smile$ This is often a bit more useful than our expression for $U$, since we know the derivatives of $F$ with respect to $T$: $$dF = -SdT - pdV$$ So we could conveniently evaluate $S$ by taking a derivative of the Helmholtz free energy—which would take us in a bit of a circle, but would allow us to express $S$ directly in terms of the partition function $Z$.
Once we have computed $F(T,V)$ using statistical mechanics—which really only requires that we evaluate the partition function—we can use ordinary thermodynamics to compute all other thermodynamic quantities!
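This program, compute $F(T,V) = -k_BT\ln Z$ and then differentiate, can be sketched numerically. The spectrum below is an arbitrary illustrative choice in reduced units; the check compares $S = -\myderiv{F}{T}{V}$ (by a centered finite difference) against $S = -k_B\sum_i P_i \ln P_i$:

```python
import numpy as np

E = np.array([0.0, 1.0, 2.0])   # illustrative energy eigenvalues (assumed)
kB = 1.0                        # Boltzmann's constant in reduced units

def helmholtz(T):
    """Helmholtz free energy F = -kB*T*ln(Z) from the partition function."""
    Z = np.sum(np.exp(-E / (kB * T)))
    return -kB * T * np.log(Z)

T = 1.5
# Entropy from thermodynamics: S = -(dF/dT)_V, via a centered difference
dT = 1e-5
S_thermo = -(helmholtz(T + dT) - helmholtz(T - dT)) / (2 * dT)

# Entropy from statistics: S = -kB sum_i P_i ln P_i
P = np.exp(-E / (kB * T))
P /= P.sum()
S_stat = -kB * np.sum(P * np.log(P))

assert np.isclose(S_thermo, S_stat, atol=1e-6)
```

Fixed $V$ enters only through the fixed eigenvalues `E`, exactly as the lecture notes argue.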

Homework for Energy and Entropy

  1. (BoltzmannRatio) What goes here?

    At low temperatures, a diatomic molecule can be well described as a rigid rotor. The Hamiltonian of such a system is simply proportional to the square of the angular momentum \begin{align} H &= \frac{1}{2I}L^2 \end{align} and the energy eigenvalues are \begin{align} E_{lm} &= \hbar^2 \frac{l(l+1)}{2I} \end{align}

    1. What is the energy of the ground state and the first and second excited states of the $H_2$ molecule?

    2. At room temperature, what is the relative probability of finding a hydrogen molecule in the $l=0$ state versus finding it in any one of the $l=1$ states?\\ i.e. what is $P_{l=0,m=0}/\left(P_{l=1,m=-1} + P_{l=1,m=0} + P_{l=1,m=1}\right)$

    3. At what temperature is the value of this ratio 1?

    4. At room temperature, what is the relative probability of finding a hydrogen molecule in the ground state versus finding it in any one of the $l=2$ states?\\ i.e. what is $P_{l=0,m=0}/\left(P_{l=2,m=-2} + P_{l=2,m=-1} + \cdots + P_{l=2,m=2}\right)$
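A hand calculation of these ratios can be checked with a short script. The moment of inertia used below is an approximate literature-scale value for $H_2$, not a prescribed course number, so treat it as an assumption:

```python
import numpy as np

# Physical constants (SI); I is an approximate value for H2 (assumption)
hbar = 1.054571817e-34   # J*s
kB = 1.380649e-23        # J/K
I = 4.6e-48              # kg*m^2, approximate moment of inertia of H2

def level_energy(l):
    """Rigid-rotor energy eigenvalue E_l = hbar^2 l(l+1) / (2I)."""
    return hbar**2 * l * (l + 1) / (2 * I)

def ratio_ground_to_level(l, T):
    """P_{l=0,m=0} divided by the summed probability of the 2l+1 states of level l."""
    return np.exp(-level_energy(0) / (kB * T)) / (
        (2 * l + 1) * np.exp(-level_energy(l) / (kB * T)))

r1 = ratio_ground_to_level(1, 300.0)   # ground state vs. the three l=1 states
r2 = ratio_ground_to_level(2, 300.0)   # ground state vs. the five l=2 states
```

Note the $(2l+1)$ degeneracy factor: each $m$ state at level $l$ has the same Boltzmann weight, which is why the ratio can dip below 1 even though each individual excited state is less probable than the ground state.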

  2. (NucleusInMagneticField) What goes here?

    Nuclei of a particular isotope species contained in a crystal have spin $I=1$, and thus, $m = \{+1,0,-1\}$. The interaction between the nuclear quadrupole moment and the gradient of the crystalline electric field produces a situation where the nucleus has the same energy, $E=\varepsilon$, in the state $m=+1$ and the state $m=-1$, compared with an energy $E=0$ in the state $m=0$, i.e. each nucleus can be in one of 3 states, two of which have energy $E=\varepsilon$ and one has energy $E=0$.

    1. Find the Helmholtz free energy $F = U-TS$ for a crystal containing $N$ nuclei which do not interact with each other.

    2. Find an expression for the entropy as a function of temperature for this system. (Hint: use results of part a.)

    3. Indicate what your results predict for the entropy at the extremes of very high temperature and very low temperature.
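The limiting behavior asked about in part 3 can be verified numerically without giving away the closed-form answer. The sketch below uses reduced units ($k_B = \varepsilon = 1$) and a single nucleus, both assumptions for illustration; extensivity supplies the factor of $N$:

```python
import numpy as np

kB, eps = 1.0, 1.0   # reduced units (assumed for illustration)

def entropy_per_nucleus(T):
    """Entropy of one 3-state nucleus with level energies (eps, 0, eps)."""
    E = np.array([eps, 0.0, eps])
    P = np.exp(-E / (kB * T))
    P /= P.sum()
    return -kB * np.sum(P * np.log(P))

# Expected limits: S -> kB*ln(3) per nucleus at high T (all states equally
# likely), S -> 0 at low T (only the m=0 state is occupied)
S_high = entropy_per_nucleus(1e6)
S_low = entropy_per_nucleus(1e-2)
```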

