
# Statistical View of Entropy

## Prerequisites

Before beginning this unit, students should:

- Have experience with summation notation.
- Have experience in quantum mechanics, particularly with eigenvalues and probabilities.
- Know the difference between intensive and extensive variables.

## In-class Content

### QUIZ

### Lecture: Introduction to the Statistical Approach

##### Lecture notes from Dr. Roundy's 2014 course website:

**A statistical approach**

So far in this class, you have learned classical thermodynamics. Starting next week, we will be studying **statistical mechanics**. Thermodynamics may look “theoretical” because it involves a lot of math, but ultimately it is an experimental science. Thermodynamics puts severe (and interesting) constraints on equations of state, but can never tell us what the equations of state actually are. Similarly, thermodynamics can allow us to measure one quantity and use it to predict the result of a very different measurement. But it could never give us the ideal gas law, or the internal energy of an ideal gas.

**Statistical mechanics** is the theoretical counterpart of **thermodynamics.** It's how we can predict thermodynamic quantities from first principles. It also allows us to use thermodynamic measurements to extract microscopic properties of a system.

From quantum mechanics, you know that given a Hamiltonian describing a system, you could (in principle) solve for all the possible eigenstates and their energies. But how can you know which of those states a given system will be in? And given that state, how can you predict the result of interactions of the system with its surroundings, when you *don't* know the Hamiltonian or eigenstates of the surroundings? These are the questions that are answered by statistical mechanics.

**Inputs to stat mech:**

Energies and eigenstates of the Hamiltonian. We can actually get much of what interests us out of just the energies, just as we could compute all the thermodynamic properties from $U(S,V)$, if only we knew what it was… or from $G(T,p)$, as you did in your homework.

**Output of stat mech:**

Probabilities (at a given temperature) of each energy eigenstate, and from these $U$, $S$, $p$, $H$ and all thermodynamic functions. Statistical mechanics is awkward, so we will mostly want to use thermodynamic approaches when we can, e.g. if we know $U$ and $T$ and $S$, we can just use $F=U-TS$.

**Large numbers**

In macroscopic systems, there are many atoms and molecules, typically around $10^{23}$. As a result, we have no hope of actually examining every possible eigenstate, nor could we practically determine the precise microstate of the system. Instead, we need to examine how likely various states are. Average properties become extremely well-defined when many things are averaged.

If I flip one coin, I'll get 50% heads, but with a pretty large uncertainty. When I flip 100 coins, I get 50 heads $±10$ coins. If I flip $10^{22}$ coins, I will get $5×10^{21}$ heads $±10^{11}$. This is a lot of uncertainty in the total number of heads, but a very small uncertainty in the fraction of coins that will end up being heads.
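The shrinking relative spread can be checked with a quick simulation (a sketch in standard-library Python; the function name `head_fraction` is ours):

```python
import random

def head_fraction(n_coins, seed=0):
    """Flip n_coins fair coins and return the fraction that come up heads."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(n_coins))
    return heads / n_coins

# The spread of the *fraction* of heads shrinks like 1/sqrt(N):
for n in (100, 10_000, 1_000_000):
    f = head_fraction(n)
    print(n, f, abs(f - 0.5))
```

The absolute number of heads wanders further and further from $N/2$ as $N$ grows, but the fraction pins down ever more tightly near $1/2$.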

### Lecture: Fairness

- This lecture should be a brief introduction to Fairness before the kinesthetic activity about calculating Fairness
- Expect to get a question or two about how Fairness relates to the rest of thermodynamics

##### Lecture notes from Dr. Roundy's 2014 course website:

The primary quantity in statistical mechanics is the probability $P_i$ of finding the system in eigenstate $i$. Once we know the probability of each eigenstate for any given state, we will be able to compute every thermodynamic property of the system.

The approach we are going to use is to state that the probabilities are those which maximize the fairness (or minimize the bias). So we need to define a **fairness function** $\mathcal{F}$ that we can maximize. First, let's talk about some properties the fairness function should satisfy.

- it should be continuous
- it should be symmetric $$\mathcal{F}(P_1,P_2,P_3,\ldots) = \mathcal{F}(P_3,P_2,P_1,\ldots)$$
- it should be minimum when $P_2=P_3=\ldots = 0$ and $P_1 = 1$ $$\mathcal{F}(1,0,0,\ldots) = \text{minimum}$$
- it should be maximum when $P_1 = P_2 = P_3 = …$ $$\mathcal{F}(P,P,P,\ldots) = \text{maximum}$$
- **Addition rule:** if I have two uncorrelated systems, then their fairness should add (extensivity!). This corresponds to the following:

$$\mathcal{F}(P_A,P_B) + \mathcal{F}(P_1,P_2,P_3) = \mathcal{F}(P_AP_1, P_AP_2,P_AP_3,P_BP_1,P_BP_2,P_BP_3)$$ There aren't many functions that satisfy all these rules! $$\text{Fairness} = -k\sum_i^{\text{all states}} P_i \ln P_i$$ This particular function satisfies all these constraints. It is *continuous, symmetric, minimum when maximally unfair and maximum when maximally fair*. Continuous and symmetric are reasonably obvious.

Let's show $\mathcal{F}$ is minimum when maximally unfair. This is when one $P_i=1$ and the rest are zero. The $P\rightarrow 0$ terms give an indeterminate $0\times\infty$ form, so we rewrite and apply l'Hôpital's rule: $$\lim_{P\rightarrow 0} P\ln P = \lim_{P\rightarrow 0} \frac{\ln P}{\frac1{P}} = \lim_{P\rightarrow 0} \frac{\frac1P}{-\frac1{P^2}} = \lim_{P\rightarrow 0} (-P) = 0$$ The contribution from the $P=1$ term is easy: since $\ln 1=0$, that term is also zero. Since $P\ln P$ can never be positive for $0\le P\le 1$, we can see that the maximally unfair situation has minimum fairness, as it should.
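This limit can also be seen numerically (plain Python; the values approach zero from below):

```python
import math

# P * ln(P) tends to 0 (from below) as P -> 0+:
for P in (1e-2, 1e-6, 1e-12):
    print(P, P * math.log(P))
```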

Next, let's consider the maximally-fair situation, where $P_1=P_2=\cdots=\frac1{N}$. Demonstrating that this is the maximum of the fairness is trickier, and will be addressed later, when we go about maximizing the fairness.
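The candidate function above can be checked directly (a sketch in plain Python with $k=1$; the helper name `fairness` is ours):

```python
import math

def fairness(probs, k=1.0):
    """F = -k * sum_i P_i ln P_i, with the convention 0 * ln(0) = 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

pA = [0.4, 0.6]                              # uncorrelated system 1
pB = [0.2, 0.3, 0.5]                         # uncorrelated system 2
combined = [a * b for a in pA for b in pB]   # joint probabilities multiply

# Addition rule: F(combined) == F(pA) + F(pB)
print(math.isclose(fairness(combined), fairness(pA) + fairness(pB)))  # True

# Maximally unfair vs. uniform over 3 states:
print(fairness([1.0, 0.0, 0.0]))   # 0.0 (minimum)
print(fairness([1/3, 1/3, 1/3]))   # ln(3), the maximum for 3 states
```

Note that the combined list is exactly the product probabilities from the Combining Probabilities activity, so this also verifies the extensivity property.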

### Activity: Combining Probabilities

Link to Combining Probabilities Activity

**Activity Highlights**

- This small group activity is designed to help students understand how to combine probabilities in non-interacting systems.
- Students calculate the probability, for two uncorrelated systems A and B, of system A being in state i and system B being in state j, given the probabilities for each independent system.
- The wrap-up discussion reinforces the concept that probabilities are multiplied, and introduces the mathematical notation used to describe probabilities.

## Homework for Energy and Entropy

- (ThermodynamicPotentialsAndMaxwellRelations) PRACTICE
For the three thermodynamic potentials defined as

$$F = U - TS \qquad \text{(Helmholtz free energy)}$$

$$H = U + PV \qquad \text{(Enthalpy)}$$

$$G = U - TS + PV \qquad \text{(Gibbs free energy)}$$

determine the total differentials for each thermodynamic potential: $dF$, $dH$, and $dG$. Use the thermodynamic identity ($dU=T dS-p dV$) to simplify.

Identify a Maxwell relation for each of the three potentials.
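As a sketch of the pattern (working only the Helmholtz case; the other two potentials follow the same steps):

```latex
% Helmholtz free energy: F = U - TS
dF = dU - T\,dS - S\,dT
   = (T\,dS - p\,dV) - T\,dS - S\,dT   % insert the thermodynamic identity
   = -S\,dT - p\,dV
% Equality of mixed second partials of F(T,V) then gives the Maxwell relation:
\left(\frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial p}{\partial T}\right)_V
```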

- (TwoLevelSystem)
**A two-level system.** A system consists of $N$ identical non-interacting (independent) objects. Each object has two energy eigenstates, labeled $A$ and $B$, with energies $E_A=E_0$ and $E_B=-E_0$ (where $E_0=\left|E_0\right|$). Two students argue that this object can't be real because the lowest energy state is not zero, i.e. they would prefer $E_B=0$ and $E_A=2E_0$. What is the effect of the energy of the lowest energy state on thermal averages?

As a function of temperature $T$, what are the probabilities $p_A$ and $p_B$ of finding an object in this system in eigenstates $A$ and $B$?

Calculate the internal energy $U$ of this system as a function of temperature.

Each object has a property $X$, which takes the value $X_A=+1$ when an object is in state $A$ and $X_B=-1$ when an object is in state $B$. Solve for the average value $\langle X \rangle$ of the property $X$ as a function of temperature for this system.

What are the limits of $\langle X \rangle$ for $T\rightarrow 0$ and $T\rightarrow\infty$?
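For checking answers numerically, a minimal sketch (this assumes the Boltzmann weight $e^{-E/k_BT}$, which the course develops later; the function name `probs` is ours, and $kT$ is the temperature in energy units):

```python
import math

def probs(EA, EB, kT):
    """Boltzmann probabilities for one two-state object
    (assumes weights e^{-E/kT}; kT is temperature in energy units)."""
    wA, wB = math.exp(-EA / kT), math.exp(-EB / kT)
    Z = wA + wB                     # normalization (partition function)
    return wA / Z, wB / Z

E0, kT = 1.0, 1.0
pA, pB = probs(+E0, -E0, kT)
print(pA * (+1) + pB * (-1))        # <X> at kT = E0

# Shifting both energies by a constant leaves the probabilities unchanged,
# which is the point of the "zero of energy" question:
shifted = probs(2 * E0, 0.0, kT)
print(all(math.isclose(x, y) for x, y in zip((pA, pB), shifted)))  # True
```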