{{page>wiki:headers:hheader}}

====== Statistical View of Entropy ======

{{page>courses:prereq20:eepre:eefairness}}

===== In-class Content =====

====QUIZ====

====Lecture: Introduction to the Statistical Approach====

FIXME

==Lecture notes from Dr. Roundy's 2014 course website:==

**A statistical approach**

So far in this class, you have learned classical thermodynamics. Starting next week, we will be studying **statistical mechanics**. Thermodynamics may look "theoretical" because it involves a lot of math, but ultimately it is an experimental science. Thermodynamics puts severe (and interesting) constraints on equations of state, but can never tell us what the equations of state actually are. Similarly, thermodynamics can allow us to measure one quantity and use it to predict the result of a very different measurement, but it could never give us the ideal gas law, or the internal energy of an ideal gas.

**Statistical mechanics** is the theoretical counterpart of **thermodynamics**. It is how we can predict thermodynamic quantities from first principles. It also allows us to use thermodynamic measurements to extract microscopic properties of a system.

From quantum mechanics, you know that given a Hamiltonian describing a system, you could (in principle) solve for all the possible eigenstates and their energies. But how can you know which of those states a given system will be in? And given that state, how can you predict the result of interactions of the system with its surroundings, when you //don't// know the Hamiltonian or eigenstates of the surroundings? These are the questions that are answered by statistical mechanics.

**Inputs to stat mech:** Energies and eigenstates of the Hamiltonian. We can actually get much of what interests us out of just the energies, just as we could compute all the thermodynamic properties from $U(S,V)$, if only we knew what it was... or from $G(T,p)$, as you did in your homework.

**Output of stat mech:** Probabilities (at a given temperature) of each energy eigenstate, $U$, $S$, $p$, $H$, and all other thermodynamic functions.

Statistical mechanics is awkward, so we will mostly want to use thermodynamic approaches when we can; e.g., if we know $U$, $T$, and $S$, we can just use $F=U-TS$.

**Large numbers**

In macroscopic systems, there are many atoms and molecules, typically around $10^{23}$. As a result, we have no hope of actually examining every possible eigenstate, nor could we practically determine the precise microstate of the system. Instead, we need to examine how likely various states are.

Average properties become extremely well-defined when many things are averaged. If I flip one coin, I'll get 50% heads on average, but with a pretty large uncertainty. When I flip 100 coins, I get 50 heads $\pm 10$ coins. If I flip $10^{22}$ coins, I will get $5\times 10^{21}$ heads $\pm 10^{11}$. This is a lot of uncertainty in the total number of heads, but a very small uncertainty in the fraction of coins that end up being heads.

====Lecture: Fairness====

  * This lecture should be a brief introduction to Fairness before the kinesthetic activity about calculating Fairness
  * Expect to get a question or two about how Fairness relates to the rest of thermodynamics

==Lecture notes from Dr. Roundy's 2014 course website:==

The primary quantity in statistical mechanics is the probability $P_i$ of finding the system in eigenstate $i$. Once we know the probability of each eigenstate for any given state, we will be able to compute every thermodynamic property of the system.

The approach we are going to use is to state that the probabilities are those which maximize the fairness (or minimize the bias). So we need to define a **fairness function** $\mathcal{F}$ that we can maximize. First, let's talk about some properties the fairness function should satisfy.

  - It should be continuous.
  - It should be symmetric: $$\mathcal{F}(P_1,P_2,P_3,\ldots) = \mathcal{F}(P_3,P_2,P_1,\ldots)$$
  - It should be minimum when $P_2=P_3=\cdots=0$ and $P_1=1$: $$\mathcal{F}(1,0,0,\ldots) = \text{minimum}$$
  - It should be maximum when $P_1 = P_2 = P_3 = \cdots$: $$\mathcal{F}(P,P,P,\ldots) = \text{maximum}$$
  - **Addition rule:** if I have two uncorrelated systems, then their fairnesses should add (extensivity!!!). This corresponds to the following: $$\mathcal{F}(P_A,P_B) + \mathcal{F}(P_1,P_2,P_3) = \mathcal{F}(P_AP_1, P_AP_2, P_AP_3, P_BP_1, P_BP_2, P_BP_3)$$

There aren't many functions that satisfy all these rules!

$$\text{Fairness} = -k\sum_i^{\text{all states}} P_i \ln P_i$$

This particular function satisfies all these constraints. It is //continuous, symmetric, minimum when maximally unfair, and maximum when maximally fair//. Continuous and symmetric are reasonably obvious. Let's show that $\mathcal{F}$ is minimum when maximally unfair, which is when one $P_i=1$ and the rest are zero. For each vanishing probability, $P\ln P$ is an indeterminate $0\times\infty$ form, which l'Hopital's rule resolves: $$\lim_{P\rightarrow 0} P\ln P = \lim_{P\rightarrow 0} \frac{\ln P}{\frac1{P}} = \lim_{P\rightarrow 0} \frac{\frac1P}{-\frac1{P^2}} = \lim_{P\rightarrow 0} (-P) = 0$$ We next consider the contribution from $P=1$, but that's easy: since $\ln 1=0$, that term is also zero. Since $P\ln P$ can never be positive for $0\le P\le 1$, we can see that the maximally unfair situation has minimum fairness, as it should.

Next, let's consider the maximally fair situation, where $P_1=P_2=\cdots=P_N=\frac1{N}$. Evaluating the fairness there is easy: $$\mathcal{F} = -k\sum_{i=1}^{N}\frac1{N}\ln\frac1{N} = k\ln N$$ Demonstrating that this is the //maximum// fairness is trickier, and will be addressed later, when we go about maximizing the fairness.
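The claims above are easy to check numerically. Below is a minimal Python sketch (not part of the original notes); the function name ''fairness'' and the choice $k=1$ are ours for illustration. It confirms the $0\ln 0 = 0$ convention, the value $k\ln N$ for the maximally fair distribution, and the addition rule for two uncorrelated systems.

<code python>
import numpy as np

def fairness(P, k=1.0):
    """Fairness F = -k * sum_i P_i ln P_i, using the convention 0 ln 0 = 0."""
    P = np.asarray(P, dtype=float)
    nonzero = P > 0  # skip P = 0 terms, since lim_{P->0} P ln P = 0
    return -k * np.sum(P[nonzero] * np.log(P[nonzero]))

# Maximally unfair: one state is certain, so the fairness is zero (its minimum).
print(fairness([1.0, 0.0, 0.0]))                 # 0.0

# Maximally fair: N equally likely states give F = k ln N.
N = 4
print(fairness(np.full(N, 1.0 / N)), np.log(N))  # both ~1.386

# Addition rule: for uncorrelated systems the joint probabilities are the
# products P_A P_i, and the fairnesses add.
A = np.array([0.3, 0.7])        # probabilities for system 1
B = np.array([0.2, 0.5, 0.3])   # probabilities for system 2
joint = np.outer(A, B).ravel()  # P_A P_1, P_A P_2, ..., P_B P_3
print(fairness(A) + fairness(B), fairness(joint))  # equal
</code>

Perturbing the equal probabilities slightly (keeping them normalized) and recomputing ''fairness'' is a quick way to convince yourself that the uniform distribution really does give the maximum.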
====Activity: Combining Probabilities====

[[..:..:activities:eeact:eecombineprob| Link to Combining Probabilities Activity]]

**Activity Highlights**

{{page>activities:content:highlights:eecombineprob}}

===== Homework for Energy and Entropy =====

{{page>courses:hw20:eehw:eefairness&noheader}}

{{page>wiki:footers:courses:eehourfooter}}