Lecture series at the
School on Information, Noise, and Physics of Life
Niš, 19-30 September 2022
by Jan Korbel
all slides can be found at: slides.com/jankorbel
(Korbel = Tankard = Bierkrug)
You have 3 minutes to write down on a piece of paper:
a) Your name
b) What you study
c) What is entropy to you? (Formula/Concept/Definition/...)
Located at Vienna central cemetery
(Wien Zentralfriedhof)
We will get back to this formula
Randomness?
Disorder?
Energy dispersion?
Maximum data compression?
'Distance' from equilibrium?
Uncertainty?
Heat over temperature?
Information content?
Part of the internal energy unavailable for useful work?
MaxEnt
MaxCal
SoftMax
MaxEP
Prigogine
Field: mathematical physics
warning: Personal opinion!
SS 1st year Bc. - Thermodynamics
\(\mathrm{d} S = \frac{\delta Q}{T} \)
\(C_v = T \left( \frac{\partial S}{\partial T}\right)_V\)
\(\left ( \frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial p}{\partial T} \right)_V \)
SS 2nd year Bc. - Statistical physics
\( S = - \sum_k p_k \log p_k\)
\(Z = \sum_k e^{-\beta \epsilon_k}\)
\(\ln Z = S - U/T\)
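A minimal numerical sketch of the identity \(\ln Z = S - U/T\) for the canonical distribution \(p_k = e^{-\beta \epsilon_k}/Z\), in units with \(k_B = 1\); the three energy levels are made-up illustrative numbers:

```python
import math

# Toy check of ln Z = S - U/T in units with k_B = 1.
# The energy levels below are made-up numbers, not from the lecture.
beta = 0.7                                   # inverse temperature 1/T
eps = [0.0, 1.0, 2.5]                        # hypothetical energy levels

Z = sum(math.exp(-beta * e) for e in eps)    # partition function
p = [math.exp(-beta * e) / Z for e in eps]   # Gibbs weights p_k

S = -sum(pk * math.log(pk) for pk in p)      # Gibbs/Shannon entropy
U = sum(pk * e for pk, e in zip(p, eps))     # internal energy (mean)

print(math.log(Z), S - beta * U)             # the two sides agree
```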
SS 3rd year Bc. Quantum mechanics 2
\( S = -\mathrm{Tr} (\rho \log \rho) \)
\( Z = \mathrm{Tr} (\exp(-\beta \hat{H}))\)
Bachelor's studies
differential forms?
probability theory?
Master's studies
Erasmus exchange
@ FU Berlin
WS - 2nd year MS - Advanced StatPhys
Fermi-Dirac & Bose-Einstein statistics
Ising spin model and transfer matrix theory
Real gas and virial expansion
WS - 2nd year MS - Noneq. StatPhys
Onsager relations
Molecular motors
Fluctuation theorems
by
1. relation between energy, heat, work and temperature
Thermodynamics (should be rather thermoSTATICS)
2. relation between microscopic and macroscopic
R. Clausius
Lord Kelvin
H. von Helmholtz
S. Carnot
J. C. Maxwell
L. Boltzmann
M. Planck
J. W. Gibbs
Statistical mechanics/physics
Microscopic to Macroscopic
Role of statistics in physics
Classical mechanics (quantum mechanics)
- position & momenta given by equations of motion
- 1 body problem: solvable
- 2 body problem: center of mass transform
- 3 body problem: generally not solvable
...
- N body problem: ???
Do we need to know trajectories of all particles?
Let's have canonical coordinates \(\mathbf{q}(t)\), \(\mathbf{p}(t)\) evolving by Hamiltonian dynamics
$$\dot{\mathbf{q}} = \frac{\partial H}{\partial \mathbf{p}}\qquad \dot{\mathbf{p}} = - \frac{\partial H}{\partial \mathbf{q}}$$
Let \(\rho(p,q,t)\) be a probability distribution in the phase space. Then, \(\frac{\mathrm{d} \rho}{\mathrm{d} t} = 0.\)
Consequence: \( \frac{\mathrm{d} S(\rho)}{\mathrm{d} t}= - \frac{\mathrm{d}}{\mathrm{d} t} \left(\int \rho(t) \ln \rho(t)\right) = 0.\)
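A quick sketch of this step (assuming \(\rho\) vanishes at the boundary of phase space): using the Liouville equation \(\partial_t \rho = -\{\rho, H\}\), the chain rule \(\{f(\rho), H\} = f'(\rho)\{\rho, H\}\), and the fact that the integral of a Poisson bracket with \(H\) vanishes after integration by parts,

$$\frac{\mathrm{d} S}{\mathrm{d} t} = -\int (1 + \ln \rho)\, \frac{\partial \rho}{\partial t}\, \mathrm{d}\Gamma = \int (1 + \ln \rho)\, \{\rho, H\}\, \mathrm{d}\Gamma = \int \{\rho \ln \rho,\, H\}\, \mathrm{d}\Gamma = 0.$$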
1. Law of large numbers (LLN)
\( \sum_{i=1}^n X_i \rightarrow n \bar{X} \quad \mathrm{for} \ n \gg 1\)
2. Central limit theorem (CLT)
\( (\frac{1}{n} \sum_{i=1}^n X_i - \bar{X}) \rightarrow \frac{1}{\sqrt{n}} \mathcal{N}(0,\sigma^2)\)
Consequence: a large number of i.i.d. subsystems can be described by very few parameters for \(n \gg 1\)
\(\Rightarrow\) e.g., a box with 1 mol of gas particles
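The LLN/CLT statements above can be checked numerically with a fair die (mean 3.5); the sample sizes and repetition counts are arbitrary illustrative choices:

```python
import random
import statistics

# LLN/CLT sketch with a fair die (mean 3.5): sample means concentrate
# around 3.5, and their spread shrinks roughly like 1/sqrt(n).
random.seed(1)

def sample_mean(n):
    """Average of n fair-die throws."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (10, 100, 10000):
    means = [sample_mean(n) for _ in range(100)]
    print(n, round(statistics.mean(means), 2), round(statistics.stdev(means), 3))
```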
Stars & bars theorem (*|)
\(\bar{X}\)
Microscopic systems
Classical mechanics (QM,...)
Mesoscopic systems
Stochastic thermodynamics
Macroscopic systems
Thermodynamics
Trajectory TD
Ensemble TD
Statistical mechanics
Consider again a die with 6 states
Let us throw the die 5 times. The resulting sequence is
Microstate
The histogram of this sequence is
0
0
2
1
1
1
Mesostate
The average value is 4.2
Macrostate
Coarse-graining
Coarse-graining
# micro: \(6^5 =7776\)
# meso: \(\binom{6+5-1}{5} =252\)
# macro: \( 5\cdot 6-5\cdot 1 + 1 =26\)
Now we come back to the formula on Boltzmann's grave
Question: how do we calculate multiplicity W for mesostate
Answer: see combinatorics lecture.
Full answer: 1.) permute all states, 2.) take care of overcounting
1.) Permutation of all states: 5! = 120
2.) Overcounting - permutations of the repeated state: 2! = 2
Together: \(W(0,2,0,1,1,1) = \frac{5!}{2!} =60\)
0
0
2
1
1
1
$$W(n_1,\dots,n_k) = \binom{\sum_{i=1}^k n_i}{n_1, \ \dots \ , n_k} = \frac{\left(\sum_{i=1}^k n_i\right)!}{\prod_{i=1}^k n_i!}$$
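The counts from the dice example and the multinomial multiplicity can be reproduced in a few lines of Python:

```python
from math import comb, factorial, prod

# Counting states for 5 throws of a 6-sided die, and the multiplicity W
# of the mesostate (0, 0, 2, 1, 1, 1) from the slides.
n, k = 5, 6

micro = k ** n                  # ordered sequences: 6^5 = 7776
meso = comb(n + k - 1, n)       # histograms, by stars and bars: 252

def multiplicity(hist):
    """Multinomial coefficient: number of sequences with a given histogram."""
    return factorial(sum(hist)) // prod(factorial(c) for c in hist)

print(micro, meso, multiplicity([0, 0, 2, 1, 1, 1]))   # 7776 252 60
```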
Succinct reason: \(\log\) transforms \(\prod\) into \(\sum\)
(similar to the log-likelihood function)
Physical reason: multiplicity of \(X \times Y\)
is \(W(X)W(Y)\)
(extensivity/intensivity of thermodynamic variables)
\(\log W(n_1,\dots,n_k) = n \log n - \cancel{n} - \sum_{i=1}^k n_i \log n_i + \cancel{\sum_{i=1}^k n_i} \)
\(= \sum_{i=1}^k n_i (\log n - \log n_i) = - \sum_{i=1}^k n_i \log \frac{n_i}{n}\)
Stirling's approximation: \( \log(n!) \approx n \log n - n + \mathcal{O}(\log n) \)
Denote: \(\sum_{i=1}^k n_i = n\).
Denote: \(n_i/n = p_i\).
$$\log W(n_1,\dots,n_k) = - n \sum_{i=1}^k p_i \log p_i $$
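The Stirling-based result above can be verified numerically: \(\frac{1}{n}\log W\) approaches the Shannon entropy \(-\sum_i p_i \log p_i\) as \(n\) grows. The distribution \(p\) below is an arbitrary illustrative choice:

```python
from math import lgamma, log

# Check that (1/n) log W converges to the Shannon entropy of p.
# The distribution p is an arbitrary illustrative choice.

def log_W(counts):
    """log of the multinomial coefficient via log-Gamma (avoids huge factorials)."""
    n = sum(counts)
    return lgamma(n + 1) - sum(lgamma(c + 1) for c in counts)

p = [0.5, 0.3, 0.2]
H = -sum(pi * log(pi) for pi in p)           # Shannon entropy of p

for n in (10, 100, 1000):
    counts = [round(n * pi) for pi in p]     # n_i = n p_i (exact multiples here)
    print(n, log_W(counts) / n, H)           # ratio approaches H
```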
What is actually \(p_i\)?
There are two main interpretations of probability
1. Frequentist approach
probability is the limiting relative frequency of successes in a repeated experiment
$$p = \lim_{n \rightarrow \infty} \frac{k(n)}{n}$$
It can be estimated as \(\hat{p} = \frac{X_1+\dots+X_n}{n}\), and it does not make sense to assign a probability distribution to the parameter itself.
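A simulation sketch of the frequentist definition: the relative frequency \(k(n)/n\) settles near \(p\) as \(n\) grows. The bias \(p = 0.3\) is an illustrative assumption:

```python
import random

# Frequentist picture: the relative frequency k(n)/n approaches p.
# The bias p = 0.3 is an illustrative assumption.
random.seed(0)
p = 0.3

for n in (10, 1000, 100000):
    k = sum(random.random() < p for _ in range(n))
    print(n, k / n)       # the estimate \hat{p} approaches p as n grows
```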
2. Bayesian approach
probability quantifies our uncertainty about the experiment; by observing outcomes, we update our knowledge about it
$$\underbrace{f(p|\hat{p})}_{posterior} = \underbrace{\frac{f(\hat{p}|p)}{f(\hat{p})}}_{likelihood \ ratio} \underbrace{f(p)}_{prior} $$
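For a Bernoulli parameter \(p\), the update above has a closed conjugate form: a Beta(\(a, b\)) prior with \(k\) successes in \(n\) trials gives a Beta(\(a+k,\, b+n-k\)) posterior. All numbers below are illustrative assumptions:

```python
# Conjugate (Beta-Bernoulli) version of the Bayesian update:
# prior Beta(a, b) + k successes in n trials -> posterior Beta(a + k, b + n - k).
# All numbers are illustrative assumptions.
a, b = 1.0, 1.0                          # uniform prior on p
k, n = 7, 10                             # observed successes / trials

a_post, b_post = a + k, b + (n - k)      # posterior hyperparameters
mean_post = a_post / (a_post + b_post)   # posterior mean of p

print(a_post, b_post, mean_post)
```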
LLN
By using the relation \(n_i/n = p_i\), we have actually used the frequentist definition of probability. As a consequence, \(n \rightarrow \infty\) (in practical situations, \(n \gg 1\)). In physics, this limit is called the thermodynamic limit.
There are a few natural questions:
Does it mean that the entropy can be used only in the thermodynamic limit?
Does the entropy measure the uncertainty of a single particle in a large system or some kind of average probability over many particles?
(LLN & CLT)
N.B.: does anybody recognize what is \(H_G - H_B\)?
Do we consider states of a single die?
Do we consider states of a pair of dice?
etc.
Do we consider states of an n-tuple of dice?
...
Exercise for you: can you derive the Gibbs entropy by considering the state space of n-tuples of dice?
\(\Delta S = k N\ln 2\)
Resolution
1. Simply divide the multiplicity by
\(N!\) - due to "quantum" reasons (indistinguishability)
2. Swendsen approach
(we will come back to it later)
Additivity: We have two independent systems \(A\) and \(B\) $$S(A,B) = S(A) + S(B)$$
Extensivity: We have a system of N particles, then
$$S(kN) = k \cdot S(N)$$
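Additivity is easy to check numerically: for independent systems the joint distribution factorizes, and the Shannon entropy of the product distribution is the sum of the entropies. The two distributions below are made-up examples:

```python
from math import log

# Check of additivity S(A,B) = S(A) + S(B) for independent systems.
# The two distributions are made-up examples.

def S(p):
    """Shannon/Gibbs entropy of a discrete distribution."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

pA = [0.5, 0.5]
pB = [0.2, 0.3, 0.5]
joint = [pa * pb for pa in pA for pb in pB]   # independence: p(a,b) = p(a)p(b)

print(S(joint), S(pA) + S(pB))                # the two values coincide
```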