O. Moussa
A microstate specifies the position and velocity of every atom in the coffee.
A macrostate specifies bulk quantities: temperature, pressure on the walls of the cup, total energy, volume, etc.
\(k_B\): Boltzmann constant
\(\Omega\): number of configurations (microstates)
a
\[S = - k_B\sum_i p_i\log p_i = k_B\log \Omega\]
(the second equality holds when each microstate is equally probable, \(p_i = 1/\Omega\))
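As a numerical sanity check, the two forms of the entropy can be compared directly; a minimal Python sketch (the function names are mine):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum_i p_i log p_i (natural log)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_entropy(omega):
    """S = k_B log Omega, valid when all Omega microstates are equally probable."""
    return k_B * math.log(omega)

# With p_i = 1/Omega for every microstate, the two expressions coincide.
omega = 10_000
uniform = [1 / omega] * omega
print(math.isclose(gibbs_entropy(uniform), boltzmann_entropy(omega)))  # True
```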
\[S = - k_B\sum_i p_i\log p_i\]
i.e. \(-k_B\) times the expected value of \(\log p_i\), the log of the probability that microstate \(i\) is occupied.
[J/K]
Entropy always increases.
Why is that?
Because it is overwhelmingly more likely: high-entropy macrostates correspond to vastly more microstates than low-entropy ones.
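To see that "overwhelmingly more likely" is not an exaggeration, count microstates in a toy system of 100 coin flips, where the macrostate is the number of heads (the toy system and the cutoffs below are my own illustration):

```python
from math import comb

N = 100                 # toy system: 100 two-state "atoms" (coin flips)
total = 2 ** N          # total number of microstates

# Fraction of microstates whose macrostate is within 10 heads of 50/50:
near_half = sum(comb(N, k) for k in range(40, 61)) / total
# Fraction in the most ordered macrostates (10 or fewer heads):
ordered = sum(comb(N, k) for k in range(0, 11)) / total

print(f"{near_half:.3f}")   # the vast majority of microstates
print(f"{ordered:.1e}")     # vanishingly few
```

A system wandering randomly among its microstates will almost always be found in a high-entropy macrostate, simply because that is where almost all the microstates are.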
Shannon's entropy, where \(p(x_i)\) is the probability of the possible outcome \(x_i\):
\[H(X) = -\sum_{i = 1}^np(x_i)\log_2p(x_i)\]
[bits]
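The formula transcribes directly into Python (the function name is mine):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_i p(x_i) log2 p(x_i), in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: less than 1 bit
print(shannon_entropy([0.25] * 4))   # four equal outcomes: 2.0 bits
```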
Source coding theorem.
The Shannon entropy of a model approximates the number of bits of information gained by doing an experiment on a system your model describes.
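A concrete instance of the theorem (the four-symbol source and code below are my own example, not from the text): for a distribution whose probabilities are powers of 1/2, a prefix code with codeword lengths \(-\log_2 p(x_i)\) achieves an average length exactly equal to the Shannon entropy, the theorem's lower bound.

```python
import math

# Example source: four symbols with dyadic probabilities.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
# A prefix-free code with len(codeword_i) = -log2 p_i:
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

entropy = -sum(p * math.log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)

print(entropy, avg_len)  # 1.75 1.75
```

For non-dyadic distributions the entropy is still a lower bound on the average code length; no lossless code can beat it.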