Jan Korbel
CSH Winter School 2025
Slides available at: https://slides.com/jankorbel
Personal web: https://jankorbel.eu
This talk will consist of three parts:
1. Brief introduction to statistical physics
2. Statistical physics of spin systems
3. Statistical physics of social systems
The aim of this talk is to give an overview of statistical-physics-inspired models of opinion dynamics and other social systems.
No prior knowledge of statistical physics is needed.
What if we have 1 mol (\(\approx 10^{23}\)) of particles?
For systems with a large number of DOF, it is typically not possible to get the exact microscopic description
We typically do not need to know the exact microstate, we only want to know the specific macroscopic properties
The main concept is coarse-graining
position & velocity of each particle
(\(6 \times N_A \approx 10^{24}\) DOF)
volume, temperature, pressure
(a few thermodynamic variables)
Microscopic description
Classical mechanics (QM,...)
Mesoscopic description
Statistical physics
Macroscopic description
Thermodynamics
probability of measuring a particle
with given position and velocity
General description
\(\bar{X}= \frac{1}{n} \sum_i x_i\)
Coarse-graining = keeping the relevant information on a larger scale while omitting the details of the system
Consider a die with 6 states
Let us throw the die 5 times. The resulting sequence is
Microstate
How many times did the die roll...
[Figure: dice faces with counts 0, 0, 2, 1, 1, 1]
Mesostate
The average value is \(\bar{X} = 3.8\)
Macrostate
Coarse-graining
# micro: \(6^5 =7776\)
# meso: \(\binom{6+5-1}{5} =252\)
# macro: \( 5\cdot 6-5\cdot 1 + 1 =26\)
Question: how do we calculate the multiplicity \(W\) of a mesostate?
1.) Permute all states
2.) Take care of overcounting
1.) Number of permutations: 5! = 120
2.) Correct for overcounting: the repeated face can be permuted in \(2! = 2\) ways
Together: \(W(0,2,0,1,1,1) = \frac{5!}{2!} =60\)
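These counts are easy to verify numerically; a minimal Python sketch (`multiplicity` is a helper introduced here):

```python
from math import factorial, comb

def multiplicity(counts):
    # multinomial coefficient W(n_1,...,n_k) = n! / (n_1! ... n_k!)
    W = factorial(sum(counts))
    for n_i in counts:
        W //= factorial(n_i)
    return W

print(multiplicity((0, 2, 0, 1, 1, 1)))  # W = 5!/2! = 60
print(6**5)                              # number of microstates: 7776
print(comb(6 + 5 - 1, 5))                # number of mesostates: 252
```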
[Figure: the mesostate with counts (0, 0, 2, 1, 1, 1) and the list of permuted sequences realizing it: 1., 2., 3., ...]
For the case when the individual dice are statistically independent, we end up with a general formula for the multiplicity:
$$W(n_1,\dots,n_k) = \binom{\sum_{i=1}^k n_i}{n_1, \ \dots \ ,n_k} = \frac{(\sum_{i=1}^k n_i)!}{\prod_{i=1}^k n_i!}= \frac{n!}{n_1! n_2! \dots n_k!}$$
From this, we can use the famous Boltzmann formula
Boltzmann's constant
\(k=1.380649 \times 10^{-23}\ \mathrm{J\,K^{-1}}\)
$$S = \log W \approx n\log n - n - \sum_{i} \left(n_i \log n_i - n_i \right)$$
Here, we use the normalization \(n = \sum_i n_i\)
and introduce probabilities \(p_i = n_i/n\)
$$ S = - n\sum_i p_i \log p_i$$
We use the Boltzmann formula (we set \(k_B = 1\))
and Stirling's approximation \(\ln x! \approx x \ln x - x\)
Finally, the entropy per particle is
\( \mathcal{S} = S/n = -\sum_i p_i \log p_i \)
We actually ended up with the formula known as Shannon entropy in information theory
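The convergence of the per-particle entropy to the Shannon formula can be checked numerically; a sketch (`log_multiplicity` is a helper introduced here, using `lgamma` for exact log-factorials):

```python
from math import lgamma, log

def shannon_entropy(p):
    return -sum(p_i * log(p_i) for p_i in p if p_i > 0)

def log_multiplicity(counts):
    # exact log W = log n! - sum_i log n_i!, using log x! = lgamma(x + 1)
    return lgamma(sum(counts) + 1) - sum(lgamma(c + 1) for c in counts)

counts = (0, 2, 0, 1, 1, 1)
n = sum(counts)
H = shannon_entropy([c / n for c in counts])
# scale the mesostate up: log W per particle approaches the Shannon entropy
for m in (1, 10, 100, 1000):
    scaled = [m * c for c in counts]
    print(m, log_multiplicity(scaled) / (m * n))
print(H)
```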
By using the relation \(n_i/n = p_i\), we actually used the law of large numbers. It says that for large \(n\), the relative frequency of an event converges to its probability, i.e.,
\(p_i = \lim_{n \rightarrow \infty} \frac{n_i(n)}{n}\)
In physics, this limit is called the thermodynamic limit.
What is actually \(p_i\)?
[Figure: dice with face counts (0, 0, 1, 1, 1, 2); \(P(\text{face})=1/3\)]
Properties of entropy:
It measures the average amount of surprise (\(h_i=- \log p_i\))
when observing an event \(i\)
Possible mesostates (face counts):
(0, 0, 2, 1, 1, 1), (0, 1, 2, 0, 1, 1), (0, 0, 1, 0, 4, 0), etc.
How many mesostates are there?
Stars and bars theorem (|*)
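The stars-and-bars count of mesostates can be checked by brute force; a small sketch:

```python
from itertools import product
from math import comb

# all face-count vectors (n_1,...,n_6) with n_1 + ... + n_6 = 5
mesostates = [c for c in product(range(6), repeat=6) if sum(c) == 5]
print(len(mesostates))     # 252
print(comb(6 + 5 - 1, 5))  # stars and bars: C(10, 5) = 252
```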
We introduce the following notation: a mesostate is written as the list of rolled faces in decreasing order, e.g., the face counts \((0,2,0,1,1,1)\) correspond to the mesostate \((6,5,4,2,2)\).
(6,6,5,1,1) | (6,6,4,2,1) | (6,6,3,3,1) |
---|---|---|
(6,6,3,2,2) | (6,5,5,2,1) | (6,5,4,3,1) |
(6,5,4,2,2) | (6,5,3,3,2) | (6,4,4,4,1) |
(6,4,4,3,2) | (6,4,3,3,3) | (5,5,5,3,1) |
(5,5,4,4,1) | (5,5,4,3,2) | (5,4,4,4,2) |
(5,4,4,3,3) | (4,4,4,4,3) | (5,5,5,2,2) |
(5,5,3,3,3) |
All mesostates with \(\bar{X} = 3.8\)
Q: What is the probability of observing such a mesostate?
A: It is proportional to its multiplicity!
W(6,6,5,1,1)=30 | W(6,6,4,2,1)=60 | W(6,6,3,3,1)=30 |
---|---|---|
W(6,6,3,2,2)=30 | W(6,5,5,2,1)=60 | W(6,5,4,3,1)=120 |
W(6,5,4,2,2)=60 | W(6,5,3,3,2)=60 | W(6,4,4,4,1)=20 |
W(6,4,4,3,2)=60 | W(6,4,3,3,3)=20 | W(5,5,5,3,1)=20 |
W(5,5,4,4,1)=30 | W(5,5,4,3,2)=60 | W(5,4,4,4,2)=20 |
W(5,4,4,3,3)=30 | W(4,4,4,4,3)=5 | W(5,5,5,2,2)=10 |
W(5,5,3,3,3)=10 |
$$W(n_1,\dots,n_k) = \frac{n!}{n_1! n_2! \dots n_k!}$$
\(P_3(m)\) denotes the fraction of dice showing face 3 in mesostate \(m\):
P3(6,6,5,1,1)=0 | P3(6,6,4,2,1)=0 | P3(6,6,3,3,1)=0.4 |
---|---|---|
P3(6,6,3,2,2)=0.2 | P3(6,5,5,2,1)=0 | P3(6,5,4,3,1)=0.2 |
P3(6,5,4,2,2)=0 | P3(6,5,3,3,2)=0.4 | P3(6,4,4,4,1)=0 |
P3(6,4,4,3,2)=0.2 | P3(6,4,3,3,3)=0.6 | P3(5,5,5,3,1)=0.2 |
P3(5,5,4,4,1)=0 | P3(5,5,4,3,2)=0.2 | P3(5,4,4,4,2)=0 |
P3(5,4,4,3,3)=0.4 | P3(4,4,4,4,3)=0.2 | P3(5,5,5,2,2)=0 |
P3(5,5,3,3,3)=0.6 |
$$P(3) = \frac{\sum_{m_i} P_3(m_i) W(m_i)}{\sum_{m_i} W(m_i)} = \frac{125}{735} \approx 17\%$$
here \(m_i\) runs over the mesostates that satisfy the constraint \(\bar{X} = 3.8\)
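The tables above can be checked by brute force over all \(6^5\) microstates; a sketch (the mean \(\bar{X}=3.8\) corresponds to a total sum of 19):

```python
from itertools import product

total = 0   # microstates with sum 19 (mean 3.8)
threes = 0  # dice showing a 3 among those microstates
mesostates = set()
for roll in product(range(1, 7), repeat=5):
    if sum(roll) == 19:
        total += 1
        threes += roll.count(3)
        mesostates.add(tuple(sorted(roll, reverse=True)))

print(len(mesostates))       # 19 mesostates with mean 3.8
print(total)                 # total multiplicity: 735
print(threes)                # 625
print(threes / (5 * total))  # weighted probability of face 3
```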
This is quite complicated!
But what happens if we increase the number of dice?
Will it get more complicated or easier?
$$W(2n_1,\dots,2n_k) = \frac{(2n)!}{(2n_1)! (2n_2)! \dots (2n_k)!}$$
W2(6,6,5,1,1)=3150 | W2(6,6,4,2,1)=12600 | W2(6,6,3,3,1)=6300 |
---|---|---|
W2(6,6,3,2,2)=3150 | W2(6,5,5,2,1)=12600 | W2(6,5,4,3,1)=113400 |
W2(6,5,4,2,2)=12600 | W2(6,5,3,3,2)=12600 | W2(6,4,4,4,1)=420 |
W2(6,4,4,3,2)=12600 | W2(6,4,3,3,3)=420 | W2(5,5,5,3,1)=420 |
W2(5,5,4,4,1)=6300 | W2(5,5,4,3,2)=12600 | W2(5,4,4,4,2)=420 |
W2(5,4,4,3,3)=3150 | W2(4,4,4,4,3)=45 | W2(5,5,5,2,2)=210 |
W2(5,5,3,3,3)=210 |
Let us denote \(W_2(n_1,\dots,n_k) = W(2n_1,\dots,2n_k)\)
What happens to the multiplicities?
Note: there are also other mesostates that are not double-configurations
Maximizing the entropy subject to the constraint of a given average energy \(E\) yields
$$p(\epsilon_i) = \frac{1}{Z} \exp(-\beta \epsilon_i)$$
It is called the Boltzmann distribution and \(Z\) is called the partition function
The entropy is then $$S = -\sum_i p(\epsilon_i) \log p(\epsilon_i) = -\sum_i p(\epsilon_i) \left(- \beta \epsilon_i - \ln Z\right) = \beta E + \ln Z$$
The Lagrange parameter \(\beta\) has to be determined from the energy constraint
$$\sum_i \epsilon_i \frac{\exp(-\beta \epsilon_i)}{Z(\beta)} = E$$
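A quick numerical check of the Boltzmann distribution and of the identity \(S = \beta E + \ln Z\); the three-level system here is a hypothetical example, not from the slides:

```python
from math import exp, log

def boltzmann(energies, beta):
    # Boltzmann weights, normalized by the partition function Z
    weights = [exp(-beta * e) for e in energies]
    Z = sum(weights)
    return [w / Z for w in weights], Z

energies = [0.0, 1.0, 2.0]  # hypothetical energy levels
beta = 1.0
p, Z = boltzmann(energies, beta)
E = sum(p_i * e_i for p_i, e_i in zip(p, energies))
S = -sum(p_i * log(p_i) for p_i in p)
print(S, beta * E + log(Z))  # the two expressions agree
```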
The total energy of the joint system \(A+B\) is conserved.
$$U = k T^2 \frac{\partial \ln Z}{\partial T}$$
$$F = - k T \ln Z$$
$$S = -\frac{\partial F}{\partial T}$$
$$m= \langle \sigma_i \rangle = \sum_{\sigma_i=\pm 1} \sigma_i \frac{\exp(-\beta H)}{Z}$$
$$Z=\sum_{\sigma_i=\pm 1}\exp(\beta(J k m + h)\sigma_i)= 2\cosh(\beta( J k m + h))$$
which gives the self-consistency equation \(m = \tanh(\beta(J k m + h))\)
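The mean-field self-consistency condition \(m = \tanh(\beta(Jkm+h))\) can be solved by fixed-point iteration; a minimal sketch (the coupling values are illustrative):

```python
from math import tanh

def mean_field_m(beta_Jk, beta_h=0.0, m0=0.5, iters=500):
    # iterate m <- tanh(beta*J*k*m + beta*h) until numerically converged
    m = m0
    for _ in range(iters):
        m = tanh(beta_Jk * m + beta_h)
    return m

print(mean_field_m(0.5))  # weak coupling: m decays to 0 (paramagnetic)
print(mean_field_m(2.0))  # strong coupling: spontaneous magnetization (ferromagnetic)
```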
Ferromagnetic
Paramagnetic
Ising antiferromagnetic model
ferromagnetic
antiferromagnetic
Potts model
XY model
Spin glass model
0. Initialize the system (assign the initial spins \(\sigma\))
1. Randomly choose one spin \(\sigma_i\)
2. Try to flip the spin: \(\sigma_i^{'} = -\sigma_i\)
3. Calculate the change in the Hamiltonian: \(\Delta H = H(\sigma^{'})-H(\sigma)\)
4. With probability \(p^{(ac)}(T,\Delta H)\), accept the flip; otherwise, reject it
5. Repeat steps 1–4
There are two main types of acceptance criteria
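The two criteria usually meant here are the Metropolis rule, \(p^{(ac)} = \min(1, e^{-\Delta H/T})\), and the Glauber (heat-bath) rule, \(p^{(ac)} = 1/(1+e^{\Delta H/T})\). A minimal 2D Ising sketch of the loop above (lattice size and temperature are illustrative):

```python
import random
from math import exp

def metropolis(dH, T):
    # accept with probability min(1, exp(-dH/T))
    return dH <= 0 or random.random() < exp(-dH / T)

def glauber(dH, T):
    # heat-bath acceptance: 1 / (1 + exp(dH/T))
    return random.random() < 1.0 / (1.0 + exp(dH / T))

def sweep(spins, L, T, accept):
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dH = 2 * spins[i][j] * nb  # energy cost of flipping spin (i, j), J = 1
        if accept(dH, T):
            spins[i][j] = -spins[i][j]

random.seed(1)
L, T = 10, 1.0  # well below the 2D critical temperature T_c ~ 2.27
spins = [[1] * L for _ in range(L)]
for _ in range(200):
    sweep(spins, L, T, metropolis)
m = sum(map(sum, spins)) / L**2
print(m)  # stays close to 1 in the ordered (ferromagnetic) phase
```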
Binary opinions
- Opinions take two values (e.g., YES/NO, left/right)
- This usually applies in bipartisan political systems or binary choices
- They can be modeled with a binary spin \(\sigma_i=\pm 1\)
- Similar to the Ising model
Multistate opinions
- Opinions take multiple discrete values
- This usually applies in multi-partisan political systems or when multiple choices are available
- They can be modeled with a discrete spin \(\sigma_i\in \{1,2,\dots,k\}\)
- Similar to the Potts model
Continuous opinions
- Opinions take continuous values (typically in \([-1,1]\) or \([0,1]\))
- This usually applies to preference systems or sentiments
- They can be modeled with a continuous spin
- Similar to various spin models (e.g., the XY model)
Homophily
- "Birds of a feather flock together"
- People tend to like other peers with similar opinions
- For two opinion vectors \(\sigma = (\sigma_1,\dots,\sigma_k)\) and \(\sigma^{'}\), the overlap \( \sum_i \sigma_i \sigma_i^{'}\) is larger the more opinions agree
- Equivalent to the Ising model
Bounded confidence
- Individuals adjust their opinions if they are sufficiently close to each other
- There is an interaction range where people tend to align
- We can express it by the Hamiltonian \(H(\sigma) = - \sum_{ij} J(\sigma_i,\sigma_j) \sigma_i \sigma_j\) where \(J(\sigma_i,\sigma_j) =1\) if \(|\sigma_i - \sigma_j| \leq \epsilon\) and 0 otherwise
Majority rule
- Individuals adopt the opinion of the majority in a group
- For binary opinions, the update can be expressed as \(\sigma_i \rightarrow \operatorname{sign}\left(\sum_{j \sim i} \sigma_j\right)\), where \(j \sim i\) runs over all neighbors of \(i\)
Social pressure
- Individuals tend to follow the opinions of influencers (e.g., on social media)
- This can be modeled similarly to an external field in an Ising model $$H(\sigma) = - \sum_i h_i \sigma_i$$ where \(h_i\) denotes the strength of a social pressure
Heider balance
- "A friend of my friend is my friend; an enemy of my enemy is my friend"
- People tend to prefer balanced triangle relationships
- This can be modeled by considering the cubic Ising model on links $$H(J) = - \sum_{ijk} J_{ij} J_{jk} J_{ki}$$
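The balance energy can be evaluated directly on a small signed network; a sketch (`heider_energy` is a helper introduced here):

```python
from itertools import combinations

def heider_energy(J):
    # J maps ordered pairs (i, j), i < j, to +1 (friends) or -1 (enemies)
    nodes = sorted({n for edge in J for n in edge})
    return -sum(J[(i, j)] * J[(j, k)] * J[(i, k)]
                for i, j, k in combinations(nodes, 3))

print(heider_energy({(0, 1): 1, (1, 2): 1, (0, 2): 1}))    # all friends: -1 (balanced)
print(heider_energy({(0, 1): 1, (1, 2): 1, (0, 2): -1}))   # one enemy: +1 (unbalanced)
print(heider_energy({(0, 1): 1, (1, 2): -1, (0, 2): -1}))  # two enemies: -1 (balanced)
```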
lattice
fully-connected
random
preferential attachment
small-world
Cohesive phase
Polarized phase
Fragmented phase
Voter model
Sznajd model
Axelrod model of culture dissemination
Social balance model
Some work we do here at CSH
Group size distribution
Some work we do here at CSH - emergent balance
The main idea: in large networks (like social networks) one cannot know all the relations between their friends
Some work we do here at CSH - group distribution
Hamiltonian of a group \(\mathcal{G}\)
\(H(\mathbf{s}_{i_1},\dots,\mathbf{s}_{i_k}) = \textcolor{red}{\underbrace{- \phi \, \frac{J}{2} \sum_{ij \in \mathcal{G}} A_{ij} \mathbf{s}_i \cdot \mathbf{s}_j}_{\text{intra-group social stress}}} \textcolor{blue}{ + \underbrace{(1-\phi) \frac{J}{2} \sum_{i \in \mathcal{G}, j \notin \mathcal{G}} A_{ij} \mathbf{s}_{i} \cdot \mathbf{s}_j}_{\text{inter-group social stress}}} \\ \qquad \qquad \qquad \qquad - \underbrace{h \sum_{i \in \mathcal{G}} \mathbf{s}_i \cdot \mathbf{w}}_{\text{external field}}\)
Group formation based on opinions = self-assembly of a spin glass
Group 1
Group 2
friends
enemies
Theory
MC simulation
Some work we do here at CSH - group distribution
Application: the online multiplayer game PARDUS