Global hierarchy vs. local structure:

Spurious self-feedback in Barabási-Albert Ising networks

Claudia Merger, Timo Reinartz, Stefan Wessel, Andreas Schuppert, Carsten Honerkamp, Moritz Helias

Barabási-Albert networks

Ising model

Barabási-Albert networks have hubs!

k_{max} = m_0 \sqrt{N}, \qquad \langle k \rangle = 2 m_0
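A minimal sketch (not from the talk), assuming networkx and its barabasi_albert_graph generator, that checks both degree statistics on one realization; the \( k_{max} \) value is a scaling estimate, so the prefactor fluctuates between realizations:

```python
import numpy as np
import networkx as nx

m0, N = 2, 10_000                        # m0 edges attached per new node
G = nx.barabasi_albert_graph(N, m0, seed=0)

degrees = np.array([d for _, d in G.degree()])
print("mean degree:", degrees.mean(), "  expected 2*m0 =", 2 * m0)
print("max degree :", degrees.max(), "  scaling estimate m0*sqrt(N) =", m0 * np.sqrt(N))
```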

Complicated connectivity

x_i \in \{-1,1\}, \qquad p(x)\propto \exp \left(\beta \sum_{ij} A_{ij}x_i x_j \right), \qquad A_{ij} \text{ the adjacency matrix}

Methods

  • Monte-Carlo: "ground truth"

  • Effective action: high temperature expansion

Problem of "freezing hubs"

Timo Reinartz, Stefan Wessel

Monte-Carlo: "ground truth"

"conventional" Metropolis Monte Carlo: Hubs freeze out

Parallel Tempering

Swendsen and Wang, 1986

Replicas at inverse temperatures \( \beta_i \) and \( \beta_j \) (with energies \( E_i \), \( E_j \)) are swapped with probability

\( p_{ij} = \min \left( 1, e^{(E_i-E_j)(\beta_i -\beta_j)}\right) \)
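A sketch in my own conventions (not the authors' code): \( K_0 = 1 \), couplings given by the 0/1 adjacency matrix, \( E(x) = -\sum_{i<j} A_{ij} x_i x_j \). It shows why hubs freeze under single-spin flips (a hub of degree \( k \) aligned with its neighbours costs \( \Delta E \approx 2k \), so the acceptance \( \sim e^{-2\beta k} \)) and implements the replica swap with the probability quoted above:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
A = nx.to_numpy_array(nx.barabasi_albert_graph(500, 2, seed=0))

def energy(x):
    return -0.5 * x @ A @ x                    # sum over unordered pairs i < j

def metropolis_sweep(x, beta):
    for i in rng.integers(0, len(x), size=len(x)):
        dE = 2.0 * x[i] * (A[i] @ x)           # energy cost of flipping spin i
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            x[i] = -x[i]

def try_swap(replicas, betas, i, j):
    """Swap replicas i, j with p = min(1, exp((E_i - E_j)(beta_i - beta_j)))."""
    log_p = (energy(replicas[i]) - energy(replicas[j])) * (betas[i] - betas[j])
    if np.log(rng.random()) < min(0.0, log_p):
        replicas[i], replicas[j] = replicas[j], replicas[i]

betas = 1.0 / np.linspace(4.0, 12.0, 5)        # a small temperature ladder
replicas = [rng.choice([-1.0, 1.0], size=A.shape[0]) for _ in betas]
for _ in range(100):                           # alternate sweeps and swap attempts
    for x, b in zip(replicas, betas):
        metropolis_sweep(x, b)
    i = rng.integers(0, len(betas) - 1)
    try_swap(replicas, betas, i, i + 1)
print("replica magnetizations:", [round(float(x.mean()), 2) for x in replicas])
```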

 

 

 

 

 

 

 


Effective action: high temperature expansion

Mean-field self-consistency

m_i=\tanh \bigg( K_0 \beta \sum_{j \neq i} A_{i,j} m_j \bigg), \qquad m_i = \langle x_i \rangle

Find transition temperature!
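A sketch (my choices of \( K_0 \), seed and iteration count) that iterates the mean-field self-consistency to a fixed point and scans the temperature to see where a nonzero solution first appears; since \( A_{ii} = 0 \), the \( j \neq i \) restriction is automatic:

```python
import numpy as np
import networkx as nx

A = nx.to_numpy_array(nx.barabasi_albert_graph(2000, 2, seed=0))
K0 = 1.0

def mean_field_fixed_point(beta, n_iter=2000):
    m = np.full(A.shape[0], 1e-3)              # small symmetry-breaking seed
    for _ in range(n_iter):
        m = np.tanh(K0 * beta * (A @ m))       # m_i = tanh(K0*beta*sum_j A_ij m_j)
    return m

for T in np.linspace(5.0, 15.0, 11):
    m = mean_field_fixed_point(1.0 / T)
    print(f"T = {T:5.1f}   mean |m_i| = {np.abs(m).mean():.4f}")
```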

 

m_i = K_0 \beta_T \sum_{j \neq i} A_{i,j} m_j

Local structure

\( \rightarrow \) solve for largest eigenvalue of \( A \)

Goh, Kahng, Kim, 2001

T_T = K_0 \lambda_A \sim \sqrt[4]{N}

"onion structure" of the leading eigenvector (shown for \( N = 10^4 \))
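A quick numerical check of this statement, assuming scipy and networkx ≥ 2.7 (for to_scipy_sparse_array): compute \( \lambda_A \) for growing \( N \) and compare with \( N^{1/4} \); prefactors and realization-to-realization fluctuations are ignored:

```python
import networkx as nx
from scipy.sparse.linalg import eigsh

K0 = 1.0
for N in (1_000, 4_000, 16_000):
    A = nx.to_scipy_sparse_array(nx.barabasi_albert_graph(N, 2, seed=0), dtype=float)
    lam_A = eigsh(A, k=1, which="LA", return_eigenvectors=False)[0]
    print(f"N = {N:6d}   T_T = K0*lambda_A = {K0 * lam_A:6.2f}   N**0.25 = {N ** 0.25:5.2f}")
```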


degreewise parametrization (Bianconi, 2002)

\( A_{ij} \leftrightarrow p_c(k_i,k_j)=\frac{k_i k_j}{2 m_0 N} \)

T_T = K_0 \frac{\langle k^2\rangle}{2 m_0} = K_0 \frac{m_0}{2} \log(N)
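The intermediate step, assuming the standard BA degree distribution \( p(k) \approx 2 m_0^2 k^{-3} \) with upper cutoff \( k_{max} = m_0 \sqrt{N} \):

```latex
\langle k^2 \rangle \approx \int_{m_0}^{m_0\sqrt{N}} k^2 \, \frac{2 m_0^2}{k^3}\, \mathrm{d}k
 = 2 m_0^2 \log\!\left(\sqrt{N}\right) = m_0^2 \log(N)
\quad\Rightarrow\quad
T_T = K_0 \frac{\langle k^2\rangle}{2 m_0} = K_0 \frac{m_0}{2}\log(N)
```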

Monte-Carlo:

Transition temperature? Onion structure?

\( m_i \sim k_i \)

TAP self-consistency

m_i=\tanh \bigg( \underbrace{K_0 \beta \sum_{j \neq i} A_{i,j} m_j}_{\text{mean-field}} \; - \; \underbrace{m_i K_0^2 \beta^2 \sum_{j \neq i} A_{i,j}^2 (1-m_j^2)}_{\text{TAP}} \bigg)

\( \rightarrow \) solve for the specific \( A \) of the network

Find transition temperature!
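A sketch of a damped fixed-point iteration of the TAP equation (\( K_0 \), damping and seed are my choices); for a 0/1 adjacency matrix \( A_{ij}^2 = A_{ij} \), so the correction term becomes \( (K_0\beta)^2 m_i \sum_j A_{ij}(1-m_j^2) \):

```python
import numpy as np
import networkx as nx

A = nx.to_numpy_array(nx.barabasi_albert_graph(1000, 2, seed=0))
K0 = 1.0

def tap_fixed_point(beta, n_iter=2000, damping=0.5):
    m = np.full(A.shape[0], 1e-3)              # small symmetry-breaking seed
    for _ in range(n_iter):
        field = K0 * beta * (A @ m) - (K0 * beta) ** 2 * m * (A @ (1.0 - m ** 2))
        m = (1.0 - damping) * m + damping * np.tanh(field)
    return m

m = tap_fixed_point(beta=1.0 / 8.0)
print("mean |m_i| at T = 8:", np.abs(m).mean())
```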

 

Full TAP solution

\( \rightarrow \) good general agreement between Monte-Carlo and TAP (better than mean-field)

Why is TAP better than local mean-field?

Linearized TAP equation:

m_{i}=\beta K_{0}\sum_{j}\left(\underbrace{A_{ij}m_{j}}_{\text{mean-field}}-\underbrace{\beta K_{0}\delta_{ij}k_{i}m_{i}}_{\text{TAP}}\right)

expand \( m_j \) around \( m_i = 0 \):

\( m_{i}=\beta K_{0}\sum_{j}\left[A_{ij}\left(m_j\Big|_{m_{i}=0}+m_{i}A_{ji}\beta K_{0}\right)-\beta K_{0}\delta_{ij}k_{i}m_{i}\right]+ \mathcal{O} (\beta^3 K_0^3) \)

Since \( \sum_j A_{ij}A_{ji} = k_i \) for an unweighted graph, the spurious self-feedback term \( \beta^2 K_0^2 k_i m_i \) exactly cancels the TAP term, leaving

m_{i}=\beta K_{0}\underbrace{\sum_{j}A_{ij}\,m_j\Big|_{m_{i}=0}}_{\text{field in the absence of }i}

\( \leftrightarrow \) Mezard, Parisi and Virasoro, 1978


Local structure

m_i = \beta K_0 \sum_j B_{ij} m_j , \qquad B_{ij}=A_{ij}-\beta K_{0}\delta_{ij}k_{i}

\( \rightarrow \) solve: \( T_T = K_0 \lambda_{B, \max} (T_T) \)
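A sketch of the self-consistent solve: simple fixed-point iteration of \( T \mapsto K_0\lambda_{\max}\!\big(A - \tfrac{K_0}{T}\,\mathrm{diag}(k)\big) \), starting from the mean-field value \( T = K_0\lambda_A \) (my choice; convergence is not guaranteed in general):

```python
import numpy as np
import networkx as nx

A = nx.to_numpy_array(nx.barabasi_albert_graph(1000, 2, seed=0))
k = A.sum(axis=1)
K0 = 1.0

T = K0 * np.linalg.eigvalsh(A)[-1]             # mean-field starting point T_T = K0*lambda_A
for _ in range(30):
    B = A - (K0 / T) * np.diag(k)              # B_ij = A_ij - beta*K0*delta_ij*k_i
    T = K0 * np.linalg.eigvalsh(B)[-1]         # update T_T = K0*lambda_max(B)
print("self-consistent estimate of T_T:", T)
```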

degreewise parametrization

insert \( A_{ij} \leftrightarrow p_c(k_i,k_j)=\frac{k_i k_j}{2 m_0 N} \) into

\( m_{i}=\beta K_{0}\underbrace{\sum_{j}A_{ij}m_{j}\Big|_{m_{i}=0}}_{\text{field in the absence of }i\, \approx\, S} \sim k_i\, S \)

T_T= K_0 \frac{m_0}{2} \log(N)

\( \rightarrow\) same as mean-field

Global effective field \( S \)

\( S(\beta K_{0})= \frac{1}{\langle k \rangle}\langle k\,m \left(k \right)\rangle_{p(k)} \)

 

\( m(k)= \tanh\left(\beta K_{0}\,k\,S(\beta K_{0})\right) \)

 

\( M (S) = \langle \tanh\left(\beta K_{0}\,k\,S(\beta K_{0})\right)\rangle_{p(k)} \)
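A sketch solving the degreewise self-consistency for \( S \) on the empirical degree sequence of one realization and reading off \( M \) (\( K_0 \), seed and temperatures are my choices):

```python
import numpy as np
import networkx as nx

G = nx.barabasi_albert_graph(10_000, 2, seed=0)
k = np.array([d for _, d in G.degree()], dtype=float)
K0 = 1.0

def solve_S(beta, n_iter=1000):
    S = 1e-3                                   # small symmetry-breaking seed
    for _ in range(n_iter):                    # S = <k tanh(beta*K0*k*S)> / <k>
        S = np.mean(k * np.tanh(beta * K0 * k * S)) / k.mean()
    return S

for T in (5.0, 9.0, 13.0):
    S = solve_S(1.0 / T)
    M = np.mean(np.tanh(K0 * k * S / T))       # M = <tanh(beta*K0*k*S)>_p(k)
    print(f"T = {T:5.1f}   S = {S:.4f}   M = {M:.4f}")
```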

 

 

 

recover local structure: magnetization of a degree-\( k \) nearest neighbour of a node \( h \)

\( m_{NN, h} (k)= \tanh( \beta K_{0} \,(k-1) \, S( \beta K_{0} )+\beta K_{0} m(k_{h}) ) \)


Summary and outlook

  • Self-consistent solution without self-feedback

 

 

 

  • local "onion" structure plays minor role

 

 

 

Hierarchical nature of connectivity dominates the behaviour

 

personal takeaway:

  1.  Know what confuses you
  2.  Never give up hope for a simple explanation

 

Averaging over all configurations

Cumulant generating function:

W(j) = \log \left( \sum_x p(x) \exp(j^Tx) \right)

Effective action (Legendre transform), with \( \langle x_i \rangle = m_i \):

\Gamma(m)= \sup_j \; j^Tm -W(j)

\partial_{m_i} \Gamma(m) = 0 \quad \forall i
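For context, the standard Legendre-pair relations behind the last line (a general property, not specific to this talk):

```latex
\frac{\partial W(j)}{\partial j_i} = \langle x_i \rangle_j ,
\qquad
\frac{\partial \Gamma(m)}{\partial m_i} = j_i ,
\qquad
j = 0 \;\Rightarrow\; \partial_{m_i}\Gamma(m) = 0 \quad \forall i .
```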