Introductory talk
Workshop "Information-theoretic Methods for Complexity Science" April 29-May 1, 2019, Vienna
"You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."
John von Neumann's reply to Claude Shannon's question of how to name his newly discovered measure of missing information
Please send the slides to jan.korbel@meduniwien.ac.at after your talk; we will distribute them to everybody.
J.K., R.H., S.T. New J. Phys. 20 (2018) 093007
Process | \(W(N)\) | \(S(W)\) | \(d_0\) | \(d_1\) | \(d_2\)
---|---|---|---|---|---
Random walk | \(2^N\) | \(\log W\) | 0 | 1 | 0
Aging random walk | \(\approx 2^{\sqrt{N}/2} \sim 2^{N^{1/2}}\) | \((\log W)^2\) | 0 | 2 | 0
Magnetic coins * | \(\approx N^{N/2} e^{2 \sqrt{N}} \sim e^{N \log N}\) | \(\log W/\log \log W\) | 0 | 1 | -1
Random network | \(2^{\binom{N}{2}} \sim 2^{N^2}\) | \((\log W)^{1/2}\) | 0 | 1/2 | 0
Random walk cascade | \(2^{2^N}-1 \sim 2^{2^N}\) | \(\log \log W\) | 0 | 0 | 1

* H. Jensen et al. J. Phys. A: Math. Theor. 51 375002
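The \(S(W)\) column follows from requiring extensivity, \(S(W(N)) \propto N\), and inverting \(W(N)\). A minimal sketch of this step for the random-network row, keeping only leading orders and using nothing beyond the table entries:

$$ \log_2 W = \binom{N}{2} \approx \frac{N^2}{2} \;\Rightarrow\; N \sim (2 \log_2 W)^{1/2} \;\Rightarrow\; S(W) \sim (\log W)^{1/2}, $$

i.e. \((d_0, d_1, d_2) = (0, 1/2, 0)\), in agreement with the table.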
R.H., S.T. EPL 93 (2011) 20006
To fulfill SK axiom 2 (maximality): \(d_l > 0\); to fulfill SK axiom 3 (expandability): \(d_0 < 1\)
S.T., B.C.-M., R.H. Phys. Rev. E 96 (2017) 093007
B.C.-M., R.H., S.T. PNAS 112(17) (2015) 5348
$$S_{IT}(P) \sim 1 + \tfrac{1}{2} \log W $$
$$S_{EXT}(P) \sim H(P) $$
$$S_{MEP}(P) = - \sum_{i=2}^W \left[p_i \log\left(p_i/p_1\right) + (p_1-p_i) \log\left(1-p_i/p_1\right)\right]$$
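A quick plug-in check of the \(S_{MEP}\) expression above, using an arbitrary illustrative two-state distribution \(P = (2/3,\, 1/3)\), so that \(W = 2\) and the sum has the single term \(i = 2\):

$$ S_{MEP}\left(\tfrac{2}{3},\tfrac{1}{3}\right) = -\left[\tfrac{1}{3}\log\tfrac{1}{2} + \left(\tfrac{2}{3}-\tfrac{1}{3}\right)\log\left(1-\tfrac{1}{2}\right)\right] = \tfrac{2}{3}\log 2 . $$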
S.T., B.C.-M., R.H. Phys. Rev. E 96 (2017) 093007
R.H., S.T. Entropy 20(11) (2018) 838
Axiomatization from the information-theory point of view
There are many generalizations of the 4th axiom:
Group composability*: \(S(A \cup B) = \phi(S(A), S(B))\) for independent \(A\) and \(B\) (see the example after this list)
Asymptotic scaling: (c,d)-entropies
* P. Tempesta, Proc. R. Soc. A 472 (2016) 2195
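A standard illustration of the group-composability form above (a textbook example, not taken from the slides): for the Tsallis entropy \(S_q(P) = \left(1-\sum_i p_i^q\right)/(q-1)\) and independent \(A\), \(B\),

$$ S_q(A \cup B) = S_q(A) + S_q(B) + (1-q)\, S_q(A)\, S_q(B), \qquad \text{i.e. } \phi(x,y) = x + y + (1-q)\,xy, $$

which reduces to ordinary additivity as \(q \to 1\).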
P.J., J.K. Phys. Rev. Lett. 122 (2019), 120601
Are the axioms set by information theory and by statistical inference different, or can we find some overlap?
Let us consider the 4th SK axiom in a form equivalent to the composability axiom by P. Tempesta:
4. \(S(A \cup B) = f[f^{-1}(S(A)) \cdot f^{-1}(S(B|A))]\)
where \(S(B|A) = S(B)\) if \(B\) is independent of \(A\).
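For \(f = \log\) (hence \(f^{-1} = \exp\)) this reduces to the ordinary SK composition rule, which may help fix the conventions:

$$ S(A \cup B) = \log\!\left[e^{S(A)}\, e^{S(B|A)}\right] = S(A) + S(B|A). $$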
Entropies fulfilling both the SK and the Shore-Johnson (SJ) axioms: $$S_q^f(P) = f\left[\left(\sum_i p_i^q\right)^{1/(1-q)}\right] = f\left[\exp_q\left( \sum_i p_i \log_q(1/p_i) \right)\right]$$
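Two familiar special cases of this family, included here only as a cross-check of the formula: \(f = \log\) gives the Rényi entropy, and \(f = \log_q\), with \(\log_q x = (x^{1-q}-1)/(1-q)\), gives the Tsallis entropy:

$$ f = \log: \quad S_q^{\log}(P) = \frac{1}{1-q}\,\log \sum_i p_i^q, \qquad f = \log_q: \quad S_q^{\log_q}(P) = \frac{\sum_i p_i^q - 1}{1-q}. $$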
work in progress
Example: information metric for (c,d)-entropies
\( g_{ij}(p) = \left.\frac{\partial^2 D(p\|q)}{\partial q_i \partial q_j}\right|_{q=p} \)
J.K., R.H., S.T. Entropy 21(2) (2019) 112
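As a sanity check of this definition in the simplest (Shannon) case, not specific to (c,d)-entropies: for the Kullback-Leibler divergence \(D(p\|q) = \sum_i p_i \log(p_i/q_i)\),

$$ \frac{\partial D(p\|q)}{\partial q_i} = -\frac{p_i}{q_i}, \qquad g_{ij}(p) = \left.\frac{\partial^2 D(p\|q)}{\partial q_i \partial q_j}\right|_{q=p} = \frac{\delta_{ij}}{p_i}, $$

i.e. the usual Fisher metric (the normalization constraint is not imposed here).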