Dimitrije Marković
Bernstein Conference 2020
Satellite Workshop: "Dynamic probabilistic inference in the brain"
Introduce a computational model that represents the temporal structure of a dynamic environment.
Infer learned temporal structure from human behaviour.
Recent empirical evidence of neuronal circuitry supporting anticipatory behaviour:
Accurate temporal representation \(\rightarrow\) anticipating events.
D Marković, et al., PLoS computational biology (2019).
Temporal expectations and their impact on behaviour:
two hidden states
\( s_t \in \{A, B\}\)
Hidden semi-Markov model
Transition probability
\[ p(s_{t+1}|s_t, f_t) = \left\{ \begin{array}{ll} I_2, & \text{ for } f_t < n+1 \\ J_2 - I_2, & \text{ for } f_t = n + 1 \end{array} \right. \]
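A minimal NumPy sketch (not the presenter's code) of the transition rule above: states A and B are indexed 0 and 1, \(I_2\) keeps the current state, and \(J_2 - I_2\) swaps it once the phase counter reaches \(n + 1\); here \(n\) is assumed to be the same shape parameter that appears in the duration distribution below.

```python
import numpy as np

def transition_matrix(f_t, n):
    """Return p(s_{t+1} | s_t, f_t) as a 2x2 matrix (rows: s_t, cols: s_{t+1})."""
    I2 = np.eye(2)            # identity: the hidden state persists
    J2 = np.ones((2, 2))      # all-ones matrix
    if f_t < n + 1:
        return I2             # phase not yet exhausted -> no reversal
    return J2 - I2            # f_t = n + 1 -> state reverses (A <-> B)

# Example with n = 3: the state persists until the phase counter reaches 4.
print(transition_matrix(2, 3))  # identity matrix
print(transition_matrix(4, 3))  # off-diagonal swap matrix
```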
Duration probability
\[ p(f_{t+1}|f_t) \rightarrow p(d) \]
Phase transitions
\[p(f_t|f_{t-1})\]
Duration distribution
\[p(d) = {d + n - 2 \choose d-1}(1-\delta)^{d-1}\delta^n\]
M Varmazyar, et al., Journal of Industrial Engineering International (2019).
\[p(d) = NB(\mu, n)\]
\[\delta_\tau = p(s_\tau = B| s_0=A)\]
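A sketch of the negative-binomial duration distribution \(p(d)\) above, on the support \(d = 1, 2, \ldots\). It assumes that \(\mu\) in \(NB(\mu, n)\) denotes the mean duration, which inverts \(E[d] = 1 + n(1-\delta)/\delta\) to give \(\delta = n / (n + \mu - 1)\); this reparameterization is an assumption, not taken from the slide.

```python
import numpy as np
from scipy.special import comb

def duration_pmf(d, mu, n):
    """p(d) = C(d + n - 2, d - 1) (1 - delta)^(d - 1) delta^n, with delta set by the mean mu."""
    delta = n / (n + mu - 1.0)
    return comb(d + n - 2, d - 1) * (1.0 - delta) ** (d - 1) * delta ** n

d = np.arange(1, 50)
p = duration_pmf(d, mu=10.0, n=5)
print(p.sum())        # ~1 (support truncated at d = 49)
print(np.sum(d * p))  # ~10, the assumed mean duration
```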
K Friston, et al., Neural computation (2017).
history of past outcomes and choices \( H_{t-1} = (o_{t-1:1}, a_{t-1:1}) \)
belief updating (Bayes rule)
\[ p\left(s_{t}, f_{t}| H_{t} \right) = \frac{p\left(o_{t}| s_{t}, a_{t}\right)p\left(s_{t}, f_{t}| H_{t-1} \right)}{p\left(o_{t}| a_{t}, H_{t-1} \right)} \]
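A minimal sketch of this belief update (not the presenter's implementation): the predictive prior \(p(s_t, f_t \mid H_{t-1})\) is assumed to be given as an array, and the likelihood \(p(o_t \mid s_t, a_t)\) is evaluated at the observed outcome under the chosen action, so it does not depend on the phase \(f_t\).

```python
import numpy as np

def update_belief(prior, likelihood):
    """Bayes rule over (s_t, f_t): prior[s, f] ~ p(s_t, f_t | H_{t-1}), likelihood[s] ~ p(o_t | s_t, a_t)."""
    joint = likelihood[:, None] * prior   # p(o_t | s_t, a_t) p(s_t, f_t | H_{t-1})
    evidence = joint.sum()                # p(o_t | a_t, H_{t-1})
    return joint / evidence               # p(s_t, f_t | H_t)

# Example with 2 states and a phase counter truncated at f_max = 6:
prior = np.full((2, 6), 1.0 / 12)         # flat prior over (s_t, f_t)
likelihood = np.array([0.8, 0.2])         # observed outcome more likely under state A
posterior = update_belief(prior, likelihood)
print(posterior.sum())                    # 1.0
```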
Generative process
Action selection
A Gelman, et al., Statistica sinica (1996).
Participant's observed responses
\( A_T = (a^*_1, \ldots, a^*_T) \)
Posterior predictive sampling
\[\vec{\theta}_i, n_i \sim p(\vec{\theta}, n| A_T)\]
\[ \tilde{a}^i_t \sim p(a_t|H_{t-1}, \vec{\theta}_i, n_i) \]
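A generic sketch of the posterior predictive check above (in the spirit of Gelman et al., 1996): draw parameter samples from the posterior fitted to the observed responses \(A_T\), then simulate choice sequences from the agent model under each draw. `posterior_samples` and `simulate_responses` are hypothetical placeholders for the fitted posterior and the behavioural model, not real APIs.

```python
import numpy as np

def posterior_predictive(posterior_samples, simulate_responses, history, n_draws=1000):
    """Simulate one response sequence per posterior draw (theta_i, n_i) ~ p(theta, n | A_T)."""
    simulated = []
    for i in range(n_draws):
        theta_i, n_i = posterior_samples[i]                 # posterior parameter draw
        a_tilde = simulate_responses(history, theta_i, n_i) # a~_t ~ p(a_t | H_{t-1}, theta_i, n_i)
        simulated.append(a_tilde)
    return np.array(simulated)  # n_draws x T array of simulated choices
```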
Posterior estimates over model parameters
expected choice value
expected information gain
K Friston, et al., Neural computation (2017).
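A sketch of an action-selection rule built from the two quantities above, in the spirit of Friston et al. (2017): choice probabilities follow a softmax over the sum of expected choice value and expected information gain. The precision parameter `gamma` and the additive combination are assumptions for illustration, not the presenter's exact model.

```python
import numpy as np

def choice_probabilities(expected_value, expected_info_gain, gamma=4.0):
    """Softmax over per-action value plus information gain, scaled by precision gamma."""
    g = gamma * (expected_value + expected_info_gain)
    g = g - g.max()               # numerical stability
    p = np.exp(g)
    return p / p.sum()

# Example: action 1 has higher information gain, partially offsetting its lower value.
print(choice_probabilities(np.array([0.6, 0.4]), np.array([0.1, 0.3])))
```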
learning phase
model fitting
model testing
Condition with regular reversals
Condition with irregular reversals
[Figures for both conditions; x-axis: duration d]
Thanks to: