Learning Bayes-optimal dendritic opinion pooling

Jakob Jordan, João Sacramento, Willem Wybo, Mihai A. Petrovici* & Walter Senn*

Department of Physiology, University of Bern, Switzerland

Kirchhoff-Institute for Physics, Heidelberg University, Germany

Institute of Neuroinformatics, UZH / ETH Zurich, Switzerland

Cue integration is a fundamental computational principle of cortex

Neurons with conductance-based synapses naturally implement probabilistic cue integration

An observation

Bayes-optimal inference

Bidirectional voltage dynamics

Membrane potential dynamics from noisy gradient ascent

\begin{array}{rl} p(u_\text{s}|W,r) =& \frac{1}{Z'} \prod_{d=0}^D p_d(u_\text{s}|W_d,r) \\ =& \frac{1}{Z} e^{-\frac{\bar g_\text{s}}{2}\left( u_\text{s} - \bar E_\text{s}\right)^2} \end{array}
\begin{array}{rl} C \dot u_\text{s} =& \frac{\partial}{\partial u_\text{s}} \log p(u_\text{s}| W,r) + \xi \\ =& \sum_{d=0}^D \left( g_d^\text{L} (E^\text{L} - u_\text{s}) + g_d^\text{E} (E^\text{E} - u_\text{s}) + g_d^\text{I} (E^\text{I} - u_\text{s}) \right) + \xi \end{array}
\mathbb{E}[u_\text{s}] = \bar E_\text{s}
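As a minimal numerical sketch of the pooling step (all conductance and reversal-potential values below are illustrative, not taken from the model), the parameters of the pooled Gaussian follow directly from the per-branch conductances: the total conductance \(\bar g_\text{s}\) sums all branch conductances, and \(\bar E_\text{s}\) is their conductance-weighted average reversal potential:

```python
import numpy as np

# Reversal potentials in mV (leak, excitatory, inhibitory); illustrative values.
E_L, E_E, E_I = -70.0, 0.0, -80.0

# Per-branch leak, excitatory, and inhibitory conductances (D + 1 = 3 branches).
g_L = np.array([10.0, 10.0, 10.0])
g_E = np.array([4.0, 12.0, 2.0])
g_I = np.array([3.0, 5.0, 1.0])

# Total reliability: sum of all conductances across branches.
g_bar = (g_L + g_E + g_I).sum()

# Reliability-weighted pooled opinion (effective somatic reversal potential).
E_bar = (g_L * E_L + g_E * E_E + g_I * E_I).sum() / g_bar

print(E_bar, 1.0 / g_bar)  # mean E[u_s] ≈ -49.47 and variance Var[u_s] ≈ 0.0175
```

Increasing any branch's conductance pulls \(\bar E_\text{s}\) toward that branch's opinion while shrinking the variance, which is the opinion-pooling behavior stated above.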

Average membrane potentials = reliability-weighted opinions

Membrane potential variance = 1/total reliability

\text{Var}[u_\text{s}] = \frac{1}{\bar g_\text{s}}

Synaptic plasticity from stochastic gradient ascent

\dot W_d^\text{E/I} \propto \frac{\partial}{\partial W_d^\text{E/I}} \log p(u_\text{s}^*|W,r)
\Delta \mu^{\text{E/I}} \propto (u_\text{s}^* - \bar E_\text{s}) \left( E^\text{E/I} - \bar E_\text{s} \right)
\Delta \sigma^2 \propto \frac{1}{2} \left( \frac{1}{\bar g_\text{s}} - (u_\text{s}^* - \bar E_\text{s})^2 \right)

Synaptic plasticity modifies excitatory/inhibitory synapses

  • in approximately opposite directions to match the mean
  • in identical directions to match the variance
\dot W_d^\text{E/I} \propto \left[ \, \Delta \mu^\text{E/I} + \Delta \sigma^2 \, \right] r

\(u_\text{s}^*\): sample from target distribution \(p^*(u_\text{s})\)

[Figure: target vs. actual distribution of the somatic potential]
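The plasticity rule above can be sketched compactly (the function name, learning rate, and all numeric values are illustrative assumptions, not taken from the paper):

```python
E_E, E_I = 0.0, -80.0   # excitatory / inhibitory reversal potentials (illustrative)
eta = 1e-3              # learning rate (assumption)

def plasticity_update(u_star, E_bar, g_bar, r):
    """Excitatory/inhibitory weight changes for a pathway with presynaptic rate r.

    u_star is a sample from the target distribution p*(u_s); E_bar and g_bar
    are the pooled somatic reversal potential and total conductance.
    """
    err = u_star - E_bar                       # somatic prediction error
    d_mu_E = err * (E_E - E_bar)               # mean matching, excitatory
    d_mu_I = err * (E_I - E_bar)               # mean matching, inhibitory
    d_sigma2 = 0.5 * (1.0 / g_bar - err ** 2)  # reliability (variance) matching
    return eta * (d_mu_E + d_sigma2) * r, eta * (d_mu_I + d_sigma2) * r

# With E_I < E_bar < E_E, a positive error drives excitation up and
# inhibition down: approximately opposite directions, matching the mean.
dW_E, dW_I = plasticity_update(u_star=-45.0, E_bar=-50.0, g_bar=57.0, r=1.0)
print(dW_E > 0, dW_I < 0)
```

The shared \(\Delta \sigma^2\) term moves both weight types in the same direction, shrinking or growing the total conductance until the somatic variance matches the target's.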

Synaptic plasticity performs error correction and reliability matching

Learning Bayes-optimal inference of orientations from multimodal stimuli

The trained model approximates ideal observers and reproduces psychophysical signatures of experimental data

[Nikbakht et al., 2018]

Cross-modal suppression as reliability-weighted opinion pooling

The trained model exhibits cross-modal suppression:

  • at low stimulus intensities, the firing rate is larger in the bimodal condition
  • at high stimulus intensities, the firing rate is smaller in the bimodal condition
  • example prediction for experiments: the strength of suppression depends on the relative reliabilities of the two modalities
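The reliability dependence of the suppression can be illustrated with the pooling formula alone; in this toy sketch the pooled somatic potential stands in as a proxy for the response, and all numbers are illustrative:

```python
def pooled_opinion(g1, E1, g2, E2):
    # Reliability-weighted average of two modality-specific opinions.
    return (g1 * E1 + g2 * E2) / (g1 + g2)

# Strongly driving modality 1 (opinion E1) vs. weakly driving modality 2 (E2).
E1, E2 = -45.0, -60.0

# Adding modality 2 pulls the response below the unimodal value E1; the
# suppression E1 - pooled grows with modality 2's relative reliability.
for g2 in (1.0, 5.0, 20.0):
    suppression = E1 - pooled_opinion(g1=10.0, E1=E1, g2=g2, E2=E2)
    print(g2, round(suppression, 2))
```

Algebraically, the suppression equals \(g_2/(g_1+g_2)\,(E_1-E_2)\), so making the second modality more reliable (larger \(g_2\)) strengthens it, which is the experimental prediction stated above.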

[Ohshiro et al., 2017]

Summary & Outlook

  • Neuron models with conductance-based synapses naturally implement the computations required for probabilistic cue integration
  • Our plasticity rule matches the somatic potential distribution to a target distribution and weights pathways according to their reliability
  • A model trained on a multisensory cue integration task reproduces behavioral and neuronal experimental data
  • The direct connection between normative and mechanistic descriptions allows for predictions at the systems as well as the cellular level

  • Next: work out new, detailed pre-/"post"dictions for specific experimental setups
  • Analog neuromorphic systems present a fitting substrate: the non-linear differential equations are tricky to integrate numerically

[blog-thebrain.org]

[Billaudelle et al., 2020]

NICE 2021
