Stefan Sommer
Professor at Department of Computer Science, University of Copenhagen
Stefan Sommer, University of Copenhagen
Faculty of Science, University of Copenhagen
UMR CRIStAL, November 2025
w/ Frank v.d. Meulen, Rasmus Nielsen, Christy Hipsley, Sofia Stoustrup, Libby Baker, Gefan Yang, Michael Severinsen, Jingchao Zhou
Villum Foundation
Novo Nordisk Foundation
University of Copenhagen
Center for Computational Evolutionary Morphometrics
w/ Rasmus Nielsen
[Tree figure: Brownian motion along each branch; branching gives independent children]
incorporate leaf observations \(x_{V_T}\) into probabilistic model:
\(p(X_t|x_{V_T})\)
[Tree figure: Brownian motion along each branch]
1) What is a shape Brownian motion?
2) How do we condition the nonlinear process on shape observations?
3) How do we perform inference in the full model?
Stochastic processes that
action: \(\phi.s=\phi\circ s\) (shapes)
\(\phi.s=s\circ\phi^{-1}\) (images)
\( \phi \)
\( \phi \) warp of domain \(\Omega\) (2D or 3D space)
landmarks: \(s=(x_1,\ldots,x_n)\)
curves: \(s: \mathbb S^1\to\mathbb R^2\)
surfaces: \(s: \mathbb S^2\to\mathbb R^3\)
\( \phi:[0,T]\to\mathrm{Diff}(\Omega) \) path of diffeomorphisms (parameter \(t\))
LDDMM: Grenander, Miller, Trouvé, Younes, Christensen, Joshi, et al.
Markussen,CVIU'07; Budhiraja,Dupuis,Maroulas,Bernoulli'10
Trouve,Vialard,QAM'12;Vialard,SPA'13;Marsland/Shardlow,SIIMS'17
Arnaudon,Holm,Sommer,IPMI'17; FoCM'18; JMIV'19
Arnaudon,v.d. Meulen,Schauer,Sommer'21
geodesic ODE
perturbed SDE
Diffeomorphism
\[\phi_t(x)=x+X_t(x)\]
Kunita flow with infinite-dimensional noise:
\[dX_t = Q^{1/2}(X_t) \circ dW_t\]
\[Q^{1/2}(X_t)v(x) = \int_{D} k^{Q^{1/2}}(x+X_t(x),y)\,v(y)\,dy\]
Landmark shape process:
\[dX_t=\sqrt{K(X_t)}\circ dW_t\]
Kernel matrix
\[K(X_t)^i_j=k(x_i,x_j)\]
encodes landmark covariance
\(X_t\) landmarks at time \(t\):
\[X_t=\begin{pmatrix}x_{1,t}\\y_{1,t}\\\vdots\\x_{n,t}\\y_{n,t}\end{pmatrix}\]
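A minimal JAX sketch of simulating this landmark process (the Gaussian kernel, the function names, and the Euler-Maruyama discretisation that drops the Stratonovich correction are illustrative choices, not the JaxGeometry implementation):

import jax
import jax.numpy as jnp

def kernel_matrix(x, width=0.5):
    # x: (n, d) landmark positions; returns the n x n matrix K^i_j = k(x_i, x_j)
    sq_dists = jnp.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-sq_dists / (2 * width ** 2))

def sample_landmark_bm(key, x0, T=1.0, n_steps=100, width=0.5):
    # Euler-Maruyama for dX_t = sqrt(K(X_t)) dW_t (Ito form; Stratonovich correction omitted)
    n, d = x0.shape
    dt = T / n_steps
    def step(x, key):
        K = kernel_matrix(x, width)
        sqrtK = jnp.linalg.cholesky(K + 1e-6 * jnp.eye(n))   # matrix square root of K
        dW = jax.random.normal(key, (n, d)) * jnp.sqrt(dt)   # independent noise per coordinate
        x_new = x + sqrtK @ dW                                # kernel correlates the landmarks
        return x_new, x_new
    keys = jax.random.split(key, n_steps)
    _, path = jax.lax.scan(step, x0, keys)
    return path

# example: five landmarks on the unit circle
angles = jnp.linspace(0, 2 * jnp.pi, 5, endpoint=False)
x0 = jnp.stack([jnp.cos(angles), jnp.sin(angles)], axis=1)
path = sample_landmark_bm(jax.random.PRNGKey(0), x0)

The same discretisation, applied to grid points of the domain instead of landmarks, gives a finite-dimensional approximation of the Kunita flow above.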
\(X_t\) (no conditioning)
\(X_t|X_T=v\) (conditioned)
fit neural network \( s_\theta \) to minimize
\[\mathcal{L}(\theta)= \frac{1}{2} \sum_{m=1}^M \int_{t_{m-1}}^{t_m}\left[\big\| s_\theta(t, x_t)-\nabla\log p(x_{t},t\mid x_{t_{m-1}},t_{m-1})\big\|_{a(t,x_t)}^2\right] dt\]
plug \( s_\theta \) into SDE to sample: \[ dx_t = b(t,x_t)dt +a(t,x_t)s_\theta(t,x_t) dt + \sigma(t,x_t)\,dW_t \]
Train a neural network to learn the score in the bridge SDE in infinite dimensions.
\[dx_t=b(t,x_t)\,dt+a(t,x_t)\nabla_x\log \rho(t,x_t)\,dt+\sigma(t,x_t)\,dW_t\]
particularly for shape Kunita flows
Zhou,Yang,Sommer,GSI'25
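A schematic JAX version of the discretised loss above, for Euler-Maruyama transitions whose scores are Gaussian and hence explicit (the network s_theta, the drift/diffusion callables, the a-weighting convention and all names are placeholders, not the implementation from the paper):

import jax
import jax.numpy as jnp

def transition_score(x_prev, x_next, b, a, dt):
    # score of the Euler-Maruyama transition N(x_prev + b dt, a dt), evaluated at x_next
    return -jnp.linalg.solve(a * dt, x_next - x_prev - b * dt)

def score_matching_loss(theta, s_theta, path, ts, b_fn, a_fn):
    # sum over transitions of 0.5 * || s_theta(t_m, x_m) - grad log p(x_m | x_{m-1}) ||_a^2 dt
    def one_term(x_prev, x_next, t_prev, t_next):
        dt = t_next - t_prev
        a = a_fn(t_prev, x_prev)
        target = transition_score(x_prev, x_next, b_fn(t_prev, x_prev), a, dt)
        diff = s_theta(theta, t_next, x_next) - target
        return 0.5 * diff @ (a @ diff) * dt                   # squared a-weighted norm
    terms = jax.vmap(one_term)(path[:-1], path[1:], ts[:-1], ts[1:])
    return jnp.sum(terms)

# the trained s_theta is then plugged into the drift:
# dx_t = b dt + a s_theta dt + sigma dW_t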
Delyon/Hu 2006:
\(\sigma\) invertible:
[Figure: sample paths \(x_t\) from \(x_0\); target \(v\)]
Conditioning on hitting target \(v\) at time \(T>0\):
\[X_t|X_T=v\]
Itô stochastic process:
\[dx_t=b(t,x_t)\,dt+\sigma(t,x_t)\,dW_t\]
True bridge:
\[dx^*_t=b(t,x^*_t)dt+a(t,x^*_t)\nabla_x\log \rho_t(x^*_t)dt\\+\sigma(t,x^*_t)dW_t\]
Score \(\nabla_x\log \rho_t\) intractable:
\[\rho_t(x)=p_{T-t}(v;x)\]
\[a(t,x)=\sigma(t,x)\sigma(t,x)^T\]
black: \(X_0\), red: \(v\)
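As a sanity check of these formulas in the simplest case: for scalar Brownian motion \(dx_t=\sigma\,dW_t\), the transition density is Gaussian, so
\[\rho_t(x)=p_{T-t}(v;x)=\frac{1}{\sqrt{2\pi\sigma^2(T-t)}}\exp\!\left(-\frac{(v-x)^2}{2\sigma^2(T-t)}\right),\qquad a\,\nabla_x\log\rho_t(x)=\frac{v-x}{T-t},\]
which is the classical Brownian bridge drift pulling \(x_t\) towards \(v\) as \(t\to T\).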
Auxiliary process:
\[d\tilde{x}_t=\tilde{b}(t,\tilde{x}_t)dt+\tilde{\sigma}(t,\tilde{x}_t)dW_t\]
Approximate bridge:
\[dx_t^\circ=b(t,x_t^\circ)dt+a(t,x_t^\circ)\nabla_x\log \tilde{\rho}_t(x_t^\circ)dt\\+\sigma(t,x_t^\circ)dW_t\]
e.g. for linear processes, the score \(\nabla_x\log \tilde{\rho}_t\) is known in closed form
(almost) explicitly computable likelihood ratio:
\[\frac{d\mathbb P^*}{d\mathbb P^\circ}=\frac{\tilde{\rho}_T(v)}{\rho_T(v)}\Psi(x_t^\circ)\]
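A minimal JAX sketch of simulating such a guided bridge, taking the auxiliary process to be a Brownian motion with constant \(\tilde\sigma\) so that \(\nabla_x\log\tilde\rho_t\) is explicit (all names and the choice of auxiliary process are illustrative, and the weight \(\Psi\) is not computed here):

import jax
import jax.numpy as jnp

def simulate_guided_bridge(key, x0, v, T, n_steps, b_fn, sigma_fn, sigma_tilde):
    # guided proposal dx = b dt + a * grad log rho_tilde dt + sigma dW, with auxiliary
    # d x_tilde = sigma_tilde dW, so grad log rho_tilde(t, x) = (sigma_tilde sigma_tilde^T)^{-1} (v - x) / (T - t)
    dt = T / n_steps
    a_tilde = sigma_tilde @ sigma_tilde.T
    ts = jnp.linspace(0.0, T - dt, n_steps)        # stop one step before T (drift is singular at T)
    def step(x, inp):
        t, key = inp
        sigma = sigma_fn(t, x)
        a = sigma @ sigma.T
        guide = jnp.linalg.solve(a_tilde, (v - x) / (T - t))   # auxiliary score
        dW = jax.random.normal(key, x.shape) * jnp.sqrt(dt)
        x_new = x + (b_fn(t, x) + a @ guide) * dt + sigma @ dW
        return x_new, x_new
    keys = jax.random.split(key, n_steps)
    _, path = jax.lax.scan(step, x0, (ts, keys))
    return jnp.concatenate([x0[None], path], axis=0)

The simulated paths hit \(v\) at time \(T\); reweighting by \(\Psi\) then corrects for the difference between the guided process \(x^\circ\) and the true bridge \(x^*\).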
Backward filtering, forward guiding: van der Meulen, Schauer et al.
Itô stochastic process:
\[dx_t=b(t,x_t)dt+\sigma(t,x_t)dW_t\]
Bridge process:
\[dx^*_t=b(t,x^*_t)dt+a(t,x^*_t)\nabla_x\log\rho_t(x^*_t)dt\\+\sigma(t,x^*_t)dW_t\]
Score \(\nabla_x\log \rho_t\) intractable, but ...
v.d. Meulen,Schauer,Arnaudon,Sommer,SIIMS'22
Bridge: [figure: path from \(x_0\) to \(v\)]
Leaf conditioning: [figure: tree from \(x_0\) with \(h\)-function, leaves \(v_1\), \(v_2\)]
van der Meulen, Schauer'20; van der Meulen'22
Stoustrup, Nielsen, van der Meulen, Sommer
Backward filter: recursive, leaves to root
Forward guiding: root to leaves
[Tree figure: root \(x_0\), \(h\)-functions along edges, leaves \(v\), \(v_1\), \(v_2\)]
[Tree figure: Brownian motion along each branch; branching gives independent children]
incorporate leaf observations \(x_{V_T}\) into probabilistic model:
\(p(X_t|x_{V_T})\)
Doob’s h-transform
\(h_s(x)=\prod_{t\in\mathrm{ch}(s)}h_{s\to t}(x)\)
conditioned process \(X^*_t\)
approximations \(\tilde{h}\)
guided process \(X^\circ_t\)
Messages:
Up:
Fuse:
v.d. Meulen,Schauer,Sommer,'25
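The two passes can be written as one generic tree recursion; a minimal sketch follows (the node structure with .name/.children and the callables leaf_message, pull_back, fuse, simulate_edge are assumptions, and this is not the Hyperiax API):

import jax

def backward_filter(node, leaf_message, pull_back, fuse):
    # recursive, leaves to root: attach an approximate h-function h_tilde to every node
    if not node.children:
        node.h_tilde = leaf_message(node)
    else:
        msgs = [pull_back(node, child, backward_filter(child, leaf_message, pull_back, fuse))
                for child in node.children]
        node.h_tilde = fuse(msgs)        # fuse: pointwise product h_s(x) = prod_t h_{s->t}(x)
    return node.h_tilde

def forward_guide(key, node, x, simulate_edge, out=None):
    # root to leaves: simulate the guided SDE along every branch towards the child's h_tilde
    out = {} if out is None else out
    out[node.name] = x
    for child in node.children:
        key, subkey = jax.random.split(key)
        x_child = simulate_edge(subkey, x, child.h_tilde)
        forward_guide(key, child, x_child, simulate_edge, out)
    return out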
sample parameters (e.g. kernel width, amplitude)
v.d. Meulen,Schauer,Arnaudon,Sommer,SIIMS'22
Severinsen, Hipsley, Nielsen, Sommer
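For the parameter updates, one simple ingredient could be a random-walk Metropolis step on the log kernel parameters, using whatever guided-proposal log-likelihood the machinery above provides (function names, the flat prior and the fixed proposal scale are illustrative, not the scheme from the papers):

import jax
import jax.numpy as jnp

def mh_step(key, log_params, log_lik, proposal_scale=0.05):
    # one random-walk Metropolis update of (log kernel width, log amplitude);
    # with a flat prior on the log-parameters the acceptance ratio is just the likelihood ratio
    key_prop, key_acc = jax.random.split(key)
    proposal = log_params + proposal_scale * jax.random.normal(key_prop, log_params.shape)
    log_alpha = log_lik(proposal) - log_lik(log_params)
    accept = jnp.log(jax.random.uniform(key_acc)) < log_alpha
    return jnp.where(accept, proposal, log_params)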
[Tree figure: Brownian motion along each branch]
1) What is a shape Brownian motion?
2) How do we condition the nonlinear process on shape observations?
3) How do we perform inference in the full model?
Yang,van der Meulen,Sommer,ICML'25
Severinsen, Hipsley, Nielsen, Sommer
Diffusion mean
Most probable paths
Eltzner, Huckemann, Grong, Corstanje, van der Meulen, Schauer, Sommer et al.
Manifold bridges
JAX magic... in milliseconds:
JaxGeometry: https://github.com/computationalevolutionarymorphometry/jaxgeometry
Hyperiax: https://github.com/computationalevolutionarymorphometry/hyperiax
CCEM: http://www.ccem.dk
slides: https://slides.com/stefansommer
References: