Latent Space Geometric Statistics

Stefan Sommer, University of Copenhagen

w/ Line Kühnel, Tom Fletcher, Sarang Joshi

Faculty of Science, University of Copenhagen

ICPR Manlearn, 2020

Geometric statistics using latent representations

Supported by the Novo Nordisk Foundation, Villum Foundation, Carlsberg Foundation, and Lundbeck Foundation

[Diagram: data space \(X\) \(\to\) latent space \(Z\) \(\to\) statistical results (PGA)]

Using VAEs/GANs for nonlinear dimensionality reduction

Latent space encoding

  1. unsupervised training of an embedding \[F:Z\to X=\mathbb R^d\] from latent space to data space

[Diagram: encoder maps data space \(X\) to latent space \(Z\); decoder \(F\) maps back to the reconstructed data]

Latent geometry and geometric statistics

Latent space geometry:

  • embedding \(F:Z\to X\)   (e.g. decoder in VAEs)
  • \(F(Z)\) is an immersed manifold if \(\mathrm{rank}\,dF\) is constant, equal to \(\dim Z\)
  • pullback Riemannian metric \[g(v,w)=(dFv)^TdFw\] on \(Z\)
  1. unsupervised training of an embedding \[F:Z\to X\] from latent space to data space
  2. apply pullback geometry to \(Z\)
  3. statistical analysis using the generated geometry

Shao et al. '18; Chen et al. '18; Arvanitidis et al. '18

fit a nonlinear manifold and perform statistics in the resulting geometry
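The pullback metric \(g=dF^TdF\) is easy to obtain with automatic differentiation; a minimal sketch, assuming a hypothetical toy decoder (a paraboloid in \(\mathbb R^3\)) standing in for a trained VAE/GAN decoder:

```python
# Pull back the Euclidean metric on X = R^3 through a decoder F: Z -> X.
# The decoder here is a made-up toy map; in practice F is the trained decoder.
import jax
import jax.numpy as jnp

def F(z):
    # toy decoder, Z = R^2 -> X = R^3 (an embedded paraboloid)
    return jnp.array([z[0], z[1], z[0] ** 2 + z[1] ** 2])

def metric(z):
    # g(z) = dF(z)^T dF(z): Gram matrix of the Jacobian columns
    J = jax.jacfwd(F)(z)  # shape (3, 2)
    return J.T @ J        # shape (2, 2), symmetric positive definite

g = metric(jnp.array([1.0, 0.0]))
```

At \(z=(1,0)\) the Jacobian columns are \((1,0,2)\) and \((0,1,0)\), so \(g=\mathrm{diag}(5,1)\): the latent space is stretched in the first coordinate where the surface is steep.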

Geometric statistics using latent representations

[Figure: sampled data on \(\mathbb S^2\); trained manifold \(F(Z)\)]

Geometric statistics

Generalization of Euclidean statistical notions and techniques to spaces without vector space structure

  • i.i.d. samples \(y_1,\ldots,y_N\in M\)
  • Fréchet mean
    \(\bar{x}=\mathrm{argmin}_{x\in M}\sum_{i=1}^Nd(x,y_i)^2\)

In contrast to Euclidean statistics, the different characterizations of the mean are no longer equivalent.
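The Fréchet mean has no closed form and is found by iterative optimization; a minimal numpy sketch on \(\mathbb S^2\) using the standard sphere Exp/Log maps (the three sample points are made up):

```python
# Fréchet mean on S^2 by Riemannian gradient descent: repeatedly average the
# Log maps of the data at the current point and step along the Exp map.
import numpy as np

def sphere_log(x, y):
    # Log_x(y): tangent vector at x pointing towards y, with length d(x, y)
    c = np.clip(x @ y, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    return theta / np.sin(theta) * (y - c * x)

def sphere_exp(x, v):
    # Exp_x(v): endpoint of the geodesic from x with initial velocity v
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * v / n

def frechet_mean(ys, steps=100, tau=0.5):
    x = ys[0]
    for _ in range(steps):
        # Riemannian gradient of 1/2 sum_i d(x, y_i)^2 is -sum_i Log_x(y_i)
        x = sphere_exp(x, tau * np.mean([sphere_log(x, y) for y in ys], axis=0))
    return x

ys = np.eye(3)  # three orthogonal unit vectors as toy data
xbar = frechet_mean(ys)  # by symmetry: (1,1,1)/sqrt(3)
```

The fixed-point iteration contracts because the data lie well inside a geodesic ball, so a constant step size suffices here.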

From Euclidean to Riemannian

  • vectors \(\longleftrightarrow\) derivatives of curves
  • inner product \(\longleftrightarrow\) metric tensor
  • norm \(\|y-x\|\) \(\longleftrightarrow\) distance \(d(x,y)\)
  • straight lines \(\longleftrightarrow\) geodesics
  • linear subspaces \(\longleftrightarrow\) geodesic sprays

  • difference \(y-x\) vs. tangent vector \(\longleftrightarrow\) global vs. local

Geometric data: Examples

  • Plane directions: \(\mathbb{S}^1\)
  • Geographical data: \(\mathbb{S}^2\)
  • 3D directions: \(\mathrm{SO}(3)\), \(\mathbb{S}^2\)
  • Angles: \(\mathbb{T}^n\)
  • Tensors: e.g. \(\mathrm{Sym}_+(n)\)

Diffusion mean on \(\mathbb S^2\)

  • \(x_t\in M\) Brownian motion
  • \(\theta=x_0\), \(y\sim x_T\)
  • \(\bar{x}_{\mathrm{ML}}=\mathrm{argmax}_\theta\mathcal{L}(\theta)\)

most likely starting point of Brownian motion

Sommer, IPMI '15; Sommer, Svane, JGM '15;
Sommer, GSI '17; Sommer, Sankhya A '19
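The likelihood \(\mathcal L(\theta)\) involves the transition density of the Brownian motion, which is typically approximated by simulation; a minimal sketch of simulating sphere-valued Brownian motion by tangent-space steps and projection (step count and time horizon are arbitrary choices):

```python
# Euler-type simulation of Brownian motion on S^2: take a Gaussian step in
# the tangent plane at the current point, then project back onto the sphere.
import numpy as np

def brownian_motion_sphere(x0, T=1.0, n_steps=1000, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    dt = T / n_steps
    x = np.array(x0, dtype=float)
    path = [x]
    for _ in range(n_steps):
        noise = rng.normal(size=3) * np.sqrt(dt)
        v = noise - (noise @ x) * x   # project the noise onto the tangent plane
        x = x + v
        x = x / np.linalg.norm(x)     # project back onto the sphere
        path.append(x)
    return np.array(path)

path = brownian_motion_sphere([0.0, 0.0, 1.0])
```

Conditioning such paths on hitting the observations at time \(T\) gives the Brownian bridges used to estimate the likelihood of a candidate starting point \(\theta\).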

Beyond the mean: Covariance and PCA

Non-Euclidean generalizations of PCA:

  • Principal Geodesic Analysis (PGA, Fletcher et al., '04)
  • Geodesic PCA (GPCA, Huckemann et al., '10)
  • Horizontal Component Analysis (HCA, Sommer, '13)
  • Principal Nested Spheres ((C)PNS, Jung et al., '12)
  • Barycentric Subspaces (BS, Pennec, '15)
  • Probabilistic PGA (PPGA, Zhang et al., '13)
  • Diffusion PCA (Sommer, '18)

Computational techniques

  1. automatic differentiation allows evaluating the Jacobian \(dF\) and hence the metric \(g=dF^TdF\)
  2. this is computationally very expensive when the data dimension \(d\) is large
  3. solution: train two networks \(g_{\mathrm{predicted}},g_{\mathrm{predicted}}^{-1}\) approximating the metric and cometric
  4. geodesic computation time drops from 30 s to 30 ms on MNIST
  5. implemented in Theano Geometry
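The slides train neural networks for the metric and cometric; as a minimal stand-in illustrating the same precompute-then-query idea, here is a least-squares fit of the metric entries over latent samples (the toy decoder and polynomial features are assumptions, not the method of the slides):

```python
# Surrogate for the pullback metric: evaluate the exact (expensive) metric on
# training points, fit a cheap regressor, then query the regressor instead.
# Here the "expensive" metric uses finite differences and the regressor is a
# quadratic least-squares fit; the slides use autodiff and neural networks.
import numpy as np

def F(z):  # toy decoder Z = R^2 -> X = R^3
    return np.array([z[0], z[1], z[0] ** 2 + z[1] ** 2])

def metric(z, h=1e-5):
    # pullback metric g = dF^T dF, Jacobian by central finite differences
    J = np.column_stack([(F(z + h * e) - F(z - h * e)) / (2 * h)
                         for e in np.eye(2)])
    return J.T @ J

def features(z):
    # quadratic polynomial features (enough to represent this g exactly)
    return np.array([1.0, z[0], z[1], z[0] ** 2, z[0] * z[1], z[1] ** 2])

rng = np.random.default_rng(0)
zs = rng.normal(size=(200, 2))
Phi = np.array([features(z) for z in zs])
G = np.array([metric(z)[np.triu_indices(2)] for z in zs])  # (g11, g12, g22)
coef, *_ = np.linalg.lstsq(Phi, G, rcond=None)

def g_predicted(z):
    g11, g12, g22 = features(z) @ coef
    return np.array([[g11, g12], [g12, g22]])
```

Geodesics can then be integrated by querying `g_predicted` without ever forming the full decoder Jacobian at each step.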

Examples: \(\mathbb S^2\)

[Figure: sampled data on \(\mathbb S^2\); trained manifold \(F(Z)\); geodesic; Brownian bridge]

Examples: MNIST

60,000 handwritten digits

2D latent representation \(\quad F:Z\to\mathbb R^{784}\)

[Figure panels: ML mean vs. Fréchet mean; likelihood on \(Z\); Brownian motion; Brownian bridge; scalar curvature; Ricci curvature (min. eigenvalue); parallel transport in \(Z\)]

Examples: Diatoms

780 landmark-represented diatoms

2D latent representation \(\quad F:Z\to\mathbb R^{90}\)

Hotelling two-sample test (in data space vs. in latent space):
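The Hotelling \(T^2\) two-sample test compares group means relative to pooled covariance; a minimal sketch with synthetic 2D samples standing in for latent representations of the two diatom groups:

```python
# Hotelling T^2 two-sample test: T2 = (n1 n2 / (n1+n2)) d^T S^-1 d with pooled
# covariance S; under H0 a rescaling of T2 follows an F distribution.
import numpy as np
from scipy import stats

def hotelling_t2(X, Y):
    n1, p = X.shape
    n2, _ = Y.shape
    d = X.mean(axis=0) - Y.mean(axis=0)
    S = ((n1 - 1) * np.cov(X, rowvar=False)
         + (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = n1 * n2 / (n1 + n2) * d @ np.linalg.solve(S, d)
    f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    p_value = stats.f.sf(f, p, n1 + n2 - p - 1)
    return t2, p_value

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(40, 2))
Y = rng.normal(2.0, 1.0, size=(40, 2))  # clearly shifted second group
t2, p_value = hotelling_t2(X, Y)
```

Running the test on the latent coordinates, as in the slides, assumes the latent representation is approximately Gaussian within each group.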

Latent Space Geometric Statistics

code: http://bitbucket.com/stefansommer/theanogeometry

slides: https://slides.com/stefansommer

References:

  • Sommer, Bronstein: Horizontal Flows and Manifold Stochastics in Geometric Deep Learning, TPAMI, 2020, doi: 10.1109/TPAMI.2020.2994507
  • Hansen, Eltzner, Huckemann, Sommer: Diffusion Means on Riemannian Manifolds, in preparation, 2020.
  • Højgaard Jensen, Sommer: Simulation of Conditioned Diffusions on Riemannian Manifolds, in preparation, 2020.
  • Sommer: Anisotropic Distributions on Manifolds: Template Estimation and Most Probable Paths, IPMI 2015, doi: 10.1007/978-3-319-19992-4_15.
  • Sommer, Svane: Modelling Anisotropic Covariance using Stochastic Development and Sub-Riemannian Frame Bundle Geometry, JoGM, 2017, arXiv:1512.08544.
  • Sommer: Anisotropically Weighted and Nonholonomically Constrained Evolutions, Entropy, 2017, arXiv:1609.00395.
  • Arnaudon, Holm, Sommer: A Stochastic Large Deformation Model for Computational Anatomy, IPMI 2017, arXiv:1612.05323.
  • Arnaudon, Holm, Sommer: A Geometric Framework for Stochastic Shape Analysis, Foundations of Computational Mathematics, arXiv:1703.09971.
  • Arnaudon, Holm, Sommer: String Methods for Stochastic Image and Shape Matching. JMIV, 2018, arXiv:1805.06038.
  • Sommer, Joshi: Brownian Bridge Simulation and Metric Estimation on Lie Groups and Homogeneous Spaces, in preparation, 2018.
  • Sommer: An Infinitesimal Probabilistic Model for Principal Component Analysis of Manifold Valued Data, Sankhya A, arXiv:1801.10341.
  • Kühnel, Fletcher, Joshi, Sommer: Latent Space Non-Linear Statistics, arXiv:1805.07632, 2018.
  • Højgaard Jensen, Mallasto, Sommer: Simulation of Conditioned Diffusions on the Flat Torus, GSI 2019., arXiv:1906.09813.
Geometry and deep learning: Intrinsic convolutions

  • smooth, global support
  • Composition:
    \(k_2\ast_{W_{T/2}} (k_1\ast_{W_{T/2}} f)(u)=\mathrm{E}[k_2(-W_{T/2})k_1(-(W_T-W_{T/2}))f(U_T^u)]\)
  • Tensors: \(k^n_m\), \(f:OM\to\mathbb R^m\), \(y^n=\mathrm{E}[k^n(-W_t)f(U_T^u)]\)
  • Equivariance: \(a\in O(d)\), \(k\ast_{W_T} (a.f)(u)=a.(k\ast_{W_T} f)(u)\)
  • Non-linearities \(\phi_i\): \(\phi_n(k_n\ast_{W_{T/n}} \phi_{n-1}(\cdots \phi_1(k_1\ast_{W_{T/n}} f)\cdots))(u)\)
  • precomputed density \(\rho\)

Sommer, Bronstein, TPAMI '20
