["DESI 2024 VI: Cosmological Constraints from the Measurements of Baryon Acoustic Oscillations" arXiv:2404.03002]
What role did Machine Learning play?
Dark Energy is constant over time
1-Dimensional
Machine Learning
Secondary anisotropies
Galaxy formation
Intrinsic alignments
DESI, DESI-II, Spec-S5
Euclid / LSST
Simons Observatory
CMB-S4
LIGO
Einstein Telescope
xAstrophysics
5-Dimensional
Dataset Size = 1
Can't poke it in the lab
Simulations
Bayesian statistics
Unicorn land: The promise of ML for Cosmology
Reality Check: Roadblocks & Bottlenecks
Mapping dark matter
Reversing gravitational evolution
Learning to represent baryonic feedback
Data-driven hybrid simulators
Unsupervised problems
Fast Emulators + Compression + Field Level Inference
[Image Credit: Claire Lamman (CfA/Harvard) / DESI Collaboration]
AbacusSummit
330 billion particles in a (2 Gpc/h)³ volume
60 trillion particles
~8 TB per simulation
15M CPU hours
(TNG50: ~100M CPU hours)
ML Requirements
WANTED
Fast field level emulators
Compression methods
Likelihood estimators
Simulation efficient methods
Model
Training Samples
Evaluate probabilities
Low p(x)
High p(x)
Generate Novel Samples
Probability mass conserved locally
1) Tractable
2) f maximally expressive
Loss: maximize the likelihood of the training data
Continuity Equation
Loss requires solving an ODE!
Diffusion, Flow matching, Interpolants... All ways to avoid this at training time
[Image Credit: "Understanding Deep Learning" Simon J.D. Prince]
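To see why the maximum-likelihood loss needs an ODE solve, here is a minimal numpy sketch of a continuous normalizing flow in 1D with an assumed linear velocity field v(x, t) = a·x (the field, the names, and all sizes are illustrative, not any particular paper's model): evaluating log p(x) means integrating both the sample path and the instantaneous change of variables.

```python
import numpy as np

def log_gaussian(x):
    # log density of the standard normal base distribution
    return -0.5 * (x**2 + np.log(2 * np.pi))

def cnf_log_likelihood(x1, a=0.5, n_steps=1000):
    """log p(x1) for the toy flow dx/dt = a*x on t in [0, 1].

    Integrate backwards from the data point x1 to a base sample x0,
    accumulating the instantaneous change of variables
    d(log p)/dt = -div(v), which is -a for this linear field.
    """
    dt = 1.0 / n_steps
    x = x1
    delta_logp = 0.0
    for _ in range(n_steps):
        x = x - a * x * dt       # Euler step backwards in time
        delta_logp -= a * dt     # accumulate -div(v) along the path
    return log_gaussian(x) + delta_logp

logp = cnf_log_likelihood(1.0)   # every evaluation is a full ODE solve
```

For this field the flow is x1 = x0·e^a, so the exact answer is log N(x1·e^{-a}; 0, 1) - a, which the Euler integration approaches as n_steps grows; with an expressive neural velocity, this solve sits inside every training step, which is exactly what diffusion, flow matching, and interpolants sidestep.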
Can we regress the velocity field directly?
Turned maximum likelihood into a regression problem!
Interpolant
Stochastic Interpolant
Expectation over all possible paths that go through xt
["Stochastic Interpolants: A Unifying Framework for Flows and Diffusions" Albergo et al arXiv:2303.08797]
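The regression trick in miniature: with the linear interpolant x_t = (1-t)·x0 + t·x1, the model only has to regress the velocity x1 - x0 at (x_t, t), with no ODE solve in the loss. A toy numpy version where a least-squares fit on hand-picked features stands in for the neural network (all choices here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Pairs (x0, x1): base samples and "data" samples (1D toy problem)
x0 = rng.normal(size=10_000)             # base: N(0, 1)
x1 = rng.normal(loc=3.0, size=10_000)    # target: N(3, 1)

# Linear interpolant x_t = (1 - t) x0 + t x1; regression target x1 - x0
t = rng.uniform(size=10_000)
xt = (1 - t) * x0 + t * x1
v_target = x1 - x0

# "Training": minimise ||v(x_t, t) - (x1 - x0)||^2 over a linear model
A = np.stack([np.ones_like(xt), xt, t], axis=1)
coef, *_ = np.linalg.lstsq(A, v_target, rcond=None)

# "Sampling": only now do we integrate dx/dt = v(x, t) from base samples
def velocity(x, s):
    return coef[0] + coef[1] * x + coef[2] * s

x = rng.normal(size=50_000)
n_steps = 200
for i in range(n_steps):
    x = x + velocity(x, i / n_steps) / n_steps

sample_mean = x.mean()   # transported samples should sit near the target
```

In this toy the optimal fit is the constant velocity v ≈ 3, which transports N(0, 1) exactly onto N(3, 1); the ODE only reappears at sampling time, which is the whole point.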
Reverse diffusion: Denoise previous step
Forward diffusion: Add Gaussian noise (fixed)
Prompt
A person half Yoda half Gandalf
Denoising = Regression
Fixed base distribution:
Gaussian
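"Denoising = regression" in its smallest form: corrupt data with Gaussian noise at a known level, then fit a regressor to predict the noise, which is the standard DDPM training objective. A numpy sketch at a single noise level, with a linear least-squares denoiser standing in for the network (toy data and sizes, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": scalars from the data distribution N(2, 0.5^2)
x0 = rng.normal(loc=2.0, scale=0.5, size=20_000)

# Forward diffusion (fixed): x_t = sqrt(ab) x0 + sqrt(1 - ab) eps
alpha_bar = 0.3
eps = rng.normal(size=x0.shape)
xt = np.sqrt(alpha_bar) * x0 + np.sqrt(1 - alpha_bar) * eps

# Denoising as regression: fit eps_hat(x_t) = w x_t + b by minimising
# the DDPM loss ||eps_hat - eps||^2 (a linear model instead of a U-Net)
A = np.stack([xt, np.ones_like(xt)], axis=1)
(w, b), *_ = np.linalg.lstsq(A, eps, rcond=None)

# Invert the forward relation to reconstruct the clean sample
eps_hat = w * xt + b
x0_hat = (xt - np.sqrt(1 - alpha_bar) * eps_hat) / np.sqrt(alpha_bar)
mse = np.mean((x0_hat - x0) ** 2)   # beats the prior variance of 0.25
```

Because both data and noise are Gaussian here, the linear regressor is actually the optimal denoiser; a real model repeats this regression across many noise levels.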
["A point cloud approach to generative modeling for galaxy surveys at the field level"
Cuesta-Lazaro and Mishra-Sharma
ICML AI4Astro 2023, arXiv:2311.17141]
Base Distribution
Target Distribution
Long range correlations
Huge pointclouds (20M)
Homogeneity and isotropy
Siddharth Mishra-Sharma
Diffusion model
CNN
Diffusion
Increasing Noise
["Diffusion-HMC: Parameter Inference with Diffusion Model driven Hamiltonian Monte Carlo" Mudur, Cuesta-Lazaro and Finkbeiner, NeurIPS 2023 ML for the Physical Sciences, arXiv:2405.05255]
Nayantara Mudur
CNN
Diffusion
["A Cosmic-Scale Benchmark for Symmetry-Preserving Data Processing"
Balla, Mishra-Sharma, Cuesta-Lazaro et al
NeurIPS 2024 NeurReps arXiv:2410.20516]
E(3) Equivariant architectures
Benchmark models
["Geometric and Physical Quantities Improve E(3) Equivariant Message Passing" Brandstetter et al arXiv:2110.02905]
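A cheap way to see what E(3) symmetry buys: features built from pairwise distances are invariant under rotations, translations, and reflections, so an equivariant network never wastes training simulations relearning them. A minimal numpy check (toy positions, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def pairwise_distances(pos):
    # (N, 3) positions -> (N, N) E(3)-invariant distance matrix
    diff = pos[:, None, :] - pos[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def random_orthogonal(rng):
    # QR of a Gaussian matrix gives a random orthogonal transform;
    # det may be -1 (a reflection), which E(3) also includes
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.diag(r))

pos = rng.normal(size=(5, 3))          # a toy "galaxy catalogue"
R = random_orthogonal(rng)
transformed = pos @ R.T + np.array([1.0, -2.0, 0.5])  # rotate + translate

# Distances are unchanged: invariant inputs for message passing
assert np.allclose(pairwise_distances(pos), pairwise_distances(transformed))
```

Message-passing architectures like the one cited above build their messages from exactly these kinds of invariant (or equivariant) geometric quantities.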
The biggest lesson that can be read from 70 years of AI research is that general methods that leverage computation are ultimately the most effective, and by a large margin. [...]
methods that continue to scale with increased computation even as the available computation becomes very great. [...]
We want AI agents that can discover like we can, not which contain what we have discovered.
Graph
Nodes
Edges
3D Mesh
Voxels
Both data representations scale badly with increasing resolution
Continuous in space and time
x500 Compression!
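The voxels-vs-continuous trade can be shown with the simplest possible "field as a function": fit a small Fourier-feature model of the coordinate by least squares and query it anywhere, instead of storing a dense grid. (A fixed Fourier basis stands in here for a learned coordinate network; the field and all sizes are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)

# "Simulation output": a 1D field stored on a dense grid of 10,000 voxels
grid = np.linspace(0.0, 1.0, 10_000)
field = np.sin(2 * np.pi * 3 * grid) + 0.5 * np.cos(2 * np.pi * 7 * grid)

# Continuous representation: 32 Fourier features of the coordinate
freqs = np.arange(1, 17)
def features(x):
    phase = 2 * np.pi * freqs * np.asarray(x)[:, None]
    return np.concatenate([np.sin(phase), np.cos(phase)], axis=1)

weights, *_ = np.linalg.lstsq(features(grid), field, rcond=None)

# 10,000 stored values -> 32 weights (~300x smaller), query at any x
query = rng.uniform(size=100)
pred = features(query) @ weights
truth = np.sin(2 * np.pi * 3 * query) + 0.5 * np.cos(2 * np.pi * 7 * query)
```

The representation is continuous in the coordinate, so resolution is a query-time choice rather than a storage cost, which is what lets neural-field emulators escape the bad scaling of meshes and voxels.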
More than fast emulators:
Robust field-level likelihood models
Need to compress data:
Continuous fields ~1000?
Need to improve data efficiency:
Incorporating symmetries helps
1 to Many:
Galaxies
Dark Matter
["Debiasing with Diffusion: Probabilistic reconstruction of Dark Matter fields from galaxies"
Ono et al (including Cuesta-Lazaro)
NeurIPS 2024 ML for the Physical Sciences arXiv:2403.10648]
Victoria Ono
Core F. Park
[Figure: truth, sampled, and observed dark matter fields in TNG-300; power spectrum and cross-correlation of true vs sampled DM as a function of scale k, from small to large scales]
Size of training simulation
1) Generalising to larger volumes
Model trained on Astrid subgrid model
2) Generalising to subgrid models
["3D Reconstruction of Dark Matter Fields with Diffusion Models: Towards Application to Galaxy Surveys" Park, Mudur, Cuesta-Lazaro et al ICML 2024 AI for Science]
Posterior Sample
Posterior Mean
Stochastic Interpolants
NF
["Joint cosmological parameter inference and initial condition reconstruction with Stochastic Interpolants" Cuesta-Lazaro, Bayer, Albergo et al NeurIPS 2024 ML for the Physical Sciences]
?
["Probabilistic Forecasting with Stochastic Interpolants and Föllmer Processes" Chen et al arXiv:2403.13724 (Figure adapted from arXiv:2407.21097)]
Generative SDE
Guided simulations with fuzzy constraints
How do we learn what is the robust information?
Simulating dark matter is easy!
"Atoms" are hard :(
N-body Simulations
Hydrodynamics
Can we improve our simulators in a data-driven way?
(if cold!)
~ Gpc
pc
kpc
Mpc
Gpc
[Video credit: Francisco Villaescusa-Navarro]
Gas density
Gas temperature
Mikhail Ivanov
Robust galaxy bias models: Effective Field Theories
+ Simulation as priors
Field-level EFT
["Full-shape analysis with simulation-based priors: constraints on single field inflation from BOSS" Ivanov, Cuesta-Lazaro et al arXiv:2402.13310]
Andrej Obuljen
Michael Toomey
["Full-shape analysis with simulation-based priors: cosmological parameters and the structure growth anomaly" Ivanov, Obuljen, Cuesta-Lazaro, Toomey arXiv:2409.10609]
Informative abstractions of the data
Transfer learning beyond LCDM
Cosmic web Anomaly Detection
Representing baryonic feedback
General
Predictive
Low dimensional?
Should generalize across scales, systems...
Transfer to unseen conditions
p(x|z)
Simple : Occam's razor
Causal?
Copernican
Ptolemaic
Contrastive
Generative
inductive biases
from scratch or from partial observations
Students at MIT are
OVER-CAFFEINATED
NERDS
SMART
ATHLETIC
Simulator 1
Simulator 2
Dark Matter
Feedback
i) Contrastive
Baryonic fields
ii) Generative
Baryonic fields
Dark Matter
Generative model
Total matter, gas temperature, gas metallicity
Encoder
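Route (i) can be sketched with an InfoNCE-style contrastive loss: embeddings of two "simulator views" of the same underlying dark matter field are pulled together, and all other pairings in the batch are pushed apart. A toy numpy version where a shared latent plus per-simulator noise stands in for two subgrid models and the encoder is the identity (everything here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss for paired embeddings of shape (batch, dim).

    Row i of z1 and row i of z2 are two views of the same field
    (e.g. one dark matter run under two subgrid models); every
    other row in the batch acts as a negative.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature               # (B, B) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))            # positives on diagonal

# Shared latent ("dark matter") + independent noise ("feedback") per view
latent = rng.normal(size=(128, 64))
view1 = latent + 0.1 * rng.normal(size=latent.shape)   # Simulator 1
view2 = latent + 0.1 * rng.normal(size=latent.shape)   # Simulator 2

aligned = info_nce(view1, view2)
mismatched = info_nce(view1, rng.permutation(view2))
# Matched pairs score a far lower loss than shuffled ones
```

Minimising this loss forces the encoder to keep what the two simulators share (the dark matter) and discard what they do not (the feedback), which is the representation the slide is after.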
Parity violation cannot originate from gravity
["Measurements of parity-odd modes in the large-scale 4-point function of SDSS..." Hou, Slepian, Chan arXiv:2206.03625]
["Could sample variance be responsible for the parity-violating signal seen in the BOSS galaxy survey?" Philcox, Ereza arXiv:2401.09523]
Real or Fake?
x or Mirror x?
Train
Test
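The "x or mirror x?" game in a 1D toy: build fields with a built-in handedness (slow rise, sharp fall), and a classifier that only looks at a parity-odd statistic separates fields from their mirror images; anything better than 50% held-out accuracy signals parity violation. (Purely illustrative; the real analyses use 3D fields and learned classifiers.)

```python
import numpy as np

rng = np.random.default_rng(0)

def sawtooth_field(rng, n=256):
    # Toy parity-violating field: slow rises, sharp drops, plus noise
    phase = rng.uniform()
    t = (np.arange(n) / n + phase) % 1.0
    return (t % 0.25) / 0.25 + 0.1 * rng.normal(size=n)

def parity_odd_stat(x):
    # Third moment of the gradient: flips sign under mirror reflection
    return np.mean(np.diff(x) ** 3)

fields = [sawtooth_field(rng) for _ in range(200)]
mirrors = [f[::-1] for f in fields]

# Classify by the sign of the statistic: negative -> original, else mirror
correct = sum(parity_odd_stat(f) < 0 for f in fields) + \
          sum(parity_odd_stat(m) > 0 for m in mirrors)
accuracy = correct / 400
```

Since diff(x[::-1]) = -diff(x)[::-1], the statistic negates exactly under mirroring; for a parity-symmetric field its sign would be a coin flip and the accuracy would hover at 50%, which is what makes the subtle real-data signal so hard to learn data-efficiently.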
Me: I can't wait to work with observations
Me working with observations:
Very subtle effect -> hard to find data-efficient architectures
1. There is a lot of information in galaxy surveys that ML methods can access
2. We can tackle high-dimensional inference problems so far unattainable
3. Our ability to simulate limits the amount of information we can robustly extract
Hybrid simulators, forward models, robustness
Unsupervised problems
Mapping dark matter, constrained simulations... Let's get creative!
Generative Models = Fast emulators + Field level likelihoods
Symmetries can help reduce number of training simulations
[Figure: in-distribution vs out-of-distribution regimes, from small to large scales]