RLG Short Talk
May 3, 2024
Adam Wei
Goal: Show how the different interpretations of diffusion are connected.
Timeline of interpretations:
- 2015, Diffusion: Sohl-Dickstein et al.
- 2019, Score-based (SMLD): Song et al.
- 2020, Denoising (DDPM): Ho et al.
- 2021, SDE: Song et al.
- 2023, Optimization (Projection): Permenter et al.
1. Both SMLD and DDPM learn the score function of the noisy distributions.
2. Score-based and denoiser approaches are discrete instantiations of the SDE approach.
Intuition: on each step, take a small gradient step on \(\log p\) and add Gaussian noise.
Key Takeaway: under regularity conditions, Langevin dynamics converges to samples from \(p\).
"Score Function": \(\nabla_x \log p(x)\)
Goal: Sample from some distribution p
"Gradient ascent with Gaussian noise"
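The "gradient ascent with Gaussian noise" idea can be sketched in a few lines of NumPy. This is a toy illustration (the function names and step size are mine, not from any paper), targeting a standard Gaussian whose score \(\nabla_x \log p(x) = -x\) is known in closed form:

```python
import numpy as np

def langevin_step(x, score, eps):
    """One unadjusted Langevin step: x <- x + (eps/2)*score(x) + sqrt(eps)*z."""
    z = np.random.randn(*x.shape)
    return x + 0.5 * eps * score(x) + np.sqrt(eps) * z

# Toy target p = N(0, 1); its score is grad_x log p(x) = -x.
score = lambda x: -x

x = 5.0 * np.random.randn(1000, 1)   # initialize far from the target
for _ in range(2000):
    x = langevin_step(x, score, eps=0.1)

# After many steps, x is approximately distributed as N(0, 1).
```

In practice the closed-form score is replaced by a learned network, which is exactly what motivates score matching.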
Manifold Hypothesis: real-world data lies along low-dimensional manifolds in high-dimensional spaces.
Practical Implications
Solution: noising! (increase the support of \(p\))
1. Construct a sequence of noised distributions \(q_{\sigma_1}, \dots, q_{\sigma_N}\) with \(\sigma_1 > \dots > \sigma_N\), where \(q_{\sigma}(\tilde{x}) = \int p(x)\,\mathcal{N}(\tilde{x}; x, \sigma^2 I)\,dx\)
2. Learn scores for the noised distributions \(q_{\sigma_i}\): \(s_\theta(x, \sigma_i) \approx \nabla_x \log q_{\sigma_i}(x)\)
3. Sample with annealed Langevin dynamics
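The three steps above can be sketched as follows. This is a toy example (the function names and the step-size schedule \(\epsilon_i \propto \sigma_i^2\), in the spirit of Song & Ermon, are assumptions), using Gaussian data so the noisy scores are available in closed form:

```python
import numpy as np

def annealed_langevin(score, sigmas, n_steps=100, eps0=2e-3):
    """Annealed Langevin dynamics: run Langevin updates at each noise level,
    annealing sigma from large to small; step size is scaled per level."""
    x = sigmas[0] * np.random.randn(1000, 1)  # start as wide as the first level
    for sigma in sigmas:
        eps = eps0 * (sigma / sigmas[-1]) ** 2   # per-level step size
        for _ in range(n_steps):
            z = np.random.randn(*x.shape)
            x = x + 0.5 * eps * score(x, sigma) + np.sqrt(eps) * z
    return x

# Toy data distribution p = N(0, 1): the sigma-noised distribution is
# q_sigma = N(0, 1 + sigma^2), so its score is -x / (1 + sigma^2).
noisy_score = lambda x, sigma: -x / (1.0 + sigma**2)
sigmas = np.geomspace(10.0, 0.1, 10)     # sigma_1 > ... > sigma_N
samples = annealed_langevin(noisy_score, sigmas)
# samples are approximately N(0, 1 + sigma_N^2), i.e. close to the data distribution
```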
Advantages
replace \(\nabla \log p(X)\) with "noisy scores"
1. Design a trajectory in distribution space
2. Use Langevin Dynamics to track this trajectory
TODO: DRAW IMAGE
Forward Process: Noise the data
Backward Process: Denoising
SMLD Sampling:
DDPM Sampling:
Screenshots from the original papers; note that their notations differ.
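Since the two papers use different notation, here is a unified sketch (variable names are mine, not the papers') of one sampler update from each, highlighting that both drift along the score of the noisy distribution:

```python
import numpy as np

def smld_update(x, score, sigma, alpha):
    """One annealed-Langevin update (SMLD): drift along the learned score."""
    z = np.random.randn(*x.shape)
    return x + 0.5 * alpha * score(x, sigma) + np.sqrt(alpha) * z

def ddpm_update(x, eps_theta, beta, abar):
    """One ancestral-sampling update (DDPM). Since eps_theta predicts the
    added noise, it equals -sqrt(1 - abar) times the score, so this update
    also moves the sample along the score of the noisy distribution."""
    z = np.random.randn(*x.shape)
    mean = (x - beta / np.sqrt(1.0 - abar) * eps_theta(x)) / np.sqrt(1.0 - beta)
    return mean + np.sqrt(beta) * z
```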
SMLD Loss
DDPM Loss
Screenshots of the original loss functions; again, their notations differ.
SMLD Loss*:
DDPM Loss*:
*Rewritten loss functions with severe abuse of notation to highlight similarities.
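The similarity can also be seen in code. Here is a hedged sketch of the two single-noise-level objectives (function names are illustrative, and weighting terms are dropped):

```python
import numpy as np

def smld_loss(score_fn, x0, sigma):
    """Denoising score matching at one noise level: the regression target
    for the score of N(x0, sigma^2 I) at the noised point is -z / sigma."""
    z = np.random.randn(*x0.shape)
    x_noisy = x0 + sigma * z
    return np.mean((score_fn(x_noisy, sigma) + z / sigma) ** 2)

def ddpm_loss(eps_fn, x0, abar):
    """DDPM noise-prediction loss with x_t = sqrt(abar)*x0 + sqrt(1-abar)*eps.
    Predicting eps amounts to predicting a rescaled negative score, so both
    objectives are doing score matching on the noisy distribution."""
    eps = np.random.randn(*x0.shape)
    x_t = np.sqrt(abar) * x0 + np.sqrt(1.0 - abar) * eps
    return np.mean((eps_fn(x_t) - eps) ** 2)
```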
Key Takeaway: both SMLD and DDPM learn (a scaled version of) the score function of the noisy distributions.
SMLD/DDPM: Noise and denoise in discrete steps.
SDE: Noise and denoise according to continuous SDEs
Key Takeaway: SMLD and DDPM are discrete instances of the SDE interpretation!
[1] Bernt Øksendal, 2003. [2] Brian D. O. Anderson, 1982.
"Itô SDE": \(dx = f(x,t)\,dt + g(t)\,dw\)
1. If \(f\) and \(g\) are Lipschitz, the Itô SDE has a unique solution [1].
2. The reverse-time SDE is [2]: \(dx = \left[f(x,t) - g(t)^2 \nabla_x \log p_t(x)\right]dt + g(t)\,d\bar{w}\)
1. Define \(f\) and \(g\) s.t. \(X(0) \sim p\) transforms into a tractable distribution \(X(T)\)
2. Learn the score \(\nabla_x \log p_t(x)\)
3. To sample:
- Draw a sample \(x(T)\) from the tractable distribution
- Reverse the SDE until \(t = 0\)
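The sampling recipe above can be sketched with a simple Euler-Maruyama integrator. This toy example (function names are mine) assumes an Ornstein-Uhlenbeck forward SDE whose marginals, and hence score, are known in closed form:

```python
import numpy as np

def reverse_sde_sample(score, f, g, T=1.0, n_steps=500, n=1000):
    """Euler-Maruyama integration of the reverse-time SDE
    dx = [f(x,t) - g(t)^2 * score(x,t)] dt + g(t) dw_bar, from t=T down to 0."""
    dt = T / n_steps
    x = np.random.randn(n, 1)                 # sample the tractable prior at t=T
    for i in range(n_steps):
        t = T - i * dt
        z = np.random.randn(*x.shape)
        drift = f(x, t) - g(t) ** 2 * score(x, t)
        x = x - drift * dt + g(t) * np.sqrt(dt) * z   # step backwards in time
    return x

# Toy check: for the OU forward SDE dx = -x/2 dt + dw with data p_0 = N(0, 1),
# the marginal stays p_t = N(0, 1) for all t, so the true score is simply -x.
samples = reverse_sde_sample(score=lambda x, t: -x,
                             f=lambda x, t: -0.5 * x,
                             g=lambda t: 1.0)
# samples are approximately N(0, 1)
```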
SMLD/DDPM
Forward (N steps): \(x_i = x_{i-1} + \sqrt{\sigma_i^2 - \sigma_{i-1}^2}\,z_{i-1}\), \(z_{i-1} \sim \mathcal{N}(0, I)\)
SMLD as an Itô SDE: as \(N \to \infty\), this becomes \(dx = \sqrt{\tfrac{d[\sigma^2(t)]}{dt}}\,dw\) (the "variance-exploding" SDE)
Similarly for DDPM: \(x_i = \sqrt{1 - \beta_i}\,x_{i-1} + \sqrt{\beta_i}\,z_{i-1}\) becomes \(dx = -\tfrac{1}{2}\beta(t)\,x\,dt + \sqrt{\beta(t)}\,dw\) (the "variance-preserving" SDE)
SDE Interpretation:
Forward: \(dx = f(x,t)\,dt + g(t)\,dw\)
Reverse: \(dx = \left[f(x,t) - g(t)^2 \nabla_x \log p_t(x)\right]dt + g(t)\,d\bar{w}\)
SMLD: \(f = 0\), \(g = \sqrt{d[\sigma^2(t)]/dt}\)
DDPM: \(f = -\tfrac{1}{2}\beta(t)\,x\), \(g = \sqrt{\beta(t)}\)
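As a quick numerical sanity check of the SMLD-as-SDE correspondence: each SMLD noising step adds Gaussian noise of variance \(\sigma_i^2 - \sigma_{i-1}^2\), so the whole chain accumulates exactly the variance of the continuous variance-exploding SDE, independent of the discretization (the schedule below is illustrative):

```python
import numpy as np

# Each SMLD step adds Gaussian noise of variance sigma_i^2 - sigma_{i-1}^2
# (the discretized variance-exploding SDE), so the whole chain accumulates
# variance sigma_N^2 - sigma_0^2 regardless of how many steps are used.
sigmas = np.geomspace(0.01, 10.0, 100)    # illustrative noise schedule
x = np.zeros(200_000)                     # start from a point mass at 0
for s_prev, s_next in zip(sigmas[:-1], sigmas[1:]):
    x = x + np.sqrt(s_next**2 - s_prev**2) * np.random.randn(x.shape[0])
# Empirical std of x is ~ sigma_N = 10, matching the continuous-time limit.
```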
1. Both SMLD and DDPM learn the score function of the noisy distributions.
2. Score-based and denoiser approaches are discrete instantiations of the SDE approach.
Papers: Sohl-Dickstein, SMLD, DDPM, SDE, Permenter, Understanding Diffusion Models
Blogs: Lilian Weng, Yang Song, Peter E. Holderrieth
Tutorials: Yang Song, "What are Diffusion Models"
Diffusion in Robotics: https://github.com/mbreuss/diffusion-literature-for-robotics