CVIP 2.0
Agenda of this Lecture:
Posterior
Generative Model
Data reconstruction using VAEs
Diffusion Models
Expectation
Jensen's Inequality
KL Divergence
VAE Loss
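As a compact reference tying the four headings above together, here is a sketch of how Jensen's inequality turns the log-evidence into the ELBO that the VAE loss is built from (the encoder \( q_\phi(z \mid x) \), decoder \( p_\theta(x \mid z) \), and prior \( p(z) \) notation is assumed, not taken from the slides):
\[
\log p_\theta(x) \;=\; \log \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \frac{p_\theta(x, z)}{q_\phi(z \mid x)} \right]
\;\geq\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \log \frac{p_\theta(x, z)}{q_\phi(z \mid x)} \right]
\]
\[
=\; \underbrace{\mathbb{E}_{q_\phi(z \mid x)}\big[ \log p_\theta(x \mid z) \big]}_{\text{Reconstruction}}
\;-\; \underbrace{D_{\mathrm{KL}}\big( q_\phi(z \mid x) \,\|\, p(z) \big)}_{\text{Prior Matching}}
\]
The VAE loss is the negative of this bound, which is where the Reconstruction + Prior Matching split later in the outline comes from.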
Notation
What does \( \beta_t \) do?
Forward and Reverse Processes
Forward Process
Reverse Process
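For reference, the standard DDPM formulation behind the headings above (assumed to match the lecture's notation, with \( \alpha_t = 1 - \beta_t \) and \( \bar{\alpha}_t = \prod_{s=1}^{t} \alpha_s \)): \( \beta_t \) is the variance of the Gaussian noise injected at step \( t \) of the forward process, and the reverse process is a learned Gaussian that undoes one step at a time.
\[
q(x_t \mid x_{t-1}) = \mathcal{N}\big(x_t;\ \sqrt{1-\beta_t}\, x_{t-1},\ \beta_t \mathbf{I}\big),
\qquad
q(x_t \mid x_0) = \mathcal{N}\big(x_t;\ \sqrt{\bar{\alpha}_t}\, x_0,\ (1-\bar{\alpha}_t)\mathbf{I}\big)
\]
\[
p_\theta(x_{t-1} \mid x_t) = \mathcal{N}\big(x_{t-1};\ \mu_\theta(x_t, t),\ \Sigma_\theta(x_t, t)\big)
\]
A larger \( \beta_t \) means a noisier step, so the schedule \( \beta_1, \dots, \beta_T \) controls how quickly \( x_t \) approaches pure noise.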
Diffusion Loss
Calvin Luo's Diffusion Tutorial
Diffusion Loss
Reverse Process Update
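A sketch of the update the "Reverse Process Update" heading refers to, assuming the standard noise-prediction parameterization \( \epsilon_\theta \) (the DDPM ancestral sampling step; the lecture may use a different variance choice):
\[
x_{t-1} \;=\; \frac{1}{\sqrt{\alpha_t}} \left( x_t - \frac{1 - \alpha_t}{\sqrt{1 - \bar{\alpha}_t}}\, \epsilon_\theta(x_t, t) \right) + \sigma_t z,
\qquad z \sim \mathcal{N}(\mathbf{0}, \mathbf{I}), \quad \sigma_t^2 \in \{\beta_t,\ \tilde{\beta}_t\},
\]
with \( z = \mathbf{0} \) at the final step.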
Summary
References
CVIP 2.0
Agenda of this Lecture:
Gaussian Variable
\( \mathcal{L}_{\text{VAE}} = \text{Reconstruction} + \text{Prior Matching} \)
\( \mathcal{L}_{\text{Diff}} = \text{Reconstruction} + \text{Prior Matching} + \text{Noise Matching} \)
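Spelled out term by term (following the derivation in Calvin Luo's tutorial cited above; the exact conditioning and sign conventions are assumed, since the ELBO is maximized while the loss is minimized):
\[
\mathcal{L}_{\text{VAE}}:\quad
\underbrace{\mathbb{E}_{q_\phi(z \mid x)}\big[ \log p_\theta(x \mid z) \big]}_{\text{Reconstruction}}
\;-\;
\underbrace{D_{\mathrm{KL}}\big( q_\phi(z \mid x) \,\|\, p(z) \big)}_{\text{Prior Matching}}
\]
\[
\mathcal{L}_{\text{Diff}}:\quad
\underbrace{\mathbb{E}_{q(x_1 \mid x_0)}\big[ \log p_\theta(x_0 \mid x_1) \big]}_{\text{Reconstruction}}
\;-\;
\underbrace{D_{\mathrm{KL}}\big( q(x_T \mid x_0) \,\|\, p(x_T) \big)}_{\text{Prior Matching}}
\;-\;
\sum_{t=2}^{T} \underbrace{\mathbb{E}_{q(x_t \mid x_0)}\Big[ D_{\mathrm{KL}}\big( q(x_{t-1} \mid x_t, x_0) \,\|\, p_\theta(x_{t-1} \mid x_t) \big) \Big]}_{\text{Noise Matching}}
\]
In the diffusion case the prior-matching term has no trainable parameters (the forward process is fixed), which is why training reduces to the noise-matching objective in the next formula of the outline.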
\( \mathcal{L}_{\text{Diffusion-Training}} = \text{Noise Matching} \)
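In practice the noise-matching term is trained with the simplified objective below (a sketch under the usual assumptions: \( x_t = \sqrt{\bar{\alpha}_t}\, x_0 + \sqrt{1 - \bar{\alpha}_t}\, \epsilon \), uniform sampling of \( t \), and the constant weighting of Ho et al., 2020):
\[
\mathcal{L}_{\text{Diffusion-Training}}
\;=\;
\mathbb{E}_{x_0,\; t \sim \mathcal{U}\{1, \dots, T\},\; \epsilon \sim \mathcal{N}(\mathbf{0}, \mathbf{I})}
\Big[ \big\| \epsilon - \epsilon_\theta\big( \sqrt{\bar{\alpha}_t}\, x_0 + \sqrt{1 - \bar{\alpha}_t}\, \epsilon,\; t \big) \big\|^2 \Big]
\]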
Unconditional Image Generation
Classifier Guidance
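For reference, classifier guidance (Dhariwal and Nichol, 2021; assumed to be the variant the slide covers) biases the unconditional noise prediction with the gradient of a classifier \( p_\phi(y \mid x_t) \) trained on noisy images, scaled by a guidance weight \( s \):
\[
\hat{\epsilon}_\theta(x_t, t) \;=\; \epsilon_\theta(x_t, t) \;-\; s\, \sqrt{1 - \bar{\alpha}_t}\; \nabla_{x_t} \log p_\phi(y \mid x_t)
\]
Equivalently, in score terms, \( \nabla_{x_t} \log p(x_t \mid y) = \nabla_{x_t} \log p(x_t) + \nabla_{x_t} \log p(y \mid x_t) \), so guidance adds the classifier's score to the unconditional one.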
UNet
Cross Attention Maps
Cross Attention Maps for Editing
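The cross-attention maps in the last two headings are the softmax weights of the UNet's cross-attention layers, where the queries \( Q \) come from spatial image features and the keys/values \( K, V \) come from the text-encoder embeddings (the standard latent-diffusion formulation; assumed to match the lecture):
\[
M \;=\; \mathrm{softmax}\!\left( \frac{Q K^{\top}}{\sqrt{d}} \right),
\qquad
\mathrm{CrossAttention}(Q, K, V) \;=\; M V
\]
Each column of \( M \) is a spatial attention map for one text token; editing methods in the Prompt-to-Prompt family reuse or re-weight these maps between a source and an edited generation to localize changes.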