Course Resources
This page contains curated resources to support your learning throughout the course. Resources are organized into books, courses, tutorials, and research papers. This list will be updated throughout the course.
Books
2025
Comprehensive monograph covering diffusion models, flow matching, and transport-based generative modeling from first principles.
Courses
2023
Stanford course on generative models including VAEs, GANs, EBMs, normalizing flows, diffusion models, and autoregressive models.
2025
CMU course on generative models including LLMs, GANs, and diffusion models.
2025
CMU course on generative models including LLMs, VAEs, and diffusion models.
2025
MIT class on diffusion and flow matching from a flow-based theoretical perspective.
2025
CMU course that focuses on probabilistic modeling (including some deep generative models from a more theoretical perspective).
2024
Stanford course that focuses on probabilistic modeling.
Tutorials
2021
Introduction to score-based generative models and their connection to diffusion models.
2021
Comprehensive introduction to diffusion models with clear explanations and intuitive visualizations.
2022
Unifies VAEs, hierarchical VAEs, and diffusion models under a single framework.
2024
Comprehensive guide to flow matching with code examples and applications.
Key Papers
2015
The original paper introducing diffusion probabilistic models for generative modeling.
2020
Landmark paper that made diffusion models practical and effective for high-quality image generation; a minimal sketch of its noise-prediction objective appears after this list.
2021
Unifies score-based models and diffusion models using SDEs, enabling new sampling methods.
2022
Introduces flow matching as a simulation-free approach to training continuous normalizing flows; see the sketch after this list.
2022
Introduces rectified flow to learn straight trajectories between distributions for fast sampling.
2022
General framework for building normalizing flows via stochastic interpolation between distributions.
2021
The original fast-sampling algorithm paper for diffusion models.
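As a rough illustration of the DDPM objective referenced above, here is a minimal PyTorch-style sketch of the noise-prediction training loss. The `model(x_t, t)` interface (a network predicting the added noise) and the `alpha_bar` argument (cumulative products of 1 − βₜ) are illustrative assumptions, not the paper's exact code.

```python
import torch

def ddpm_loss(model, x0, alpha_bar, T):
    """Simplified DDPM noise-prediction loss (epsilon-MSE objective)."""
    b = x0.shape[0]
    t = torch.randint(0, T, (b,), device=x0.device)        # random timestep per sample
    eps = torch.randn_like(x0)                              # Gaussian noise to be predicted
    a = alpha_bar[t].view(b, *([1] * (x0.dim() - 1)))       # broadcast schedule to x0's shape
    x_t = a.sqrt() * x0 + (1 - a).sqrt() * eps              # forward (noising) process sample
    return torch.mean((model(x_t, t) - eps) ** 2)           # predict the noise, not the image
```

And for the flow matching entry above, a minimal sketch of a conditional flow matching loss using the simplest straight-line path between a Gaussian prior and the data (the rectified-flow-style instantiation); the `velocity_model(x_t, t)` interface is a hypothetical placeholder.

```python
import torch

def flow_matching_loss(velocity_model, x1):
    """Conditional flow matching with a linear interpolation path."""
    b = x1.shape[0]
    x0 = torch.randn_like(x1)                                # noise endpoint of the path
    t = torch.rand(b, device=x1.device)                      # t ~ Uniform(0, 1)
    tb = t.view(b, *([1] * (x1.dim() - 1)))
    x_t = (1 - tb) * x0 + tb * x1                            # point on the straight path
    target_v = x1 - x0                                       # constant target velocity
    return torch.mean((velocity_model(x_t, t) - target_v) ** 2)
```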
Advanced Papers
2022
Systematic analysis of design choices in diffusion models with improved sampling.
2022
Fast high-order ODE solver for diffusion models.
2022
Improved ODE solver with support for guided sampling.
2021
Improved DDPM with learned variance and cosine noise schedule.
2022
Introduces v-prediction parameterization and progressive distillation.
2022
Enables conditional generation without training a separate classifier; see the sketch after this list.
2023
New family of models that enable single-step generation while maintaining sample quality.
2021
Introduces latent diffusion models, the foundation of Stable Diffusion.
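For the classifier-free guidance entry above, a minimal sketch of how the conditional and unconditional noise predictions are typically combined at sampling time. The `eps_model(x_t, t, c)` interface and the `null_cond` placeholder (the "empty" condition used for condition dropout during training) are illustrative assumptions.

```python
import torch

@torch.no_grad()
def cfg_noise_prediction(eps_model, x_t, t, cond, null_cond, guidance_scale=5.0):
    """Classifier-free guidance: blend conditional and unconditional predictions."""
    eps_cond = eps_model(x_t, t, cond)          # prediction with the desired condition
    eps_uncond = eps_model(x_t, t, null_cond)   # prediction with the empty condition
    # guidance_scale = 1 recovers the purely conditional prediction;
    # larger values trade diversity for condition fidelity.
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)
```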
Compute Resources
Access to GPU compute resources is essential for training your generative models. Below are the available platforms with tutorials and guides to help you get started.
Modal
Available: Serverless cloud computing platform for running GPU workloads (see the sketch below)
AWS (Amazon Web Services)
Coming Soon: Cloud computing platform with EC2 GPU instances
SLURM Clusters
Available: Tutorials for navigating SLURM-based clusters (like the ones we use at CMU)
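For the Modal entry above, a minimal sketch of running a function on a cloud GPU, assuming the current Modal Python SDK (`modal.App`, `@app.function`); the app name, GPU type, image contents, and timeout are placeholder choices, so check Modal's documentation for what is available to your account.

```python
import modal

app = modal.App("course-gpu-demo")                        # hypothetical app name
image = modal.Image.debian_slim().pip_install("torch")    # container image with PyTorch

@app.function(gpu="A10G", image=image, timeout=60 * 60)
def gpu_check():
    import torch
    x = torch.randn(8, 3, 32, 32, device="cuda")          # allocate a tensor on the GPU
    return float(x.mean())

@app.local_entrypoint()
def main():
    print(gpu_check.remote())                             # runs remotely on the GPU worker
```

Launching it with `modal run <file>.py` (after authenticating with Modal) should execute `gpu_check` on a remote GPU worker.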
Additional Resources
- Discord: Communication server for discussions, questions, and collaboration
- Office Hours: In-person and virtual time with the instructor; check the home page and the schedule page for times and locations.
Note: This list will be updated throughout the course. Additional readings specific to each lecture can be found on the schedule page.