Course Resources

This page contains curated resources to support your learning throughout the course, organized into books, courses, tutorials, and research papers.


πŸ“š Books

The Principles of Diffusion Models
Chieh-Hsin Lai, Yang Song, Dongjun Kim, Yuki Mitsufuji, Stefano Ermon
2025
Comprehensive monograph covering diffusion models, flow matching, and transport-based generative modeling from first principles.

πŸŽ“ Courses

CS236: Deep Generative Models
Stefano Ermon, Aditya Grover
2023
Stanford course on generative models including VAEs, GANs, EBMs, normalizing flows, diffusion models, and autoregressive models.
10-423/10-623: Generative AI
Matt Gormley, Yuanzhi Li, Henry Chai, Pat Virtue, Aran Nayebi
2025
CMU course on generative models including LLMs, GANs, and diffusion models.
Beidi Chen, Xun Huang
2025
CMU course on generative models including LLMs, VAEs, and diffusion models.
6.S184: Generative AI with Stochastic Differential Equations
Peter Holderrieth, Ezra Erives
2025
MIT class on diffusion and flow matching from a flow-based theoretical perspective.
Andrej Risteski, Albert Gu
2025
CMU course that focuses on probabilistic modeling (including some deep generative models from a more theoretical perspective).
CS228: Probabilistic Graphical Models
Stefano Ermon
2024
Stanford course that focuses on probabilistic modeling.

πŸ“ Tutorials

Generative Modeling by Estimating Gradients of the Data Distribution
Yang Song
2021
Introduction to score-based generative models and their connection to diffusion models.
What are Diffusion Models?
Lilian Weng
2021
Comprehensive introduction to diffusion models with clear explanations and intuitive visualizations.
Understanding Diffusion Models: A Unified Perspective
Calvin Luo
2022
Unifies VAEs, hierarchical VAEs, and diffusion models under a single framework.
Flow Matching Guide and Code
Yaron Lipman, Marton Havasi, Peter Holderrieth, Neta Shaul, Matt Le, Brian Karrer, Ricky T. Q. Chen, David Lopez-Paz, Heli Ben-Hamu, Itai Gat
2024
Comprehensive guide to flow matching with code examples and applications.
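The linear-interpolation construction at the heart of flow matching is simple enough to sketch in a few lines of NumPy. This is a toy illustration (my own, not code from the guide above): it builds the conditional path x_t = (1 − t)·x_0 + t·x_1 with target velocity u_t = x_1 − x_0, and stands in a deliberately trivial constant velocity field for what would normally be a neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: noise x0 ~ N(0, I), "data" x1 as a shifted Gaussian.
n, d = 256, 2
x0 = rng.standard_normal((n, d))
x1 = rng.standard_normal((n, d)) + 3.0

# Conditional flow matching with the linear (rectified-flow style) path:
#   x_t = (1 - t) x0 + t x1,  target velocity u_t = x1 - x0.
t = rng.uniform(size=(n, 1))
xt = (1.0 - t) * x0 + t * x1
ut = x1 - x0

# A trivial stand-in model: a constant velocity field v(x, t) = c.
# The CFM loss E ||v - u_t||^2 is minimized by the mean displacement.
c = ut.mean(axis=0)                      # close to (3, 3) for this toy data
loss = np.mean(np.sum((c - ut) ** 2, axis=1))
```

A real implementation replaces `c` with a network v_θ(x_t, t) trained by gradient descent on the same regression loss; no ODE simulation is needed during training.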

πŸ“„ Key Papers

Deep Unsupervised Learning using Nonequilibrium Thermodynamics
Jascha Sohl-Dickstein, Eric A. Weiss, Niru Maheswaranathan, Surya Ganguli
2015
The original paper introducing diffusion probabilistic models for generative modeling.
Denoising Diffusion Probabilistic Models
Jonathan Ho, Ajay Jain, Pieter Abbeel
2020
Landmark paper that made diffusion models practical and effective for high-quality image generation.
Score-Based Generative Modeling through Stochastic Differential Equations
Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
2021
Unifies score-based models and diffusion models using SDEs, enabling new sampling methods.
Flow Matching for Generative Modeling
Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le
2022
Introduces flow matching as a simulation-free approach to training continuous normalizing flows.
Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow
Xingchao Liu, Chengyue Gong, Qiang Liu
2022
Introduces rectified flow to learn straight trajectories between distributions for fast sampling.
Building Normalizing Flows with Stochastic Interpolants
Michael S. Albergo, Eric Vanden-Eijnden
2022
General framework for building normalizing flows via stochastic interpolation between distributions.
Denoising Diffusion Implicit Models
Jiaming Song, Chenlin Meng, Stefano Ermon
2021
Introduces DDIM, the original fast, deterministic sampling algorithm for diffusion models.
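To make the DDPM entry above concrete: the forward (noising) process has the closed form x_t = √ᾱ_t·x_0 + √(1 − ᾱ_t)·ε, which is all a training loop needs to generate regression targets. A minimal NumPy sketch (a toy illustration, not code from any of the papers):

```python
import numpy as np

# Linear beta schedule as in Ho et al. (2020), with T = 1000 steps.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - betas)   # abar_t = prod_{s <= t} (1 - beta_s)

def forward_noise(x0, t, alpha_bar, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(abar_t) x_0, (1 - abar_t) I)."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return xt, eps                    # eps is the network's regression target

rng = np.random.default_rng(0)
x0 = rng.standard_normal((4, 2))      # toy "data" batch
xt, eps = forward_noise(x0, T - 1, alpha_bar, rng)
# At t = T - 1, abar_t is near 0, so x_t is approximately standard Gaussian.
```

Training then amounts to sampling (x_0, t, ε), forming x_t this way, and minimizing ‖ε_θ(x_t, t) − ε‖².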

πŸš€ Advanced Papers

Elucidating the Design Space of Diffusion-Based Generative Models
Tero Karras, Miika Aittala, Timo Aila, Samuli Laine
2022
Systematic analysis of design choices in diffusion models with improved sampling.
DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling in Around 10 Steps
Cheng Lu, Yuhao Zhou, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu
2022
Fast high-order ODE solver for diffusion models.
DPM-Solver++: Fast Solver for Guided Sampling of Diffusion Probabilistic Models
Cheng Lu, Yuhao Zhou, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu
2022
Improved solver with guided sampling support.
Improved Denoising Diffusion Probabilistic Models
Alex Nichol, Prafulla Dhariwal
2021
Improved DDPM with learned variance and cosine noise schedule.
Variational Diffusion Models
Diederik P. Kingma, Tim Salimans, Ben Poole, Jonathan Ho
2021
Continuous-time diffusion with learned noise schedule.
Progressive Distillation for Fast Sampling of Diffusion Models
Tim Salimans, Jonathan Ho
2022
Introduces v-prediction parameterization and progressive distillation.
Classifier-Free Diffusion Guidance
Jonathan Ho, Tim Salimans
2022
Enables conditional generation without training a separate classifier.
Consistency Models
Yang Song, Prafulla Dhariwal, Mark Chen, Ilya Sutskever
2023
New family of models that enable single-step generation while maintaining sample quality.
High-Resolution Image Synthesis with Latent Diffusion Models
Robin Rombach, Andreas Blattmann, Dominik Lorenz, Patrick Esser, BjΓΆrn Ommer
2021
Introduces latent diffusion models, the foundation of Stable Diffusion.
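As a small illustration of one entry above: classifier-free guidance combines the conditional and unconditional network outputs at sampling time. A toy sketch of the combination rule (guidance-weight conventions differ across papers; some write the equivalent form (1 + w)·ε_cond − w·ε_uncond, shifting w by one):

```python
import numpy as np

def cfg_combine(eps_uncond, eps_cond, w):
    """Classifier-free guidance: extrapolate from the unconditional
    prediction toward the conditional one with guidance weight w.
    w = 0 is unconditional sampling; w = 1 is plain conditional sampling."""
    return eps_uncond + w * (eps_cond - eps_uncond)

# Toy 2-D noise predictions standing in for network outputs.
eps_u = np.array([0.0, 0.0])
eps_c = np.array([1.0, -1.0])
guided = cfg_combine(eps_u, eps_c, w=2.0)   # pushed beyond eps_c
```

With w > 1, the sample is pushed beyond the conditional prediction, trading sample diversity for stronger adherence to the conditioning signal.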

Compute Resources

Access to GPU compute resources is essential for training your generative models. Below are the available platforms with tutorials and guides to help you get started.

Modal

Available
Serverless cloud computing platform for running GPU workloads

AWS (Amazon Web Services)

Coming Soon
Cloud computing platform with EC2 GPU instances

SLURM Clusters

Available
Tutorials for navigating SLURM-based clusters (like the ones we use at CMU)
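As a starting point before the tutorials, a minimal SLURM batch script for a single-GPU job might look like the sketch below. The resource values, script names (train.py, config.yaml), and paths are placeholders; exact partition, account, and GPU flags vary by cluster, so check your cluster's documentation.

```shell
#!/bin/bash
# Minimal SLURM batch script for a single-GPU training job (illustrative).
#SBATCH --job-name=train-diffusion
#SBATCH --gres=gpu:1            # request one GPU
#SBATCH --cpus-per-task=4
#SBATCH --mem=16G
#SBATCH --time=04:00:00         # wall-clock limit (HH:MM:SS)
#SBATCH --output=logs/%x-%j.out # %x = job name, %j = job id

python train.py --config config.yaml
```

Submit with `sbatch train.sh`, monitor with `squeue -u $USER`, and cancel with `scancel <jobid>`.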

Additional Resources

  • Discord: Communication server for discussions, questions, and collaboration
  • Office Hours: In-person and virtual time directly with the instructor; check the home page and the schedule page for times and locations.

Note: This list will be updated throughout the course. Additional readings specific to each lecture can be found on the schedule page.