# Diffusion Models: From Theory to Practice (6.S982): Spring '25, MIT
**Prerequisites 📚:** Machine learning (6.7900 or similar), probability (6.3700, 18.600 or similar), linear algebra (18.06, 6.C06[J] or similar), and calculus (18.02 or similar).
**Meeting time 🕑:** Tuesdays, 1-4 p.m.
**Location 📍:** E25-111
**Satisfies**: II, AAGS, Concentration subject in AI
**Instructors 🧑‍🏫:** [Costis Daskalakis](http://people.csail.mit.edu/costis/) and [Giannis Daras](https://giannisdaras.github.io/)
**Teaching Assistant 🎓**: [Vardis Kandiros](https://vardiskandiros.com/)
**Office Hours 🕔:** Tuesdays 5-7 pm (after class) or by appointment.
**Description 📖:** Deep generative models have found a plethora of applications in Machine Learning and various other scientific and applied fields, where they are used for sampling complex, high-dimensional distributions and leveraged in downstream analyses involving such distributions. This course focuses on the foundations, applications, and frontier challenges of diffusion-based generative models, which in recent years have become the prominent approach to generative modeling across a wide range of data modalities and form the backbone of industry-scale systems like AlphaFold 3, DALL-E, and Stable Diffusion. Topics include mathematical aspects of diffusion-based models (including forward and inverse diffusion processes, Fokker-Planck equations, computational and statistical complexity aspects of score estimation), the use of diffusion models in downstream analysis tasks (such as inverse problems), extensions of diffusion models (including rectified flows, stochastic interpolants, and Schrödinger bridges), and frontier challenges motivated by practical considerations (including consistency models, guidance, training with noisy data).
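As background for the forward and inverse diffusion processes mentioned above, the standard setup (not specific to this course's notation) pairs a forward noising SDE with a score-based reverse SDE:

$$
\mathrm{d}x_t = f(x_t, t)\,\mathrm{d}t + g(t)\,\mathrm{d}W_t,
$$

which, run backwards in time, is matched in distribution by

$$
\mathrm{d}x_t = \left[f(x_t, t) - g(t)^2\,\nabla_x \log p_t(x_t)\right]\mathrm{d}t + g(t)\,\mathrm{d}\bar{W}_t,
$$

where $p_t$ is the marginal density of $x_t$ and $\nabla_x \log p_t$ is the score that generative models learn to estimate.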
**Syllabus and slides 📒**:
**Lecture 1**: Introduction to generative models and their applications (GANs, VAEs, Flows, Diffusion Models, and Inverse Problems): [slides](https://drive.google.com/file/d/1bRwlXINE16v9OTljVRg5fG5Ii3Ft_VU7/view?usp=sharing).
**Lecture 2**: Deep dive into Diffusion Models (definition of the forward process, Itô integral, Itô formula, FP equation, reversibility, deterministic samplers, Tweedie's formula, Denoising Score Matching): [slides part I](https://docs.google.com/presentation/d/1lJ3OSYE4kwsKhqUvO1PghpYqpk6SCR7n/edit?usp=sharing&ouid=114676647227448618979&rtpof=true&sd=true), [slides part II](https://drive.google.com/file/d/18thXsM2uQIsp6R1eTiAsaqXQEd2M3118/view?usp=sharing).
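Tweedie's formula from Lecture 2 can be checked numerically in a toy 1-D Gaussian setting, where the score of the noisy marginal is known in closed form. The setup below (our own illustration, not from the course slides) verifies that $\mathbb{E}[x_0 \mid x_t] = x_t + \sigma^2 \nabla \log p_t(x_t)$ by comparing the formula against a Monte-Carlo estimate of the posterior mean:

```python
import numpy as np

# Toy check of Tweedie's formula for x0 ~ N(mu0, s0^2), x_t = x0 + sigma * eps.
rng = np.random.default_rng(0)
mu0, s0, sigma = 1.0, 2.0, 0.5

x0 = rng.normal(mu0, s0, size=200_000)
xt = x0 + sigma * rng.normal(size=x0.shape)

# The marginal of x_t is N(mu0, s0^2 + sigma^2), so its score is analytic.
var_t = s0**2 + sigma**2

def score(x):
    return -(x - mu0) / var_t

# Tweedie denoiser at a query point q, vs. a Monte-Carlo posterior mean
# obtained by conditioning on x_t falling near q.
q = 1.7
tweedie = q + sigma**2 * score(q)
mask = np.abs(xt - q) < 0.02
mc_posterior_mean = x0[mask].mean()

print(tweedie, mc_posterior_mean)  # the two estimates should be close
```

For Gaussians the posterior mean is also available directly, $(s_0^2 q + \sigma^2 \mu_0)/(s_0^2 + \sigma^2)$, and Tweedie's formula reproduces it exactly; the Monte-Carlo estimate agrees up to sampling error.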
**Lecture 3**: Diffusion models discretization error analysis: [slides](https://drive.google.com/file/d/1H2P9fcpoRbEh_KHotPt4ccOJ5oRPLaFl/view?usp=sharing).
**Lecture 4**: Part I: Learning diffusion models from corrupted data: [slides](https://drive.google.com/file/d/1W4-Yds6pnaz1GEwBPa4cJFCAE91drvfX/view?usp=sharing), Part II: Likelihoods and Latent Diffusion: [slides](https://drive.google.com/file/d/18PNXn-IX3iLJm8iTGk0vGUX4-frxPqVN/view?usp=sharing).
**Lecture 5**: Flow Matching: [slides](https://drive.google.com/file/d/1ZPAgDgSl54Kv3fGNbsD1R8x12YVB1hgp/view?usp=sharing).
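The flow-matching objective from Lecture 5 regresses a velocity field onto the conditional target $x_1 - x_0$ along the linear interpolant $x_t = (1-t)x_0 + t x_1$. A minimal sketch (our own toy setup, not from the course slides) uses a deterministic coupling so the optimal velocity field has a closed form and the loss can be verified to vanish:

```python
import numpy as np

# Conditional flow matching on a toy coupling x1 = 0.5 * x0 + b.
rng = np.random.default_rng(0)
b = np.array([2.0, -1.0])

x0 = rng.normal(size=(4096, 2))   # source (noise) samples
x1 = 0.5 * x0 + b                 # toy "data" samples, deterministically coupled
t = rng.uniform(size=(4096, 1))

xt = (1 - t) * x0 + t * x1        # point on the interpolating path
target_v = x1 - x0                # conditional velocity target

def optimal_v(x, t):
    # With this coupling, x_t = (1 - 0.5 t) x0 + t b; invert for x0,
    # then v = x1 - x0 = -0.5 * x0 + b.  In practice this field is a
    # neural network trained on the regression loss below.
    x0_hat = (x - t * b) / (1 - 0.5 * t)
    return -0.5 * x0_hat + b

loss = np.mean((optimal_v(xt, t) - target_v) ** 2)
print(loss)  # ~0: the optimal field matches the conditional target here
```

With a stochastic coupling (e.g. independent noise and data) the conditional targets no longer agree pointwise, and the regression instead recovers their conditional expectation, which is what makes the marginal flow correct.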
**Lecture 6**: Diffusion models and inverse problems: [notes](https://drive.google.com/file/d/17WhCbN6sbRCo4Y3_wROtuI84rdnlGH3u/view?usp=sharing).
**Lecture 7**: Schrödinger bridges: [slides](https://drive.google.com/file/d/1OVYTWJbf4KFeZW12q_JxTEYKg5MVPPtM/view?usp=sharing).
**Grading 📊**: 50% group project, 25% paper presentation, 25% quizzes.
**Contact 📧**: Questions about the class? Send an email to costis[at]mit[dot]edu or gdaras[at]mit[dot]edu.