Proficiency in Python: Coding assignments will be in Python.
Calculus and Linear Algebra: You should be comfortable taking (multivariable) derivatives and understand matrix/vector notation and operations.
Probability Theory: You should be familiar with basic probability distributions (e.g., uniform, Gaussian, Bernoulli) and be able to define, for both continuous and discrete random variables: expectation, independence, probability density/mass functions, and cumulative distribution functions.
Note: Familiarity with PyTorch, along with prior exposure to basic machine learning concepts, neural networks, optimization, and backpropagation, is desirable but not essential.
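As a rough self-check of these prerequisites (this sketch is illustrative only and not part of the graded material), you should be able to follow short Python snippets like the one below, which computes the expectation of a Bernoulli variable, evaluates a Gaussian CDF via the error function, and verifies a hand-derived multivariable gradient against finite differences:

```python
import math

# Expectation of a Bernoulli(p) random variable: E[X] = 0*(1-p) + 1*p = p
def bernoulli_mean(p):
    return 0 * (1 - p) + 1 * p

# CDF of a Gaussian N(mu, sigma^2), written in terms of the error function
def gaussian_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hand-derived partial derivatives of f(x, y) = x^2 * y,
# checked against central finite differences
def f(x, y):
    return x ** 2 * y

def grad_f(x, y):
    return (2 * x * y, x ** 2)  # (df/dx, df/dy)

h = 1e-6
x, y = 1.5, -0.5
fd_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
fd_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)

print(bernoulli_mean(0.3))          # expectation of Bernoulli(0.3)
print(gaussian_cdf(0.0))            # standard Gaussian CDF at 0
print(grad_f(x, y), (fd_dx, fd_dy)) # analytic vs. numerical gradient
```

If each line here feels routine, you are well prepared for the mathematical portions of the course.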
Introduction to the course - Modalities, timing, grading scheme, syllabus [lecture note]
Introduction to Generative Models and Review of Probability Theory (Part 1) [Lecture note][Lecture video]
Review of Probability Theory (Part 2) [Lecture note][Lecture video]
Review of Probability Theory (Part 3) [Lecture note][Lecture video]
Autoregressive Generative Models (Part 1) [Lecture note][Lecture video]
Autoregressive Generative Models (Part 2) [Lecture note][Lecture video]
Latent Variable Models and VAEs (Part 1) [Lecture note][Lecture video]
Variational Autoencoders (Part 2) [Lecture note][Lecture video]
Variational Autoencoders (Part 3) [Lecture note][Lecture video]
Flow-based Models (Part 1) [Lecture note][Lecture video]
Introduction to Statistical Computing and Probability and Statistics
Transformation of distributions and introduction to Generative Modeling and Variational Divergence Minimization
Maximum Likelihood Learning and Autoregressive generative models
Latent Variable Models (LVMs) and Variational Autoencoders (VAE)
Generative Adversarial Learning
Invertible Neural Networks and Normalizing Flows, Glow Model and Real NVP
Energy-based models and Score-based models
Denoising Diffusion Probabilistic Model and Denoising Diffusion Implicit Model
Flow Matching
Lecture notes and references will be provided on the course website. The following reading material is recommended:
Generative Modeling by Estimating Gradients of the Data Distribution. Yang Song. Blog post on score-based generative models, May 2021.
How to Train Your Energy-Based Models. Yang Song and Diederik P. Kingma. February 2021.
Tutorial on Deep Generative Models. Aditya Grover and Stefano Ermon. International Joint Conference on Artificial Intelligence, July 2018.
Tutorial on Generative Adversarial Networks. Computer Vision and Pattern Recognition, June 2018.
Tutorial on Deep Generative Models. Shakir Mohamed and Danilo Rezende. Uncertainty in Artificial Intelligence, July 2017.
Tutorial on Generative Adversarial Networks. Ian Goodfellow. Neural Information Processing Systems, December 2016.
Learning deep generative models. Ruslan Salakhutdinov. Annual Review of Statistics and Its Application, April 2015.
Homework-1 [homework]
Homework-2 [homework]
Practical-0: Introduction to Neural Networks [Practical0]
Practical-1: Sampling [Practical1]
Practical-2: Autoregressive Generative Model [Practical2]
Each student will have to complete a term project as part of this course.
Credit: 4 Units (3-0-2)
Timing: Lecture - TBD, Practical - TBD
Venue: TBD
Instructor: Souvik Chakraborty
Teaching Assistants: Sawan Kumar, Vagish Kumar, Subhankar Sarkar
Course Objective: In this course, students will be introduced to the fundamentals of Deep Generative Models and will learn to build different generative models from scratch. The course emphasizes the mathematical foundations of these concepts along with their applications. The course is particularly designed for PG, Ph.D., and senior UG students.
Intended audience: Senior UG, PG, and Ph.D. students