Proficiency in Python: Coding assignments will be in Python.
Calculus and Linear Algebra: You should be comfortable taking (multivariable) derivatives and understand matrix/vector notation and operations.
Probability Theory: You should be familiar with basic probability distributions (e.g., Gaussian, Bernoulli) and be able to define the following concepts for both continuous and discrete random variables: expectation, independence, probability density/mass functions, and cumulative distribution functions.
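As a quick self-check of this probability background, the sketch below (an illustration, not course material) computes the expectation and variance of a Bernoulli random variable directly from the definition, and evaluates the standard Gaussian CDF via the error function:

```python
import math

# Bernoulli(p): E[X] = sum over the support of x * P(X = x)
p = 0.3
pmf = {0: 1 - p, 1: p}

mean = sum(x * prob for x, prob in pmf.items())               # E[X] = p
var = sum((x - mean) ** 2 * prob for x, prob in pmf.items())  # Var[X] = p(1 - p)

# Standard Gaussian CDF: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
phi_0 = 0.5 * (1 + math.erf(0 / math.sqrt(2)))                # Phi(0) = 0.5

print(mean, var, phi_0)
```

If these computations feel routine, the probability prerequisite is likely met.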
Note: Familiarity with PyTorch, along with prior knowledge of basic machine learning concepts, neural networks, optimization, and backpropagation, is desirable but not essential.
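For the optimization background mentioned above, a minimal hypothetical self-check (plain Python, no PyTorch required) is gradient descent on a one-dimensional quadratic, using its analytic gradient:

```python
# Minimize f(w) = (w - 3)^2 by gradient descent.
# The analytic gradient is f'(w) = 2 * (w - 3).
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0       # initial guess
lr = 0.1      # learning rate
for _ in range(100):
    w -= lr * grad(w)

# The iterates contract toward the minimizer w* = 3:
# (w_t - 3) shrinks by a factor of (1 - 2*lr) = 0.8 each step.
print(w)
```

Being comfortable with why the step size controls convergence here is roughly the level of optimization intuition the course assumes.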
Introduction to the course - Modalities, timing, grading scheme, syllabus [lecture notes]
Introduction to Generative Models and Review of Probability Theory (Part 1) [Lecture notes][Lecture videos]
Review of Probability Theory (Part 2) [Lecture notes][Lecture videos]
Review of Probability Theory (Part 3) [Lecture notes][Lecture videos]
Autoregressive Generative Models (Part 1) [Lecture notes][Lecture videos]
Autoregressive Generative Models (Part 2) [Lecture notes][Lecture videos]
Latent Variable Models (Part 1) [Lecture notes][Lecture Videos]
Introduction to Statistical Computing and Probability and Statistics
Transformation of distributions and introduction to Generative Modeling and Variational Divergence Minimization
Maximum Likelihood Learning, Variational Autoencoders and their variants
Generative Adversarial Learning and Latent Variable Models
Invertible Neural Networks and Normalizing Flows: the Glow model and Real NVP
Energy-based models and Score-based models
Denoising Diffusion Probabilistic Model and Denoising Diffusion Implicit Model
Flow Matching
Lecture notes and references will be provided on the course website. The following reading material is recommended:
Generative Modeling by Estimating Gradients of the Data Distribution. Yang Song. Blog post on score-based generative models, May 2021.
How to Train Your Energy-Based Models. Yang Song and Diederik P. Kingma. February 2021.
Tutorial on Deep Generative Models. Aditya Grover and Stefano Ermon. International Joint Conference on Artificial Intelligence, July 2018.
Tutorial on Generative Adversarial Networks. Computer Vision and Pattern Recognition, June 2018.
Tutorial on Deep Generative Models. Shakir Mohamed and Danilo Rezende. Uncertainty in Artificial Intelligence, July 2017.
Tutorial on Generative Adversarial Networks. Ian Goodfellow. Neural Information Processing Systems, December 2016.
Learning deep generative models. Ruslan Salakhutdinov. Annual Review of Statistics and Its Application, April 2015.
Homework-1 [homework]
Homework-2 [homework]
Practical-0: Introduction to Neural Networks [Practical0]
Each student will have to complete a term project as part of this course.
Credit: 4 Units (3-0-2)
Timing: Lecture - TBD, Practical - TBD
Venue: TBD
Instructor: Souvik Chakraborty
Teaching Assistants: Sawan Kumar, Vagish Kumar, Subhankar Sarkar
Course Objective: This course introduces students to the fundamentals of deep generative models. Students are expected to learn different generative models from scratch. The course emphasizes the mathematical foundations of these concepts along with their applications, and is particularly designed for PG, Ph.D., and senior UG students.
Intended audience: Senior UG, PG, and Ph.D. students