# Probabilistic Machine Learning for Mechanics

## Lecture notes

Introduction to the course [Lecture Notes][Lecture Video]

Review of Bayesian Statistics - Part 1 [Lecture Notes][Lecture Video]

Review of Bayesian Statistics (conclusion) and Bayesian inference [Lecture Notes][Lecture Video]

Bayesian inference, MLE, MAP, Posterior Predictive [Lecture Notes][Lecture Video]

Prior modeling, Conjugate Prior, Exponential Family [Lecture Notes][Lecture Video]

Non-informative Prior, Hierarchical Bayes, Empirical Bayes [Lecture Notes][Lecture Video]

Empirical Bayes (example) and Bayesian linear regression - Part 1 [Lecture Notes][Lecture Video]

Bayesian linear regression - Part 2 [Lecture Notes][Lecture Video]

Bayesian linear regression - Part 3 [Lecture Notes][Lecture Video]

Bayesian inference using sampling methods - Part 1 [Lecture Notes][Lecture Video]

Bayesian inference using sampling methods - Part 2 [Lecture Notes][Lecture Video]

Bayesian inference using sampling methods - Part 3 [Lecture Notes][Lecture Video]

Bayesian inference using sampling methods - Part 4 [Lecture Notes][Lecture Video]

Bayesian inference using sampling methods - Part 5

Bayesian inference using sampling methods - Part 6

Approximate methods for Bayesian inference - Part 1

Approximate methods for Bayesian inference - Part 2

Gaussian process - Part 1

Gaussian process - Part 2

Sparse Gaussian Process - Part 1

Sparse Gaussian Process - Part 2

Factor analysis, Probabilistic PCA, Dual Probabilistic PCA, and GP-LVM

Deep Gaussian Process

Invertible neural network - Part 1

Invertible neural network - Part 2

Diffusion Model - Part 1

Diffusion Model - Part 2

Review of the course and the way ahead

## Syllabus

Introduction to Statistical Computing; Review of Probability and Statistics

Likelihood, Prior, Posterior, Posterior predictive distribution, Plug-in Approximation

Bayesian linear regression
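
For a flavor of the Bayesian linear regression topic, here is a minimal sketch (illustrative, not course code) of the conjugate Gaussian posterior over the weights, assuming a zero-mean Gaussian prior with precision `alpha` and a Gaussian likelihood with known noise precision `beta` (both names are ours, chosen to match common textbook notation):

```python
import numpy as np

def blr_posterior(X, y, alpha=1.0, beta=25.0):
    """Posterior mean and covariance of the weights under a
    N(0, alpha^-1 I) prior and Gaussian noise with precision beta."""
    d = X.shape[1]
    S_inv = alpha * np.eye(d) + beta * X.T @ X   # posterior precision
    S = np.linalg.inv(S_inv)                     # posterior covariance
    m = beta * S @ X.T @ y                       # posterior mean
    return m, S

# Synthetic example: intercept 0.5, slope -1.3, noise std 0.2
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, 50)])
w_true = np.array([0.5, -1.3])
y = X @ w_true + rng.normal(0.0, 0.2, 50)
m, S = blr_posterior(X, y)   # m should land near w_true
```

With 50 observations the posterior mean recovers the true weights closely, and `S` quantifies the remaining uncertainty.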

Introduction to Monte Carlo Methods, Sampling from Discrete and Continuous Distributions, Inverse Transform Sampling
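
As a small illustration of inverse transform sampling (a hedged sketch, not course material): if U is Uniform(0, 1) and F is a CDF, then F⁻¹(U) is distributed according to F. For the exponential distribution the inverse CDF is available in closed form:

```python
import numpy as np

def sample_exponential(rate, n, rng):
    """Draw n samples from Exp(rate) by inverting its CDF,
    F(x) = 1 - exp(-rate * x)  =>  F^{-1}(u) = -log(1 - u) / rate."""
    u = rng.uniform(0.0, 1.0, n)
    return -np.log(1.0 - u) / rate

rng = np.random.default_rng(42)
samples = sample_exponential(2.0, 100_000, rng)
# The mean of Exp(rate) is 1/rate, so samples.mean() should be near 0.5
```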

Importance sampling, Gibbs sampling, MCMC, Metropolis-Hastings algorithm
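
The Metropolis-Hastings algorithm listed above can be sketched in a few lines (an illustrative random-walk variant, not the course implementation), here targeting a standard normal via its log-density:

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal,
    so the acceptance ratio reduces to the target density ratio."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_target(x0)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step * rng.normal()            # propose a move
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        chain[i] = x                              # rejected moves repeat x
    return chain

# Target: standard normal, log p(x) = -x^2/2 up to a constant
chain = metropolis_hastings(lambda x: -0.5 * x**2, 0.0, 20_000)
```

After discarding a burn-in prefix, the chain's sample mean and standard deviation approach 0 and 1.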

Variational approach and approximate inference

Sparse linear regression

Gaussian process
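
A compact sketch of Gaussian process regression (assumptions ours: an RBF kernel with hand-picked hyperparameters, Gaussian observation noise, Cholesky-based solves):

```python
import numpy as np

def rbf(X1, X2, length=0.2, var=1.0):
    """Squared-exponential (RBF) kernel between two 1-D input sets."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length**2)

def gp_predict(X, y, Xs, noise=1e-2):
    """Posterior mean and covariance at test inputs Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))       # noisy train covariance
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha                          # posterior mean
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v                          # posterior covariance
    return mean, cov

# Fit one period of a sine and predict at its peak and trough
X = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * X)
mean, cov = gp_predict(X, y, np.array([0.25, 0.75]))
```

The posterior mean interpolates the training data, predicting values near +1 at x = 0.25 and -1 at x = 0.75, with `cov` giving the predictive uncertainty.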

Latent variable model, probabilistic PCA, GP-LVM

Some advanced topics in probabilistic ML: Flow-based model, Diffusion model

## References

Lecture notes and references will be provided on the course website. The following books are recommended:

Bishop, C.M., Pattern Recognition and Machine Learning, Springer, 2007.

Murphy, K.P., Machine Learning: A Probabilistic Perspective, MIT Press, 2012.

Rasmussen, C.E., Gaussian Processes in Machine Learning, in Summer School on Machine Learning, pp. 63-71, Springer, Berlin, Heidelberg, 2003.

## Homework

Homework-1: Bayesian linear regression, Sampling method [homework]

Homework-2: Gaussian Process, Approximate methods for Bayesian inference [homework]

Homework-3: Gaussian Process - advanced [homework]

Homework 4: Unsupervised learning and generative modeling [homework]

## Practical

Practical-0: Introduction to statistical computing [QP][solution template][solution]

Practical-1: Priors, Bayesian linear regression [QP]

Practical-2: Sampling method in Bayesian linear regression [QP]

Practical-3: Approximate inference in Bayesian linear regression [QP]

Practical-4: Equation discovery using ML [QP]

## Projects

Each student must complete a term project as part of this course.

## Course info

Credit: 4 Units (3-0-2)

Timing: Lecture - Monday and Thursday (9:30 am - 11:00 am), Practical - Wednesday (3:00 pm - 5:00 pm)

Venue: LH513 (lecture), IV-LT2 (Practical)

Instructor: Dr. Souvik Chakraborty

Teaching Assistants: Tapas Tripura, Sawan Kumar, Subhankar Sarkar

Course Objective: This course introduces students to the fundamentals of probabilistic machine learning and its application in computational mechanics. Students are expected to learn different probabilistic machine learning algorithms and their applications in solving mechanics problems. The course emphasizes the mathematical foundations of these concepts along with their applications. It is particularly designed for PG, Ph.D., and senior UG students.

Intended audience: Senior UG, PG, and Ph.D. students