Machine learning in mechanics

Lecture notes

  1. Introduction to ML [video lecture1][video lecture2][lecture notes]

  2. Review of computational Bayesian statistics [video lecture1][video lecture2][lecture notes]

  3. Review of computational Bayesian statistics (continued) [video lecture][lecture notes]

  4. Review of computational Bayesian statistics (continued) [video lecture1][video lecture2][lecture notes]

  5. Review of computational Bayesian statistics (conclusion) [video lecture1][video lecture2][lecture notes]

  6. Generative Bayesian models for discrete data [video lecture][lecture notes]

  7. Generative Bayesian models for discrete data (continued) [video lecture1][video lecture2][lecture notes]

  8. Generative Bayesian models for discrete data (continued) [video lecture1][video lecture2][lecture notes]

  9. Bayesian model selection [video lecture][lecture notes]

  10. Priors, Hierarchical Priors, Empirical Bayes [video lecture][lecture notes]

  11. Priors, Hierarchical Priors, Empirical Bayes (continued) [video lecture][lecture notes]

  12. Priors, Hierarchical Priors, Empirical Bayes (conclusion) [video lecture][lecture notes]

  13. Introduction to linear regression models [video lecture][lecture notes]

  14. Introduction to linear regression models (continued) [video lecture1][video lecture2][lecture notes]

  15. Bayesian linear regression models [video lecture1][video lecture2][lecture notes]

  16. Linear models for classification [video lecture][lecture notes]

  17. Probabilistic generative classifiers and introduction to discriminative models [video lecture][lecture notes]

  18. Logistic regression and Probit regression [video lecture][lecture notes]

  19. Expectation maximization, Gaussian mixture model and clustering methods [video lecture][lecture notes]

  20. Expectation maximization (conclusion) [video lecture][lecture notes]

  21. Continuous latent variables [video lecture][lecture notes]

  22. Continuous latent variables (continued) [video lecture][lecture notes]

  23. Kernel methods [video lecture1][video lecture2][lecture notes]

  24. Sparse Kernel Machines (RVM) [video lecture][lecture notes]

  25. Sparse Kernel Machines (RVM) - conclusion [video lecture][lecture notes]

  26. Support vector machine [video lecture1][video lecture2][lecture notes]

  27. Support vector machine (continued) [video lecture][lecture notes]

  28. Gaussian Process [video lecture][lecture notes]

  29. Gaussian Process (continued) [video lecture][lecture notes]

  30. Neural Network [video lecture][lecture notes]

  31. Neural Network (continued) [video lecture][lecture notes]

  32. Neural Network [video lecture][lecture notes]

  33. Way ahead - deep GP and deep learning [video lecture][lecture notes]

Syllabus

  • ML in science and engineering.

  • Review of computational Bayesian statistics (priors, likelihoods, posteriors, inference, maximum likelihood and Bayesian learning).

  • Generative models for discrete data – Bayesian concept learning, the Beta-Binomial model, the Dirichlet-Multinomial model, and the naïve Bayes classifier.

  • Linear regression models – maximum likelihood estimation, robust linear regression, ridge regression, and Bayesian linear regression.

  • Logistic regression – model specification and fitting, Bayesian logistic regression, online learning, and generative vs. discriminative classifiers.

  • Unsupervised learning – k-means clustering, dimensionality reduction, proper orthogonal decomposition.

  • Kernel methods – the kernel trick, support vector machine, relevance vector machine.

  • Gaussian processes – vanilla Gaussian processes, sparse Gaussian processes, GPs for regression, and GPs for classification.

  • Neural networks – feed-forward neural network, activation functions, neural network for regression, neural network for classification.
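To give a flavor of the Bayesian linear regression topic above, here is a minimal illustrative sketch (not course code). Assuming a Gaussian prior w ~ N(0, alpha⁻¹I) and Gaussian noise with precision beta, the posterior over the weights is Gaussian with covariance S_N = (alpha·I + beta·ΦᵀΦ)⁻¹ and mean m_N = beta·S_N·Φᵀt; the data and parameter values below are hypothetical:

```python
import numpy as np

# Synthetic data: t = 0.5 + 2.0*x + Gaussian noise (sd 0.2)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=30)
t = 0.5 + 2.0 * x + rng.normal(scale=0.2, size=30)

Phi = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
alpha, beta = 2.0, 25.0                       # assumed prior and noise precisions

# Conjugate Gaussian posterior over the weights
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t                  # posterior mean; should be near [0.5, 2.0]

print(m_N)
```

With enough data the posterior mean approaches the ordinary least-squares solution; the prior precision alpha shrinks the weights toward zero when data are scarce.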
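Similarly, the unsupervised-learning topic can be previewed with a short k-means sketch (again a hypothetical illustration, not course code): alternate assigning each point to its nearest centroid and recomputing each centroid as the mean of its cluster.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain k-means: assignment step + update step, repeated n_iter times."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assignment step: nearest centroid for every point
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # update step: mean of each cluster (keep old centroid if cluster is empty)
        centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels, centroids

# Two well-separated synthetic blobs around (0, 0) and (3, 3)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(3.0, 0.1, (20, 2))])
labels, centroids = kmeans(X, k=2)
```

On well-separated data like this, the recovered centroids sit near the two blob centers; in general k-means only finds a local optimum and is sensitive to initialization.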

References

Lecture notes and references will be provided on the course web site. The following books are recommended:

Homework

  • Homework 1: Review of Bayesian Statistics [HW1][Solution][Code]

  • Homework 2: Bayesian analysis for discrete data, Priors, Posterior and Posterior Predictive, Bayesian model selection [HW2][Solution][Code]

  • Homework 3: Linear Regression, Robust Regression and Bayesian Linear Regression [HW3][Solution][Code]

  • Homework 4: Expectation maximization, Classification, Latent variable model [HW4][Solution][Code]

Projects

  • Details of the project to be submitted as part of this course [video link]

Course info

Credit: 3 Units (2-0-2)

Lectures and practicals/tutorial: MWThF: 7:00 pm - 8:00 pm

Instructor: Dr. Souvik Chakraborty, Block IV, Room 342-C, souvik@am.iitd.ac.in

Teaching Assistants: Navaneeth N., Kamal Krishna

Course Objective: The objective of this course, offered by the Department of Applied Mechanics, is to introduce supervised and unsupervised machine learning algorithms with emphasis on applications of machine learning in mechanics. Of particular interest are linear and logistic regression models, mixture models and the EM algorithm, sparse linear models, latent variable models, kernel methods, variational methods, Gaussian process models, and neural networks. The course emphasizes the mathematical and statistical foundations of these methods. The course is particularly designed for UG students in engineering.

Intended audience: UG students. PG and PhD students may also sit through this course.