Deep Learning for Mechanics
Lecture notes
Introduction to ML and DL [lecture video][lecture notes]
Review of ML basics - Part 1 [lecture video][lecture notes]
Review of ML basics - Part 2 [lecture video][lecture notes]
Review of ML basics - Part 3 [lecture video][lecture notes]
Fully connected neural network [lecture video][lecture notes]
Backpropagation [lecture video][lecture notes]
Optimization in DL (Part 1) [lecture video][lecture notes]
Optimization in DL (Part 2) [lecture video][lecture notes]
Generalization in DL [lecture video][lecture notes]
Convolutional neural network (CNN) - Part 1 [lecture video][lecture notes]
Convolutional neural network (CNN) - Part 2 [lecture video][lecture notes]
Physics-informed deep learning - Intro [lecture video][lecture notes]
Automatic differentiation [lecture video][lecture notes]
Physics-informed deep learning - Part 2 [lecture video][lecture notes]
Physics-informed deep learning - Part 3 [lecture video][lecture notes]
Physics-informed deep learning for inverse problems [lecture video][lecture notes]
Physics-informed deep learning for inverse problems - continued [lecture video][lecture notes]
Some applications of physics-informed deep learning [lecture video][lecture notes]
Operator learning [lecture video][lecture notes]
Sequence learning - Part 1 (RNN) [lecture video][lecture notes]
Sequence learning - Part 2 (Backpropagation in RNN) [lecture video][lecture notes]
Sequence learning - Part 3 (GRU, LSTM) [lecture video][lecture notes]
Sequence learning - Part 4 (Transformer) [lecture video][lecture notes]
Syllabus
A quick recap of linear algebra, probability, and numerical computations
Introduction to machine learning, including linear regression, classification, and the single-layer perceptron
Deep feed-forward networks – learning XOR, gradient-based learning, hidden units and network architecture, regularization, optimization in deep learning
Convolutional neural networks – the convolution operator, pooling, variants of the basic convolution function, structured outputs, the neuroscientific basis for convolutional networks
Recurrent neural networks – unfolding computational graphs, bidirectional RNNs, deep recurrent networks, the challenge of long-term dependencies, long short-term memory and other gated RNNs, optimization for long-term dependencies, explicit memory
Some advanced topics in deep learning – attention, GANs, VAEs, capsule networks, transformers
From data-driven to physics-informed deep learning – the need for physics-informed deep learning and its challenges
Physics-informed deep learning in strong form and weak form
Mixed formulation-based physics-informed deep learning
Physics-informed deep learning for time-dependent systems
Concluding remarks – challenges and the way ahead
References
Lecture notes and references will be provided on the course website. The following books are recommended:
Goodfellow, I., Bengio, Y. and Courville, A. "Deep Learning". MIT Press, 2016.
Bengio, Y. "Learning Deep Architectures for AI". Now Publishers, 2009.
Nielsen, M. A. "Neural Networks and Deep Learning". Determination Press, 2015.
Zhang, A., Lipton, Z. C., Li, M. and Smola, A. J. "Dive into Deep Learning".
Murphy, K. P. "Machine Learning: A Probabilistic Perspective". MIT Press, 2012.
Projects
Description video [video link]
Course info
Credit: 4 Units (3-0-2)
Lectures and practicals/tutorial: TBD
Instructors: Dr. Souvik Chakraborty and Dr. Rajdip Nayek
Teaching Assistants:
Course Objective: The objective of this course, offered by the Department of Applied Mechanics, is to introduce students to Deep Learning (DL) algorithms. The course will cover the fundamental concepts of DL and their application to solving scientific and engineering problems. Both data-driven and physics-informed deep learning algorithms will be covered. Of particular interest are the multi-layer perceptron, CNN, RNN, LSTM, attention, transformers, GANs, and VAEs. The course will emphasize the mathematical foundations of these concepts alongside their applications. It is particularly designed for PG, Ph.D., and senior UG students.
Intended audience: Senior UG, PG, and Ph.D. students