Research
"Take up one idea. Make that one idea your life - think of it, dream of it, live on that idea. Let the brain, muscles, nerves, every part of your body, be full of that idea, and just leave every other idea alone. This is the way to success."
- Swami Vivekananda
CSCCM's research stands on three fundamental footings: Artificial Intelligence, probabilistic methods, and computational mechanics. The overarching research theme of CSCCM is "development of novel machine learning algorithms for scientific computing and computational mechanics". The methods, tools, algorithms, and frameworks developed are used for solving and understanding cutting-edge Multiscale, Multiphysics, and Multidisciplinary problems in applied science and engineering. Overall, CSCCM is actively working in three thrust areas: (1) Bridging Scientific Discovery and Machine Learning; (2) Scientific Machine Learning at Scale; and (3) Uncertainty Quantification and Reliability Analysis.
Thrust Area I: Bridging Scientific Discovery and Machine Learning
Discovering governing equations of dynamical and stochastic dynamical systems from data:
Our group's work on this topic focuses on the development of scalable algorithms for discovering governing physics from data. This is achieved by using concepts from sparse regression, symbolic regression, probability theory, and structural dynamics. Of particular interest is discovering the governing physics of stochastic dynamical systems subjected to stochastic excitation. Applications of the developed algorithms include reliability analysis, health monitoring, and remaining useful life prediction of aging systems. Our motivation and long-term goal is to work toward robust autonomous agents that can explore new environments and uncover the scientific principles governing them while accounting for uncertainties in the environment.
Fig. 1 A representative figure showcasing one of the physics discovery frameworks developed at CSCCM. This framework allows discovery of an interpretable Lagrangian from sparse data. The discovered Lagrangian can be used to further learn conservation laws, governing equations, etc.
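The sparse-regression idea can be illustrated with a minimal toy sketch (a hypothetical example in the spirit of sequentially thresholded least squares, not our published code): a library of candidate terms is regressed against measured derivatives, and small coefficients are pruned until only the governing terms survive.

```python
import numpy as np

# Toy sketch of sparse-regression-based equation discovery; all
# quantities below are illustrative assumptions.

# "Measurements" from a known system dx/dt = -2 x, x(0) = 1
t = np.linspace(0, 2, 200)
x = np.exp(-2 * t)
dx = np.gradient(x, t, edge_order=2)   # numerical derivative of the data

# Candidate library of terms: [1, x, x^2, x^3]
Theta = np.column_stack([np.ones_like(x), x, x**2, x**3])

# Sequentially thresholded least squares: fit, prune, refit
xi = np.linalg.lstsq(Theta, dx, rcond=None)[0]
for _ in range(10):
    small = np.abs(xi) < 0.1           # prune small coefficients
    xi[small] = 0.0
    big = ~small
    xi[big] = np.linalg.lstsq(Theta[:, big], dx, rcond=None)[0]

print(np.round(xi, 2))   # only the linear term should survive, near -2
```

The recovered coefficient vector identifies the single active term in the library, i.e., the discovered equation is approximately dx/dt = -2x.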
Machine learning based accelerated multi-scale design of metamaterials:
Future deep space exploration missions must address the challenge of limited resupply for repairs by innovating in materials and structures to achieve lighter, stronger, more reliable, and multifunctional designs. Digital lattice materials provide a novel solution by creating lattice solids from discrete components that act as continuous materials, offering exceptional strength and stiffness at minimal density while allowing for customization. However, we have a limited understanding of failure mechanisms, fracture toughness, and fatigue properties in these materials. More importantly, the design and discovery of new lattice materials to suit a user's needs are time-consuming. To this end, CSCCM focuses on the development and application of machine learning algorithms for the analysis and accelerated design of lattice materials.
Fig. 2 Schematic representation of the metamaterial discovery framework.
Planning to solve physical reasoning tasks:
Learning to reason with physical skills is a hallmark of intelligence. Evidence is emerging that humans possess an intuitive physics engine involved in perceptual and goal-directed reasoning. However, enabling robots to reason in a similar fashion remains elusive, as existing methods for physical reasoning are data-hungry and struggle with the complexity and uncertainty inherent in real-world scenarios. CSCCM is actively working in collaboration with Prof. Rohan Paul's group to develop next-generation planners that will enable robots to solve physical reasoning tasks.
Fig. 3 PhyPlan: a physics-informed planner for solving physical reasoning tasks, developed by us.
Reinforcement learning for vibration control:
Smooth and uninterrupted operation of infrastructure systems is essential for the economic growth and sustainability of any community. One aspect of this is developing appropriate control strategies for vibration mitigation in structural systems. Accordingly, one of the focus areas of CSCCM is developing reinforcement learning (RL) based control strategies for vibration mitigation. With RL, we aim to address bottlenecks of conventional control strategies, such as slow execution of the feedback loop.
Fig. 4 Schematic representation of the reinforcement learning algorithm (along with its performance) developed at CSCCM.
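As a toy illustration of the idea (not our actual controller), a tabular Q-learning agent can learn a bang-bang control force that damps a single-degree-of-freedom oscillator; the dynamics, discretization, and hyperparameters below are all illustrative assumptions.

```python
import numpy as np

# Toy sketch: tabular Q-learning for vibration control of a 1-DOF
# oscillator. Hypothetical parameters, not the lab's RL framework.
rng = np.random.default_rng(1)

dt, m, k, c = 0.05, 1.0, 10.0, 0.1
actions = np.array([-5.0, 0.0, 5.0])        # candidate control forces

def step(x, v, u):
    # semi-implicit (symplectic) Euler step of m*x'' + c*x' + k*x = u
    v = v + dt * (-k * x - c * v + u) / m
    return x + dt * v, v

def state_id(x, v):
    # crude discretization of (displacement, velocity) into 5x5 bins
    xi = np.clip(np.digitize(x, [-0.5, -0.1, 0.1, 0.5]), 0, 4)
    vi = np.clip(np.digitize(v, [-1.0, -0.2, 0.2, 1.0]), 0, 4)
    return xi * 5 + vi

Q = np.zeros((25, 3))
for _ in range(300):                         # training episodes
    x, v = 1.0, 0.0
    for _ in range(100):
        s = state_id(x, v)
        # epsilon-greedy action selection
        a = rng.integers(3) if rng.random() < 0.1 else int(np.argmax(Q[s]))
        x, v = step(x, v, actions[a])
        r = -(x**2 + 0.1 * v**2)             # penalize vibration energy
        s2 = state_id(x, v)
        Q[s, a] += 0.1 * (r + 0.95 * Q[s2].max() - Q[s, a])
```

The reward penalizes vibration energy, so the greedy policy extracted from Q tends to apply forces that oppose the motion, mimicking active damping.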
Thrust Area II: Scientific Machine Learning at Scale
Operator Learning:
Operator learning is perhaps the most promising development of recent times in scientific machine learning (Sci-ML). The high-level idea in operator learning is to learn mappings between function spaces; this can potentially provide better generalization than conventional neural network architectures. At CSCCM, we are actively working in this area and have developed novel algorithms and architectures such as the Wavelet Neural Operator (WNO) and its variants, variational Bayes DeepONet, and mixture density NOMAD (MD-NOMAD). We are actively working on the development of (a) data-driven, (b) physics-informed, and (c) physics-integrated operator learning algorithms. Applications of the developed algorithms include (a) solid mechanics (e.g., crack propagation), (b) fluid mechanics (e.g., flow modeling), (c) healthcare (e.g., tumor detection), and (d) weather prediction, to name a few.
Fig. 5 Architecture of WNO (top left) developed at CSCCM and its application in tumor detection from elastography (top right and bottom).
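To make the function-space mapping concrete, here is a minimal, untrained DeepONet-style sketch (random weights and hypothetical dimensions; not our WNO or NOMAD implementations): a branch network encodes the input function at sensor locations, a trunk network encodes the query location, and their dot product gives the operator output.

```python
import numpy as np

# Untrained branch-trunk sketch of operator learning; illustrative only.
rng = np.random.default_rng(0)

m, p = 50, 20            # number of input-function sensors, latent dimension

# Branch net (one random layer): encodes the input function u at m sensors
Wb = rng.normal(size=(p, m)) / np.sqrt(m)
# Trunk net (one random layer): encodes the output query location y
Wt = rng.normal(size=(p, 1))

def G(u_sensors, y):
    """Approximate operator: (function samples, query point) -> scalar."""
    b = np.tanh(Wb @ u_sensors)          # branch features
    t = np.tanh(Wt @ np.atleast_1d(y))   # trunk features
    return float(b @ t)                  # dot product = operator output

xs = np.linspace(0, 1, m)
u = np.sin(2 * np.pi * xs)               # an input function, sampled
out = G(u, 0.3)                          # evaluate G(u) at y = 0.3
```

Because the first argument is an entire sampled function, the same trained weights generalize across input functions, which is the key difference from a pointwise neural network surrogate.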
Foundation models in Sci-ML:
Machine learning has witnessed substantial growth, leading to advanced artificial intelligence models crafted to address a wide range of real-world challenges across domains such as computer vision, natural language processing, and scientific computing. Nevertheless, creating a custom model for each new task remains a resource-intensive undertaking, demanding considerable computational time and memory resources. We have developed a first-of-its-kind foundation model for Sci-ML that allows learning solution operators of multiple physics simultaneously and continually (without forgetting).
Fig. 6 Foundation model for Sci-ML developed at CSCCM. This is a first-of-its-kind model that allows learning multiple physics simultaneously and continually.
Trustworthy and uncertainty-aware Digital Twin:
To effectively analyze the structural health of structures, systems, and components (SSCs), we are yet to fully leverage advanced artificial intelligence (AI) and Big Data analytics. At CSCCM, we are working towards developing scientific tools for mitigating the major challenges related to SSC degradation and building Digital Twins for health monitoring and prognostics. In particular, we are developing the Intelligent Digital Twins with Explainable AI (IDEA) framework to predict structural degradation and the remaining useful life of degrading SSCs, enabling hazard detection and condition monitoring.
Fig. 7 A schematic representation of digital twin from one of our earlier works.
Physics-informed learning:
One of the primary bottlenecks of data-driven approaches is the unavailability of adequate training data. On the other hand, we often have access to physics models of systems in the form of partial differential equations. Physics-informed learning is a relatively new paradigm in which the objective is to develop training algorithms that allow machine learning models to be trained from the known physics itself. CSCCM is actively working on physics-informed learning, with contributions including variational energy based physics-informed learning for phase-field modeling of fracture, multi-fidelity physics-informed learning, and gradient-free physics-informed learning, to name a few.
Fig. 8 Physics-informed neural network for phase-field modeling of fracture in brittle material.
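The core idea, training from the physics residual alone with no solution data, can be sketched as follows; a sine basis stands in for the neural network so the toy example stays dependency-free (an illustrative assumption, not a PINN implementation).

```python
import numpy as np

# Toy physics-informed fit: find the model whose PDE residual vanishes.
# Problem: u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0.
xs = np.linspace(0, 1, 101)[1:-1]          # interior collocation points
K = 5                                      # number of basis functions

# Sine basis automatically satisfies the boundary conditions
basis = np.array([np.sin(k * np.pi * xs) for k in range(1, K + 1)]).T
d2 = np.array([-(k * np.pi) ** 2 * np.sin(k * np.pi * xs)
               for k in range(1, K + 1)]).T   # second derivatives

f = -np.pi**2 * np.sin(np.pi * xs)         # known source term

# "Training" = minimizing the PDE residual ||u'' - f|| over coefficients;
# no samples of the solution u are used anywhere.
c = np.linalg.lstsq(d2, f, rcond=None)[0]

u = basis @ c                              # recovered solution on xs
err = np.max(np.abs(u - np.sin(np.pi * xs)))   # exact solution is sin(pi x)
```

A neural-network version replaces the basis with a network and the least-squares solve with gradient descent on the residual loss, but the training signal, the governing equation itself, is the same.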
Machine-learning Enhanced Scientific Simulation:
The governing equations commonly used in science and engineering are often based on assumptions and approximations. Naturally, scientific simulations performed using the known governing laws are also approximate in nature. We are working towards leveraging data-physics fusion to harmoniously combine approximate physics laws with data-based models, enhancing the accuracy and predictive capabilities of simulations. By bridging the gap between data-driven insights and fundamental physical principles, we envision that this will empower scientists and engineers to gain deeper insights into complex systems, optimize experiments, and expedite scientific discovery. The developments carried out in this area will be particularly useful in the digital twin technology.
Fig. 9 Deep Physics Corrector (DPC): one of the data-physics fusion frameworks proposed by us, developed particularly with stochastic differential equations (SDEs) in mind.
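The data-physics fusion idea behind such correctors can be sketched on a toy ODE (illustrative only, not the published DPC formulation): a known-but-approximate model is augmented with a residual term learned from data.

```python
import numpy as np

# Toy corrector sketch: true dynamics are dx/dt = -x - 0.5 x^3, but the
# known (approximate) physics only captures dx/dt = -x. A polynomial
# stands in for the deep network that learns the missing residual.
rng = np.random.default_rng(0)

def true_rhs(x):   return -x - 0.5 * x**3     # full (unknown) physics
def approx_rhs(x): return -x                  # known approximate physics

# "Measured" state derivatives, with a little measurement noise
x_data = rng.uniform(-2, 2, 200)
dx_data = true_rhs(x_data) + 0.01 * rng.normal(size=200)

# Learn the discrepancy dx - approx_rhs(x) with a cubic polynomial
residual = dx_data - approx_rhs(x_data)
coef = np.polyfit(x_data, residual, 3)        # leading term should be ~ -0.5

def corrected_rhs(x):
    # approximate physics + learned data-driven correction
    return approx_rhs(x) + np.polyval(coef, x)
```

The corrected model can then be integrated like any other ODE/SDE drift, with the learned term compensating for the missing physics.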
Thrust Area III: Uncertainty Quantification and Reliability Analysis
Uncertainty quantification and propagation:
All physical systems have inherent randomness, and for satisfactory performance and informed decision making, it is essential to quantify the effect of input uncertainty on the output. At CSCCM, we are actively working on the development of efficient algorithms for uncertainty quantification and propagation. We are involved in developing surrogate models, sampling-based approaches, and hybrid approaches for total uncertainty quantification (both epistemic and aleatoric). For example, one of our works in this regard is a hybrid approach for model-form uncertainty quantification in dynamical systems.
Fig. 10 Schematic representation of model-form uncertainty quantification framework developed at CSCCM. This work was carried out in collaboration with Prof. Budhaditya Hazra from IIT Guwahati.
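The basic propagation step can be sketched with plain Monte Carlo (an illustrative toy, not our hybrid framework): sample the uncertain inputs, push them through the model, and summarize the output distribution.

```python
import numpy as np

# Toy Monte Carlo uncertainty propagation; the model and input
# distributions below are illustrative assumptions.
rng = np.random.default_rng(0)

def model(k, m):
    # natural frequency (Hz) of a 1-DOF spring-mass system
    return np.sqrt(k / m) / (2 * np.pi)

# Uncertain inputs: stiffness and mass, each with 5% coefficient of variation
k = rng.normal(1000.0, 50.0, 100_000)
m = rng.normal(10.0, 0.5, 100_000)

f = model(k, m)             # propagated output samples
mean, std = f.mean(), f.std()
```

Surrogate-based methods replace the expensive `model` call with a cheap approximation so that the same sampling loop becomes affordable for large simulations.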
Structural reliability analysis:
Structural reliability analysis is a critical field in engineering that focuses on evaluating the safety and performance of structures under uncertain conditions. Unfortunately, structural reliability analysis is extremely expensive, especially for important real-life systems where failure is rare. We are actively working on efficient algorithms that can enable structural reliability analysis of real-life systems such as nuclear reactors and buried pipelines under corrosion. For example, the locally refined hp-adaptive H-PCFE model we developed can solve rare-event probability estimation problems.
Fig. 11 Schematic representation of the hybrid framework that combines Bayesian statistics, subset simulation, and machine learning for efficient reliability analysis of structural systems.
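Why rare events are expensive, and how variance reduction helps, can be sketched with a toy importance-sampling estimate (an illustrative stand-in for the subset-simulation/H-PCFE machinery; the limit state and reliability index below are assumptions).

```python
import numpy as np

# Toy rare-event estimate: failure probability P[g(U) <= 0] for the
# limit state g(u) = beta - u with u ~ N(0, 1). The exact answer is
# Phi(-beta), about 3.2e-5 for beta = 4, so crude Monte Carlo would
# need tens of millions of samples for a stable estimate.
rng = np.random.default_rng(0)

beta = 4.0
def g(u):
    return beta - u

# Importance sampling: draw from a density shifted to the design point
n = 100_000
u = rng.normal(beta, 1.0, n)
# likelihood ratio phi(u) / phi(u - beta); normalizing constants cancel
w = np.exp(-0.5 * u**2) / np.exp(-0.5 * (u - beta)**2)
pf = np.mean((g(u) <= 0) * w)
```

With the shifted sampling density, roughly half the samples land in the failure domain, so 1e5 samples suffice where crude Monte Carlo would waste almost all of its budget on safe samples.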
Design under uncertainty:
The final objective in engineering is to design systems that can perform desired functionalities without failure. However, if the presence of uncertainties is ignored, the final design will be suboptimal and can result in catastrophic failure. In design under uncertainty, the objective is to develop optimization algorithms that can design while accounting for the effect of uncertainties in the system. Broadly speaking, design-under-uncertainty algorithms can be classified into two categories: (a) reliability-based design optimization (RBDO) and (b) robust design optimization (RDO). In RBDO, the objective is to optimize while satisfying a reliability constraint. In RDO, on the other hand, the objective is to minimize the propagation of uncertainty from the input to the output. CSCCM is actively working on both aspects. For example, the threshold-shift method proposed by us allows engineers to solve RBDO problems efficiently.
Fig. 12 Schematic representation of threshold shift method developed for reliability-based design optimization.
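The RDO side can be sketched with a toy problem (hypothetical response function; not the threshold-shift method, which targets RBDO): choose the design that minimizes a weighted sum of the output mean and standard deviation.

```python
import numpy as np

# Toy robust design optimization: pick design d minimizing mean + 2*std
# of the response under input uncertainty. Everything here is an
# illustrative assumption.
rng = np.random.default_rng(0)

xi = rng.normal(0.0, 0.3, 20_000)            # samples of the uncertain input

def performance(d, xi):
    # hypothetical response: quadratic in the design variable, with the
    # uncertainty entering multiplicatively (sensitivity grows with d)
    return (d - 2.0) ** 2 + (1 + d) * xi

best_d, best_J = None, np.inf
for d in np.linspace(0.0, 3.0, 61):          # brute-force design sweep
    y = performance(d, xi)
    J = y.mean() + 2.0 * y.std()             # robustness objective
    if J < best_J:
        best_d, best_J = d, J
```

The robust optimum sits below the nominal optimum d = 2 because backing off trades a little mean performance for lower sensitivity to the uncertain input; an RBDO formulation would instead impose P[failure] below a target as a constraint.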