Scalable algorithms for physics-informed neural and graph networks

Bibliographic Details
Title: Scalable algorithms for physics-informed neural and graph networks
Authors: Shukla, Khemraj, Xu, Mengjia, Trask, Nathaniel, Karniadakis, George Em
Publication Year: 2022
Collection: Computer Science, Mathematics
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Distributed, Parallel, and Cluster Computing, Mathematics - Analysis of PDEs, Mathematics - Dynamical Systems
More Details: Physics-informed machine learning (PIML) has emerged as a promising approach for simulating complex physical and biological systems governed by multiscale processes for which some data are also available. In some instances, the objective is to discover part of the hidden physics from the available data, and PIML has proven particularly effective for such problems, where conventional methods may fail. Unlike commercial machine learning, where training deep neural networks requires big data, big data are not available in PIML; instead, such networks can be trained from additional information obtained by enforcing the physical laws at random points in the space-time domain. Physics-informed machine learning thus integrates multimodal and multifidelity data with mathematical models and implements them using neural networks or graph networks. Here, we review prevailing trends in embedding physics into machine learning via physics-informed neural networks (PINNs), based primarily on feed-forward neural networks and automatic differentiation. For more complex systems, systems of systems, and unstructured data, graph neural networks (GNNs) offer distinct advantages, and we review how physics-informed learning can be accomplished with GNNs that use graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs). We present representative examples for both forward and inverse problems and discuss the advances needed to scale up PINNs, PIGNs, and more broadly GNNs for large-scale engineering problems.
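The training principle described in the abstract — penalizing the residual of the governing physical law at random collocation points, with derivatives supplied by automatic differentiation, plus a penalty enforcing the initial/boundary conditions — can be sketched in a few lines of plain Python. This is an illustrative toy only, not code from the paper: it uses a hand-rolled forward-mode dual number for the automatic differentiation, a one-hidden-layer tanh network, and the simple ODE u'(t) + u(t) = 0 with u(0) = 1 as the "physics"; all names (`Dual`, `pinn_loss`, the parameter layout) are assumptions made for the sketch.

```python
import math

# Minimal forward-mode dual number: carries a value and a derivative.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def tanh(x):
    t = math.tanh(x.val)
    return Dual(t, (1.0 - t * t) * x.dot)

# Tiny one-hidden-layer feed-forward network u(t; theta).
def u(t, w1, b1, w2, b2):
    s = Dual(b2)
    for wi, bi, vi in zip(w1, b1, w2):
        s = s + vi * tanh(wi * t + Dual(bi))
    return s

def pinn_loss(params, ts):
    """Physics-informed loss for u'(t) + u(t) = 0 with u(0) = 1."""
    w1, b1, w2, b2 = params
    res = 0.0
    for t in ts:
        # Seed dt/dt = 1 so ut.dot carries u'(t) via automatic differentiation.
        ut = u(Dual(t, 1.0), w1, b1, w2, b2)
        r = ut.dot + ut.val          # residual of the physical law at t
        res += r * r
    u0 = u(Dual(0.0), w1, b1, w2, b2).val
    return res / len(ts) + (u0 - 1.0) ** 2
```

With the degenerate parameters `([0.0], [0.0], [0.0], 1.0)` the network is the constant u(t) = 1, which satisfies the initial condition but leaves a unit residual everywhere, so `pinn_loss` returns exactly 1.0; a training loop would drive this loss toward zero by gradient descent on the parameters.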
Comment: 26 pages, 13 figures. arXiv admin note: text overlap with arXiv:2104.10013
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2205.08332
Accession Number: edsarx.2205.08332
Database: arXiv