
Hidden Fluid Mechanics: A Navier-Stokes Informed Deep Learning Framework for Assimilating Flow Visualization Data
We present hidden fluid mechanics (HFM), a physics-informed deep learning framework capable of encoding an important class of physical laws governing fluid motions, namely the Navier-Stokes equations. In particular, we seek to leverage the underlying conservation laws (i.e., for mass, momentum, and energy) to infer hidden quantities of interest such as velocity and pressure fields merely from spatiotemporal visualizations of a passive scalar (e.g., dye or smoke) transported in arbitrarily complex domains (e.g., in human arteries or brain aneurysms). Our approach to this data assimilation problem is unique in that we design an algorithm that is agnostic to the geometry and to the initial and boundary conditions. This makes HFM highly flexible in choosing the spatiotemporal domain of interest for data acquisition, as well as for subsequent training and predictions. Consequently, HFM can produce predictions in cases that neither a pure machine learning strategy nor a standalone scientific computing approach can handle on its own. The proposed algorithm achieves accurate predictions of the pressure and velocity fields in both two- and three-dimensional flows for several benchmark problems motivated by real-world applications. Our results demonstrate that this relatively simple methodology can be used in physical and biomedical problems to extract valuable quantitative information (e.g., lift and drag forces, or wall shear stresses in arteries) for which direct measurements may not be possible.
08/13/2018 ∙ by Maziar Raissi, et al.

Hidden Physics Models: Machine Learning of Nonlinear Partial Differential Equations
While there is currently a lot of enthusiasm about "big data", useful data is usually "small" and expensive to acquire. In this paper, we present a new paradigm of learning partial differential equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time-dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problems of learning, system identification, and data-driven discovery of partial differential equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, that enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and time-dependent linear fractional equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data.
08/02/2017 ∙ by Maziar Raissi, et al.

Parametric Gaussian Process Regression for Big Data
This work introduces the concept of parametric Gaussian processes (PGPs), which is built upon the seemingly self-contradictory idea of making Gaussian processes parametric. Parametric Gaussian processes, by construction, are designed to operate in "big data" regimes where one is interested in quantifying the uncertainty associated with noisy data. The proposed methodology circumvents the well-established need for stochastic variational inference, a scalable algorithm for approximating posterior distributions. The effectiveness of the proposed approach is demonstrated using an illustrative example with simulated data and a benchmark dataset in the airline industry with approximately 6 million records.
04/11/2017 ∙ by Maziar Raissi, et al.
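As a point of reference for the scaling problem PGPs address, the following is a minimal NumPy sketch of the exact Gaussian process posterior mean (the kernel choice, length-scale, and function names are illustrative, not taken from the paper); the n × n linear solve is what makes this approach O(n³) and impractical at millions of records:

```python
import numpy as np

def rbf(x, y, length=0.5):
    # Squared-exponential kernel between two sets of 1-D inputs.
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / length ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-4):
    # Exact GP posterior mean: requires solving an n x n system, hence O(n^3).
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

x = np.linspace(0.0, 1.0, 30)
y = np.sin(2.0 * np.pi * x)
x_new = np.array([0.25, 0.75])
mu = gp_posterior_mean(x, y, x_new)
```

With 30 training points this is instantaneous; with the roughly 6 million airline records mentioned above, the dense solve is infeasible, which is the regime PGPs and stochastic variational inference target.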

Machine Learning of Linear Differential Equations using Gaussian Processes
This work leverages recent advances in probabilistic machine learning to discover conservation laws expressed by parametric linear equations. Such equations involve, but are not limited to, ordinary and partial differential, integro-differential, and fractional-order operators. Here, Gaussian process priors are modified according to the particular form of such operators and are employed to infer parameters of the linear equations from scarce and possibly noisy observations. Such observations may come from experiments or "black-box" computer simulations.
01/10/2017 ∙ by Maziar Raissi, et al.
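To illustrate the idea of modifying a Gaussian process prior according to a linear operator, here is a minimal sketch assuming the one-parameter operator f = α du/dx, a squared-exponential kernel, and a simple grid search over α in place of full hyperparameter optimization (all names and settings are illustrative, not from the paper):

```python
import numpy as np

def rbf(x, y, l=1.0):
    # squared-exponential prior covariance k(x, y) for u
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * d**2 / l**2)

def d_rbf_dy(x, y, l=1.0):
    # cov(u(x), u'(y)) = dk/dy
    d = x[:, None] - y[None, :]
    return (d / l**2) * np.exp(-0.5 * d**2 / l**2)

def d2_rbf(x, y, l=1.0):
    # cov(u'(x), u'(y)) = d^2 k / dx dy
    d = x[:, None] - y[None, :]
    return (1.0 / l**2 - d**2 / l**4) * np.exp(-0.5 * d**2 / l**2)

def neg_log_marginal(alpha, x, u, f, l=1.0, jitter=1e-6):
    # joint covariance of [u; f] under the prior u ~ GP(0, k), with f = alpha * u'
    K = np.block([
        [rbf(x, x, l), alpha * d_rbf_dy(x, x, l)],
        [alpha * d_rbf_dy(x, x, l).T, alpha**2 * d2_rbf(x, x, l)],
    ]) + jitter * np.eye(2 * len(x))
    y = np.concatenate([u, f])
    _, logdet = np.linalg.slogdet(K)
    return 0.5 * y @ np.linalg.solve(K, y) + 0.5 * logdet

x = np.linspace(0.0, 6.0, 20)
u = np.sin(x)
f = 2.0 * np.cos(x)               # generated by f = alpha * du/dx with alpha = 2
grid = np.arange(0.5, 3.5, 0.05)
alpha_hat = grid[np.argmin([neg_log_marginal(a, x, u, f) for a in grid])]
```

The equation parameter α is recovered by maximizing the marginal likelihood, i.e., it is treated exactly like a kernel hyperparameter.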

Deep Multi-fidelity Gaussian Processes
We develop a novel multi-fidelity framework that goes far beyond the classical AR(1) co-kriging scheme of Kennedy and O'Hagan (2000). Our method can handle general discontinuous cross-correlations among systems with different levels of fidelity. A combination of multi-fidelity Gaussian processes (AR(1) co-kriging) and deep neural networks enables us to construct a method that is immune to discontinuities. We demonstrate the effectiveness of the new technology using standard benchmark problems designed to resemble the outputs of complicated high- and low-fidelity codes.
04/26/2016 ∙ by Maziar Raissi, et al.
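As a toy illustration of the low-/high-fidelity correction structure underlying co-kriging (not the paper's method, which uses Gaussian processes and deep networks), the sketch below assumes a hypothetical pair of models related by y_H(x) = ρ y_L(x) + δ(x) and recovers the scale ρ and a polynomial discrepancy δ from a handful of expensive high-fidelity samples:

```python
import numpy as np

def y_low(x):
    # cheap, biased low-fidelity model (illustrative)
    return np.sin(8.0 * x)

def y_high(x):
    # expensive model: scaled low-fidelity output plus a smooth discrepancy
    return 1.5 * y_low(x) + x**2

# a handful of expensive high-fidelity evaluations
x_h = np.linspace(0.0, 1.0, 8)
A = np.column_stack([y_low(x_h), np.ones_like(x_h), x_h, x_h**2])
coef, *_ = np.linalg.lstsq(A, y_high(x_h), rcond=None)

# predict the high-fidelity response where only the cheap model was run
x_test = np.array([0.33, 0.77])
pred = coef[0] * y_low(x_test) + coef[1] + coef[2] * x_test + coef[3] * x_test**2
```

When the cross-correlation between fidelities is discontinuous rather than a constant ρ, this linear structure fails, which is the gap the deep multi-fidelity framework addresses.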

Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations
We introduce physics-informed neural networks: neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations. In this second part of our two-part treatise, we focus on the problem of data-driven discovery of partial differential equations. Depending on whether the available data is scattered in space-time or arranged in fixed temporal snapshots, we introduce two main classes of algorithms, namely continuous time and discrete time models. The effectiveness of our approach is demonstrated using a wide range of benchmark problems in mathematical physics, including conservation laws, incompressible fluid flow, and the propagation of nonlinear shallow-water waves.
11/28/2017 ∙ by Maziar Raissi, et al.
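A minimal sketch of the discovery idea, with finite differences and linear least squares standing in for the paper's neural networks and automatic differentiation; the synthetic data and the one-parameter model u_t = λ u_xx are illustrative:

```python
import numpy as np

# Synthetic data: u(x, t) = exp(-t) * sin(x) solves u_t = lambda * u_xx with lambda = 1
x = np.linspace(0.0, np.pi, 50)
t = np.linspace(0.0, 1.0, 50)
X, T = np.meshgrid(x, t, indexing="ij")
U = np.exp(-T) * np.sin(X)

dx, dt = x[1] - x[0], t[1] - t[0]
# finite-difference estimates of the derivatives (central in time, 2nd order in space)
U_t = (U[:, 2:] - U[:, :-2]) / (2 * dt)
U_xx = (U[2:, :] - 2 * U[1:-1, :] + U[:-2, :]) / dx**2
u_t = U_t[1:-1, :].ravel()        # restrict both to shared interior points
u_xx = U_xx[:, 1:-1].ravel()

# one-parameter "discovery": the lambda minimizing ||u_t - lambda * u_xx||^2
lam = (u_xx @ u_t) / (u_xx @ u_xx)
```

In the physics-informed setting the derivatives come from differentiating a trained network exactly, so the same regression works with scattered, noisy data where finite differences break down.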

Multistep Neural Networks for Data-driven Discovery of Nonlinear Dynamical Systems
The process of transforming observed data into predictive mathematical models of the physical world has always been paramount in science and engineering. Although data is currently being collected at an ever-increasing pace, devising meaningful models from such observations in an automated fashion still remains an open problem. In this work, we put forth a machine learning approach for identifying nonlinear dynamical systems from data. Specifically, we blend classical tools from numerical analysis, namely multistep time-stepping schemes, with powerful nonlinear function approximators, namely deep neural networks, to distill the mechanisms that govern the evolution of a given dataset. We test the effectiveness of our approach on several benchmark problems involving the identification of complex, nonlinear, and chaotic dynamics, and we demonstrate how this allows us to accurately learn the dynamics, forecast future states, and identify basins of attraction. In particular, we study the Lorenz system, the fluid flow behind a cylinder, the Hopf bifurcation, and the glycolytic oscillator model as an example of the complicated nonlinear dynamics typical of biological systems.
01/04/2018 ∙ by Maziar Raissi, et al.
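The core identification step can be sketched with the second-order Adams-Moulton (trapezoidal) scheme, using a linear least-squares fit in place of the deep network; the harmonic-oscillator data below is illustrative:

```python
import numpy as np

# Trajectory of the harmonic oscillator x' = A x with A = [[0, 1], [-1, 0]]
h = 0.1
t = np.arange(0.0, 10.0, h)
X = np.stack([np.cos(t), -np.sin(t)])        # exact solution, shape (2, len(t))

# Trapezoidal residual: x_{n+1} - x_n = (h/2) * A * (x_n + x_{n+1})
Y = (X[:, 1:] - X[:, :-1]) / h               # scaled one-step differences
Z = 0.5 * (X[:, 1:] + X[:, :-1])             # step midpoints
# least-squares fit of the linear dynamics: Y ~ A_hat @ Z
A_hat = np.linalg.lstsq(Z.T, Y.T, rcond=None)[0].T
```

Replacing the linear map by a neural network, and the least-squares solve by gradient descent on the same multistep residual, gives the method described in the abstract for genuinely nonlinear systems.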

Numerical Gaussian Processes for Time-dependent and Nonlinear Partial Differential Equations
We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent partial differential equations. Numerical Gaussian processes, by construction, are designed to deal with cases where: (1) all we observe are noisy data on black-box initial conditions, and (2) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent partial differential equations. Our method circumvents the need for spatial discretization of the differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to recover accurate approximations of the latent solutions, and consistently propagate uncertainty, even in cases involving very long time integration.
03/29/2017 ∙ by Maziar Raissi, et al.

Forward-Backward Stochastic Neural Networks: Deep Learning of High-dimensional Partial Differential Equations
Classical numerical methods for solving partial differential equations suffer from the curse of dimensionality, mainly due to their reliance on meticulously generated spatiotemporal grids. Inspired by modern deep-learning-based techniques for solving forward and inverse problems associated with partial differential equations, we circumvent the tyranny of numerical discretization by devising an algorithm that is scalable to high dimensions. In particular, we approximate the unknown solution by a deep neural network, which essentially enables us to benefit from the merits of automatic differentiation. To train this neural network, we leverage the well-known connection between high-dimensional partial differential equations and forward-backward stochastic differential equations. In fact, independent realizations of a standard Brownian motion act as training data. We test the effectiveness of our approach on a pair of benchmark problems spanning a number of scientific domains, including the Black-Scholes-Barenblatt and Hamilton-Jacobi-Bellman equations, both in 100 dimensions.
04/19/2018 ∙ by Maziar Raissi, et al.
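As a minimal illustration of Brownian realizations acting as data, the sketch below generates Euler-Maruyama paths of a one-dimensional geometric Brownian motion (a stand-in example with illustrative parameters; the paper couples such forward paths to a neural network through the backward SDE):

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 20000, 100, 1.0
dt = T / n_steps
mu, sigma, x0 = 0.05, 0.2, 1.0

# Euler-Maruyama realizations of dX = mu*X dt + sigma*X dW:
# each path is driven by an independent Brownian increment sequence.
X = np.full(n_paths, x0)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    X = X + mu * X * dt + sigma * X * dW

mc_mean = X.mean()   # should approach x0 * exp(mu * T)
```

In the forward-backward setting, every such simulated path contributes one training sample, so "data" is generated on the fly rather than drawn from a fixed spatiotemporal grid.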

Machine Learning of Space-Fractional Differential Equations
Data-driven discovery of "hidden physics", i.e., machine learning of differential equation models underlying observed data, has recently been approached by embedding the discovery problem into a Gaussian process regression of spatial data, treating unknown equation parameters as hyperparameters of a modified "physics-informed" Gaussian process kernel. This kernel includes the parametrized differential operators applied to a prior covariance kernel. We extend this framework to linear space-fractional differential equations. The methodology is compatible with a wide variety of fractional operators in R^d and stationary covariance kernels, including the Matérn class, and can optimize the Matérn parameter during training. We provide a user-friendly and feasible way to perform fractional derivatives of kernels, via a unified set of d-dimensional Fourier integral formulas amenable to generalized Gauss-Laguerre quadrature. The implementation of fractional derivatives has several benefits. First, it allows for discovering fractional-order PDEs for systems characterized by heavy tails or anomalous diffusion, bypassing the analytical difficulty of fractional calculus. Data sets exhibiting such features are of increasing prevalence in physical and financial domains. Second, a single fractional-order archetype allows a derivative of arbitrary order to be learned, with the order itself being a parameter in the regression. This is advantageous even when used for discovering integer-order equations; the user is not required to assume a "dictionary" of derivatives of various orders, and directly controls the parsimony of the models being discovered. We illustrate the method on several examples, including fractional-order interpolation of advection-diffusion and modeling relative stock performance in the S&P 500 with alpha-stable motion via a fractional diffusion equation.
08/02/2018 ∙ by Mamikon Gulian, et al.
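On a periodic grid, the Fourier route to fractional derivatives can be sketched directly: multiply each Fourier mode by the fractional Laplacian's symbol |k|^α. This spectral stand-in is not the paper's Gauss-Laguerre quadrature machinery, but it shows why an arbitrary, even non-integer, order α is just a parameter:

```python
import numpy as np

def fractional_laplacian_periodic(u, alpha, L=2.0 * np.pi):
    # Spectral (-Delta)^(alpha/2) on a periodic grid of length L:
    # scale each Fourier mode by |k|^alpha, then transform back.
    n = len(u)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
    return np.real(np.fft.ifft(np.abs(k) ** alpha * np.fft.fft(u)))

x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
u = np.sin(3.0 * x)
alpha = 1.5
# sin(3x) is an eigenfunction: the result equals 3**1.5 * sin(3x)
v = fractional_laplacian_periodic(u, alpha)
```

Because α enters only as an exponent in the symbol, it can be optimized continuously during regression, exactly the property exploited by the single fractional-order archetype described above.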

Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations
We introduce physics-informed neural networks: neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations. In this two-part treatise, we present our developments in the context of solving two main classes of problems: the data-driven solution and the data-driven discovery of partial differential equations. Depending on the nature and arrangement of the available data, we devise two distinct classes of algorithms, namely continuous time and discrete time models. The resulting neural networks form a new class of data-efficient universal function approximators that naturally encode any underlying physical laws as prior information. In this first part, we demonstrate how these networks can be used to infer solutions to partial differential equations and obtain physics-informed surrogate models that are fully differentiable with respect to all input coordinates and free parameters.
11/28/2017 ∙ by Maziar Raissi, et al.
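To convey the underlying collocation idea in a few lines, the sketch below substitutes a polynomial surrogate and linear least squares for the neural network and gradient-based training (analytic basis derivatives stand in for automatic differentiation; the boundary-weight value is an illustrative choice):

```python
import numpy as np

# Solve u'' = -pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0 (exact: u = sin(pi x))
# by minimizing the PDE residual at collocation points plus weighted boundary terms.
deg = 10                                  # polynomial surrogate u(x) = sum_k c_k x^k
x_c = np.linspace(0.0, 1.0, 40)           # collocation points

# residual rows: sum_k c_k * k*(k-1) * x^(k-2) = f(x)
k = np.arange(deg + 1)
A_pde = (k * (k - 1))[None, :] * x_c[:, None] ** np.clip(k - 2, 0, None)[None, :]
b_pde = -np.pi**2 * np.sin(np.pi * x_c)

# boundary rows u(0) = 0 and u(1) = 0, weighted to enforce them strongly
w = 100.0
A_bc = w * np.vstack([(k == 0).astype(float), np.ones(deg + 1)])
b_bc = np.zeros(2)

coef, *_ = np.linalg.lstsq(np.vstack([A_pde, A_bc]),
                           np.concatenate([b_pde, b_bc]), rcond=None)
u_hat = np.polyval(coef[::-1], x_c)       # evaluate the fitted surrogate
```

A physics-informed neural network plays the same game with a far more expressive surrogate: the PDE residual and boundary misfit form the loss, and automatic differentiation supplies the derivatives that the monomial basis provides analytically here.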