Derivative-Informed Neural Operator: An Efficient Framework for High-Dimensional Parametric Derivative Learning

06/21/2022
by   Thomas O'Leary-Roseberry, et al.

Neural operators have gained significant attention recently for their ability to approximate high-dimensional parametric maps between function spaces. To date, the neural operator literature has addressed only parametric function approximation. In this work we investigate incorporating parametric derivative information into neural operator training; this information can improve the function approximation and, in addition, can be used to improve the approximation of the derivative with respect to the parameter, which is often the key to scalable solution of high-dimensional outer-loop problems (e.g., Bayesian inverse problems). Because of its high dimensionality, parametric Jacobian information is formally intractable to incorporate directly; to address this, we propose strategies based on reduced SVD, randomized sketching, and reduced basis surrogates. All of these strategies require only O(r) Jacobian actions to construct sample Jacobian data, and they reduce the linear algebra and memory costs of Jacobian training from the product of the input and output dimensions down to O(r^2), where r is the dimension associated with the reduction technique. Numerical results for parametric PDE problems demonstrate that adding derivative information to the training problem can significantly improve the parametric map approximation, particularly when few data are available. When Jacobian actions are inexpensive relative to evaluations of the parametric map, this information can be economically substituted for parametric map data. We also show that Jacobian approximation errors decrease significantly when Jacobian training data are introduced. These results open the door to the use of derivative-informed neural operators (DINOs) in outer-loop algorithms, where repeated evaluations can amortize the additional cost of the training data.
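To make the randomized-sketching idea concrete, the following is a minimal NumPy sketch (not the paper's implementation; the toy map `q`, its Jacobian-vector product `jvp`, and all dimensions are illustrative assumptions). It shows how only r Jacobian actions are needed to form a reduced Jacobian target J(m) @ Omega, avoiding construction or storage of the full d_out x d_in Jacobian:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 50, 30, 5  # input/output dimensions, sketch rank r

# Hypothetical parametric map q(m) = tanh(A m), used purely for illustration.
A = rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)

def q(m):
    return np.tanh(A @ m)

def jvp(m, v):
    # Jacobian-vector product of q at m: diag(1 - tanh(Am)^2) @ (A v).
    # In a PDE setting this would be one linearized forward solve.
    return (1.0 - np.tanh(A @ m) ** 2) * (A @ v)

m = rng.standard_normal(d_in)

# Randomized sketch: r Jacobian actions give the reduced target J(m) @ Omega.
Omega = rng.standard_normal((d_in, r)) / np.sqrt(r)
J_sketch = np.stack([jvp(m, Omega[:, j]) for j in range(r)], axis=1)  # shape (d_out, r)

# A derivative-informed training loss for a surrogate q_theta could then combine
# the output misfit with the sketched Jacobian misfit, e.g.
#   L = ||q_theta(m) - q(m)||^2 + ||J_theta(m) @ Omega - J_sketch||^2,
# so the Jacobian term scales with r rather than with d_in * d_out.
print(J_sketch.shape)
```

The same pattern applies with reduced SVD or reduced basis projections in place of the Gaussian test matrix Omega; only the choice of the r directions changes.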

