SVD Perspectives for Augmenting DeepONet Flexibility and Interpretability

04/27/2022
by Simone Venturi, et al.

Deep operator networks (DeepONets) are powerful architectures for fast and accurate emulation of complex dynamics. As their remarkable generalization capabilities are primarily enabled by their projection-based attribute, we investigate connections with low-rank techniques derived from the singular value decomposition (SVD). We demonstrate that some of the concepts behind proper orthogonal decomposition (POD)-neural networks can improve DeepONet's design and training phases. These ideas lead us to a methodology extension that we name SVD-DeepONet. Moreover, through multiple SVD analyses, we find that DeepONet inherits from its projection-based attribute strong inefficiencies in representing dynamics characterized by symmetries. Inspired by the work on shifted-POD, we develop flexDeepONet, an architecture enhancement that relies on a pre-transformation network to generate a moving reference frame and isolate the rigid components of the dynamics. In this way, the physics can be represented on a latent space free from rotations, translations, and stretches, and an accurate projection can be performed onto a low-dimensional basis. In addition to flexibility and interpretability, the proposed perspectives increase DeepONet's generalization capabilities and computational efficiency. For instance, we show that flexDeepONet can accurately surrogate the dynamics of 19 variables in a combustion chemistry application while relying on 95% fewer trainable parameters than those required by the vanilla architecture. We argue that DeepONet and SVD-based methods can reciprocally benefit from each other. In particular, the flexibility of the former in leveraging multiple data sources and multifidelity knowledge, in the form of both unstructured data and physics-informed constraints, has the potential to greatly extend the applicability of methodologies such as POD and PCA.
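To make the projection-based connection concrete, the sketch below illustrates the shared structure G(u)(y) ≈ Σ_k b_k(u) φ_k(y): in a vanilla DeepONet the φ_k are learned trunk-network outputs, while POD/SVD methods fix them to the leading left singular vectors of a snapshot matrix. The snapshot data, mode count, and projection-based "branch" coefficients here are illustrative assumptions for exposition, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): the projection structure
# G(u)(y) ~ sum_k b_k(u) * phi_k(y) that DeepONet shares with POD/SVD methods.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical snapshot matrix: each column is one solution field on a 1D grid.
n_grid, n_snapshots = 200, 50
y = np.linspace(0.0, 1.0, n_grid)
params = rng.uniform(0.5, 2.0, n_snapshots)                        # stand-ins for input functions u
snapshots = np.array([np.sin(p * np.pi * y) for p in params]).T    # shape (n_grid, n_snapshots)

# SVD/POD: left singular vectors give an orthonormal spatial basis,
# playing the role of the trunk outputs phi_k(y).
U, S, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 5                   # number of retained modes (assumption)
phi = U[:, :r]

# "Branch" coefficients b_k(u): here obtained by orthogonal projection of each
# snapshot onto the basis; in a DeepONet they would be learned by the branch net.
b = phi.T @ snapshots   # shape (r, n_snapshots)

# Low-rank reconstruction G(u)(y) ~ sum_k b_k(u) * phi_k(y)
reconstruction = phi @ b
rel_error = np.linalg.norm(reconstruction - snapshots) / np.linalg.norm(snapshots)
print(f"relative projection error with {r} modes: {rel_error:.2e}")
```

When the dynamics contain rigid motions such as translations or rotations, the error of this fixed-basis projection decays slowly with r; the pre-transformation network in flexDeepONet is aimed at removing those components before the projection is performed.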


