Finite Difference Nets: A Deep Recurrent Framework for Solving Evolution PDEs

by Cheng Chang, et al.

There is a rising trend of adopting deep learning methods to study partial differential equations (PDEs). In this paper, we introduce a deep recurrent framework for solving time-dependent PDEs without generating large-scale data sets. We provide a new perspective, namely a different type of architecture, by exploring the connections between traditional numerical methods (such as finite difference schemes) and deep neural networks, particularly convolutional and fully-connected networks. We demonstrate the effectiveness and efficiency of the proposed approach on PDE models with an integral form; in particular, we test it on one-way wave equations and systems of conservation laws.
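To illustrate the kind of connection between finite difference schemes and convolutional networks that the paper explores, the following sketch (a hypothetical example, not the authors' code) writes one upwind time step for the one-way wave equation u_t + c u_x = 0 both as a classic stencil update and as a convolution with a fixed kernel. The correspondence is that the FD stencil coefficients play the role of convolution weights.

```python
import numpy as np

def upwind_step_fd(u, lam):
    """Classic upwind update: u_j <- u_j - lam * (u_j - u_{j-1}),
    with lam = c * dt / dx and periodic boundary conditions."""
    return u - lam * (u - np.roll(u, 1))

def upwind_step_conv(u, lam):
    """The same update written as a 1-D convolution whose kernel
    [lam, 1 - lam, 0] holds the finite-difference stencil weights."""
    kernel = np.array([lam, 1.0 - lam, 0.0])
    padded = np.concatenate([u[-1:], u, u[:1]])  # periodic padding
    # Sliding correlation: out_j = lam * u_{j-1} + (1 - lam) * u_j
    return np.array([kernel @ padded[j:j + 3] for j in range(len(u))])

# Both formulations produce identical updates on a smooth initial profile.
x = np.linspace(0.0, 1.0, 64, endpoint=False)
u0 = np.sin(2 * np.pi * x)
lam = 0.5  # CFL number c * dt / dx <= 1 for stability
assert np.allclose(upwind_step_fd(u0, lam), upwind_step_conv(u0, lam))
```

In a learned variant, the fixed kernel would be replaced by trainable convolution weights applied recurrently over time steps, which is the structural analogy between explicit time-stepping schemes and recurrent convolutional architectures.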


