Stochastic Gradient Descent for Semilinear Elliptic Equations with Uncertainties

06/10/2020
by Ting Wang, et al.

Randomness is ubiquitous in modern engineering. The uncertainty is often modeled as random coefficients in the differential equations that describe the underlying physics. In this work, we describe a two-step framework for numerically solving semilinear elliptic partial differential equations with random coefficients: 1) reformulate the problem as a functional minimization problem based on the direct method of the calculus of variations; 2) solve the minimization problem using the stochastic gradient descent method. We provide a convergence criterion for the resulting stochastic gradient descent algorithm and discuss some useful techniques to overcome the issues of ill-conditioning and large variance. The accuracy and efficiency of the algorithm are demonstrated by numerical experiments.
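As a rough illustration of this two-step framework (a minimal sketch, not the authors' implementation), the code below treats a 1D model problem -(a(x, xi) u')' + u^3 = f on (0, 1) with u(0) = u(1) = 0 and a single uniform random variable xi. Step 1 uses the Ritz energy J(u) = E[ int( a |u'|^2 / 2 + u^4 / 4 - f u ) dx ], whose minimizer is the weak solution; step 2 discretizes u(x, xi) with finite differences in x and a low-order Legendre chaos expansion in xi, then runs preconditioned SGD with one coefficient sample per iteration. The model problem, chaos truncation, preconditioner, and step-size schedule are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50                                     # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)             # interior nodes
x_mid = np.linspace(0.5 * h, 1.0 - 0.5 * h, n + 1)   # element midpoints
f = 10.0 * np.ones_like(x)                 # deterministic source term (assumed)
n_modes = 3                                # Legendre modes P_0, P_1, P_2 in xi

def legendre_basis(xi):
    """P_0, P_1, P_2 evaluated at xi in [-1, 1]."""
    return np.array([1.0, xi, 0.5 * (3.0 * xi**2 - 1.0)])

def coefficient(xi):
    """One realization of the random diffusion coefficient a(x, xi) (assumed model)."""
    return np.exp(0.5 * xi * np.sin(np.pi * x_mid))

def sample_gradient(u_nodal, a):
    """Gradient of the per-sample discrete Ritz energy
        J_xi(u) = sum_e (h/2) a_e ((u_{e+1} - u_e) / h)^2
                + sum_i h (u_i^4 / 4 - f_i u_i)
    with respect to the interior nodal values of u(., xi)."""
    u_ext = np.concatenate(([0.0], u_nodal, [0.0]))    # homogeneous Dirichlet BCs
    flux = a * np.diff(u_ext) / h                      # a_e u' on each element
    return (flux[:-1] - flux[1:]) + h * (u_nodal**3 - f)

# A fixed preconditioner built from the unit-coefficient stiffness matrix: one
# simple way to ease the ill-conditioning of the discrete energy (an assumption
# here, not necessarily the remedy discussed in the paper).
lap = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h
precond = np.linalg.inv(lap)

# Preconditioned SGD over the chaos coefficients U_j(x), one sample of xi per step.
U = np.zeros((n_modes, n))
for k in range(10000):
    xi = rng.uniform(-1.0, 1.0)
    psi = legendre_basis(xi)               # P_j(xi), j = 0, 1, 2
    u_nodal = psi @ U                      # u(x, xi) = sum_j U_j(x) P_j(xi)
    g = precond @ sample_gradient(u_nodal, coefficient(xi))
    eta = 0.1 / (1.0 + k / 1000.0)         # assumed Robbins-Monro style schedule
    U -= eta * np.outer(psi, g)            # chain rule: dJ/dU_j = P_j(xi) * g

print("mean of u at x = 0.5 (chaos mode 0):", U[0, n // 2])
```

The decaying step size and the fixed preconditioner are only the standard textbook responses to the variance and ill-conditioning issues mentioned in the abstract; the paper itself should be consulted for its actual convergence criterion and the techniques it proposes.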


Related research

07/20/2022
A note on the variation of geometric functionals
Calculus of Variation combined with Differential Geometry as tools of mo...

10/20/2021
Adaptive Gradient Descent for Optimal Control of Parabolic Equations with Random Parameters
In this paper we extend the adaptive gradient descent (AdaGrad) algorith...

09/30/2017
The Deep Ritz method: A deep learning-based numerical algorithm for solving variational problems
We propose a deep learning based method, the Deep Ritz Method, for numer...

03/24/2022
Local optimisation of Nyström samples through stochastic gradient descent
We study a relaxed version of the column-sampling problem for the Nyströ...

04/23/2022
Competitive Physics Informed Networks
Physics Informed Neural Networks (PINNs) solve partial differential equa...