Progressive transfer learning for low frequency data prediction in full waveform inversion

12/20/2019
by Wenyi Hu, et al.

To effectively suppress the cycle-skipping phenomenon in full waveform inversion (FWI), we developed a Deep Neural Network (DNN) approach that predicts the absent low-frequency components by exploiting the implicit relation connecting the low-frequency and high-frequency data through the subsurface geological and geophysical properties. To solve this challenging nonlinear regression problem, we proposed two novel strategies for the DNN architecture and the learning workflow: 1) Dual Data Feed and 2) Progressive Transfer Learning. With the Dual Data Feed structure, both the high-frequency data and the corresponding Beat Tone data are fed into the DNN to relieve the burden of feature extraction, reducing the network complexity and the training cost. The second strategy, Progressive Transfer Learning, enables us to train the DNN without bias using a single training dataset. Unlike most established deep learning approaches, in which the training dataset is fixed, within the Progressive Transfer Learning framework the training dataset evolves iteratively, gradually absorbing the subsurface information retrieved by the physics-based inversion module; this progressively enhances the prediction accuracy of the DNN and propels the FWI process out of local minima. By alternately updating the training velocity model and the DNN parameters in a complementary fashion toward convergence, Progressive Transfer Learning avoids the otherwise tremendous amount of training data that would be required, as well as the underfitting and biased-sampling issues. Numerical experiments validated that, without any a priori geological information, the low-frequency data predicted by Progressive Transfer Learning are sufficiently accurate for an FWI engine to produce reliable subsurface velocity models free of cycle-skipping-induced artifacts.
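For concreteness, the sketch below (not from the paper) illustrates how the two strategies could fit together, assuming PyTorch. The DualDataFeedNet class, the synthesize_shot_data and run_fwi_iteration helpers, the layer sizes, and the hyperparameters are all illustrative assumptions standing in for the authors' network, their forward-modeling engine, and their physics-based FWI module.

```python
# Minimal sketch of Dual Data Feed + Progressive Transfer Learning,
# assuming PyTorch. synthesize_shot_data() and run_fwi_iteration() are
# hypothetical placeholders for the forward-modeling and FWI engines.

import torch
import torch.nn as nn


class DualDataFeedNet(nn.Module):
    """Dual Data Feed: high-frequency traces and the corresponding Beat Tone
    traces enter through separate branches before a shared regression head
    predicts the missing low-frequency traces. Layer sizes are illustrative."""

    def __init__(self, n_samples: int):
        super().__init__()
        self.high_branch = nn.Sequential(nn.Linear(n_samples, 256), nn.ReLU())
        self.beat_branch = nn.Sequential(nn.Linear(n_samples, 256), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, n_samples)
        )

    def forward(self, high_freq, beat_tone):
        features = torch.cat(
            [self.high_branch(high_freq), self.beat_branch(beat_tone)], dim=-1
        )
        return self.head(features)


def progressive_transfer_learning(model, velocity_model, observed_high,
                                  observed_beat, n_outer=10, n_epochs=50):
    """Alternate between (1) regenerating the training set from the current
    velocity model, (2) retraining the DNN, and (3) running FWI with the
    predicted low-frequency data to update the velocity model."""
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for _ in range(n_outer):
        # 1) The training data evolve with the current velocity model
        #    (assumed helper returning high-frequency, Beat Tone, and
        #    low-frequency synthetic shot gathers).
        high, beat, low = synthesize_shot_data(velocity_model)

        # 2) Retrain the DNN on the refreshed synthetic dataset.
        for _ in range(n_epochs):
            optimizer.zero_grad()
            loss = loss_fn(model(high, beat), low)
            loss.backward()
            optimizer.step()

        # 3) Predict low frequencies for the field data and pass them to the
        #    physics-based inversion module (assumed helper) to update the
        #    training velocity model.
        with torch.no_grad():
            predicted_low = model(observed_high, observed_beat)
        velocity_model = run_fwi_iteration(velocity_model, predicted_low)

    return model, velocity_model
```

The point of the alternation, as described in the abstract, is that the network never needs field low-frequency data: each outer iteration refreshes the synthetic training pairs from the latest velocity model, so the DNN prediction and the physics-based inversion improve each other in turn.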
