StorSeismic: A new paradigm in deep learning for seismic processing

04/30/2022
by Randy Harsuko, et al.

Machine learning tasks on seismic data are often trained sequentially and separately, even though they rely on the same (e.g., geometrical) features of the data. We present StorSeismic, a framework for seismic data processing that consists of neural network pre-training and fine-tuning procedures. Specifically, we use a neural network as a preprocessing model that stores the seismic features of a particular dataset for use in any downstream task. After pre-training, the resulting model can later be fine-tuned to perform a task with limited additional training. BERT (Bidirectional Encoder Representations from Transformers), a Transformer-based model widely used in Natural Language Processing (NLP) and, more recently, in vision tasks, provides an optimal platform for this framework. The attention mechanism of BERT, applied here to a sequence of traces within a shot gather, captures and stores the key geometrical features of the seismic data. We pre-train StorSeismic on field data, along with synthetically generated data, in a self-supervised step. We then use labeled synthetic data to fine-tune the pre-trained network in a supervised fashion to perform various seismic processing tasks, such as denoising, velocity estimation, first-arrival picking, and normal moveout (NMO) correction. Finally, the fine-tuned model yields satisfactory inference results on the field data.
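To make the pre-train/fine-tune workflow concrete, here is a minimal PyTorch sketch of the idea described above: each trace in a shot gather is treated as one "token", a Transformer encoder self-attends across the traces, and the network is pre-trained by reconstructing randomly masked traces. All names, layer sizes, and masking details are illustrative assumptions, not the authors' actual implementation (positional encoding and training loops are omitted for brevity).

```python
# Hypothetical sketch of a StorSeismic-style masked-trace pre-training step.
import torch
import torch.nn as nn

class TraceBERT(nn.Module):
    """BERT-like encoder over a shot gather: each trace is one 'token'."""

    def __init__(self, n_samples=512, d_model=256, n_heads=8, n_layers=6):
        super().__init__()
        self.embed = nn.Linear(n_samples, d_model)        # trace -> token embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_samples)         # reconstruction head

    def forward(self, gather):                            # (batch, n_traces, n_samples)
        h = self.encoder(self.embed(gather))              # self-attention across traces
        return self.head(h)

model = TraceBERT()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Self-supervised pre-training: zero out ~15% of the traces and learn to
# reconstruct them from the surrounding traces (toy random data here).
gather = torch.randn(4, 64, 512)                          # 4 gathers, 64 traces each
mask = torch.rand(4, 64) < 0.15                           # boolean trace mask
masked = gather.clone()
masked[mask] = 0.0
pred = model(masked)
loss = nn.functional.mse_loss(pred[mask], gather[mask])   # loss on masked traces only
loss.backward()
opt.step()
```

For fine-tuning, the reconstruction head would be replaced (or supplemented) by a task-specific head, e.g., for denoising, first-arrival picking, or velocity estimation, and trained on labeled synthetic data before inference on the field data.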

Related research

MidiBERT-Piano: Large-scale Pre-training for Symbolic Music Understanding (07/12/2021)
This paper presents an attempt to employ the mask language modeling appr...

Optimizing a Transformer-based network for a deep learning seismic processing workflow (08/09/2023)
StorSeismic is a recently introduced model based on the Transformer to a...

Towards Vision Transformer Unrolling Fixed-Point Algorithm: a Case Study on Image Restoration (01/29/2023)
The great success of Deep Neural Networks (DNNs) has inspired the algori...

Realistic Channel Models Pre-training (07/22/2019)
In this paper, we propose a neural-network-based realistic channel model...

CHAPTER: Exploiting Convolutional Neural Network Adapters for Self-supervised Speech Models (12/01/2022)
Self-supervised learning (SSL) is a powerful technique for learning repr...

MiniVLM: A Smaller and Faster Vision-Language Model (12/13/2020)
Recent vision-language (VL) studies have shown remarkable progress by le...

RPT: Relational Pre-trained Transformer Is Almost All You Need towards Democratizing Data Preparation (12/04/2020)
Can AI help automate human-easy but computer-hard data preparation tasks...
