Temporal Subsampling Diminishes Small Spatial Scales in Recurrent Neural Network Emulators of Geophysical Turbulence

04/28/2023
by Timothy A. Smith, et al.

The immense computational cost of traditional numerical weather and climate models has sparked the development of machine learning (ML) based emulators. Because ML methods benefit from long records of training data, it is common to use datasets that are temporally subsampled relative to the time steps required for the numerical integration of differential equations. Here, we investigate how this often-overlooked processing step affects the quality of an emulator's predictions. We implement two ML architectures from a class of methods called reservoir computing: (1) a form of Nonlinear Vector Autoregression (NVAR), and (2) an Echo State Network (ESN). Despite their simplicity, it is well documented that these architectures excel at predicting low-dimensional chaotic dynamics. We are therefore motivated to test these architectures in an idealized setting of predicting high-dimensional geophysical turbulence as represented by Surface Quasi-Geostrophic dynamics. In all cases, subsampling the training data consistently leads to an increased bias at small spatial scales that resembles numerical diffusion. Interestingly, the NVAR architecture becomes unstable when the temporal resolution is increased, indicating that the polynomial-based interactions are insufficient at capturing the detailed nonlinearities of the turbulent flow. The ESN architecture is found to be more robust, suggesting a benefit to the more expensive but more general structure. Spectral errors are reduced by including a penalty on the kinetic energy density spectrum during training, although the subsampling-related errors persist. Future work is warranted to understand how the temporal resolution of training data affects other ML architectures.
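To make the two reservoir-computing architectures named above concrete, here is a minimal sketch of both: an NVAR (a readout over a constant term, delayed states, and their quadratic products) and an ESN (a fixed random leaky-tanh reservoir with a trained linear readout). This is not the authors' implementation; all function names, hyperparameters, and the logistic map used as a stand-in for the turbulence training data are illustrative assumptions.

```python
import numpy as np

# Toy chaotic series (logistic map) as a stand-in for the paper's
# turbulence data; parameters are illustrative, not from the paper.
def make_series(n, x0=0.5, r=3.9):
    x = np.empty(n)
    x[0] = x0
    for i in range(n - 1):
        x[i + 1] = r * x[i] * (1.0 - x[i])
    return x

def ridge_fit(X, y, lam=1e-8):
    # Linear readout via ridge regression, shared by both architectures.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# --- NVAR: constant + k delayed states + their quadratic products ---
def nvar_features(u, k=2):
    T = len(u)
    delays = np.stack([u[k - 1 - j : T - j] for j in range(k)], axis=1)
    quad = np.stack([delays[:, i] * delays[:, j]
                     for i in range(k) for j in range(i, k)], axis=1)
    return np.hstack([np.ones((T - k + 1, 1)), delays, quad])

# --- ESN: fixed random leaky-tanh reservoir, only the readout is trained ---
def esn_states(u, n_res=200, rho=0.9, leak=0.5, seed=1):
    rng = np.random.default_rng(seed)
    Win = rng.uniform(-0.5, 0.5, size=n_res)
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius
    r = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        r = (1 - leak) * r + leak * np.tanh(W @ r + Win * ut)
        states[t] = r
    return states

u = make_series(2000)

# NVAR one-step prediction: the logistic map lies exactly in the
# quadratic feature span, so the fit is essentially perfect.
X = nvar_features(u, k=2)
w = ridge_fit(X[:-1], u[2:])
rmse_nvar = np.sqrt(np.mean((X[:-1] @ w - u[2:]) ** 2))

# ESN one-step prediction after discarding a washout transient.
S = esn_states(u)
wash = 100
wout = ridge_fit(S[wash:-1], u[wash + 1:], lam=1e-6)
rmse_esn = np.sqrt(np.mean((S[wash:-1] @ wout - u[wash + 1:]) ** 2))

print(f"NVAR one-step RMSE: {rmse_nvar:.2e}")
print(f"ESN  one-step RMSE: {rmse_esn:.2e}")
```

The contrast the abstract draws is visible in the structure: the NVAR's expressiveness is fixed by its polynomial feature set, while the ESN's random recurrent reservoir provides a more general (and more expensive) nonlinear basis, which the paper finds more robust at finer temporal resolution.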


