Probing the limit of hydrologic predictability with the Transformer network

06/21/2023
by   Jiangtao Liu, et al.

For a number of years since their introduction to hydrology, recurrent neural networks such as the long short-term memory (LSTM) network have proven remarkably difficult to surpass in terms of daily hydrograph metrics on known, comparable benchmarks. Outside of hydrology, Transformers have become the model of choice for sequential prediction tasks, making them a natural architecture to investigate. Here, we first show that a vanilla Transformer architecture is not competitive with LSTM on the widely benchmarked CAMELS dataset, lagging especially on the high-flow metrics associated with short-term processes. However, a recurrence-free variant of the Transformer obtains mixed results in comparison with LSTM, producing the same Kling-Gupta efficiency coefficient (KGE) along with similar values for other metrics. The lack of advantage for the Transformer is linked to the Markovian nature of the hydrologic prediction problem. Like LSTM, the Transformer can also merge multiple forcing datasets to improve model performance. While the Transformer's results do not exceed the current state of the art, we still learned some valuable lessons: (1) the vanilla Transformer architecture is not suitable for hydrologic modeling; (2) the proposed recurrence-free modification improves Transformer performance, so future work can continue to test such modifications; and (3) the prediction limits of the dataset are likely close to the performance of current state-of-the-art models. As a non-recurrent model, the Transformer may bear scale advantages for learning from bigger datasets and storing knowledge. This work serves as a reference point for future modifications of the model.
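The abstract does not include code, but a minimal sketch may help make the comparison concrete. Below is a hypothetical PyTorch encoder-only Transformer for daily streamflow regression on CAMELS-style inputs. The class name, the choice of 5 meteorological forcings and 27 static catchment attributes, and all hyperparameters are illustrative assumptions, not the authors' configuration (which additionally differs through its recurrence-free modifications).

```python
import torch
import torch.nn as nn

class StreamflowTransformer(nn.Module):
    """Illustrative Transformer-encoder regressor for daily streamflow.

    All sizes (d_model, nhead, num_layers, seq_len, ...) are assumed
    defaults for the sketch, not values taken from the paper.
    """
    def __init__(self, n_forcings=5, n_attrs=27, d_model=64,
                 nhead=4, num_layers=2, seq_len=365):
        super().__init__()
        # Project daily forcings + static attributes into the model width.
        self.input_proj = nn.Linear(n_forcings + n_attrs, d_model)
        # Learned positional embedding over the lookback window.
        self.pos_emb = nn.Parameter(torch.zeros(1, seq_len, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)  # one discharge value per day

    def forward(self, forcings, attrs):
        # forcings: (batch, seq_len, n_forcings); attrs: (batch, n_attrs)
        T = forcings.size(1)
        # Broadcast static attributes along the time axis and concatenate.
        attrs = attrs.unsqueeze(1).expand(-1, T, -1)
        x = self.input_proj(torch.cat([forcings, attrs], dim=-1))
        # Causal mask so each day only attends to past and present forcings.
        causal = nn.Transformer.generate_square_subsequent_mask(T)
        x = self.encoder(x + self.pos_emb[:, :T], mask=causal)
        return self.head(x).squeeze(-1)  # (batch, seq_len) predicted flow

model = StreamflowTransformer()
q_hat = model(torch.randn(8, 365, 5), torch.randn(8, 27))  # -> (8, 365)
```

The causal mask is what makes the sketch usable for prediction rather than smoothing; without it, a standard encoder would attend bidirectionally and leak future meteorology into each day's estimate.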


