Inverse Approximation Theory for Nonlinear Recurrent Neural Networks

05/30/2023
by Shida Wang, et al.

We prove an inverse approximation theorem for the approximation of nonlinear sequence-to-sequence relationships using RNNs. This is a so-called Bernstein-type result in approximation theory, which deduces properties of a target function under the assumption that it can be effectively approximated by a hypothesis space. In particular, we show that nonlinear sequence relationships, viewed as functional sequences, that can be stably approximated by RNNs with hardtanh/tanh activations must have an exponentially decaying memory structure, a notion that can be made precise. This extends the previously identified curse of memory in linear RNNs to the general nonlinear setting, and quantifies the essential limitations of the RNN architecture for learning sequential relationships with long-term memory. Based on this analysis, we propose a principled reparameterization method to overcome these limitations. Our theoretical results are confirmed by numerical experiments.
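The two technical notions in the abstract, exponentially decaying memory and stable reparameterization, can be illustrated on a toy diagonal linear recurrence. The sketch below is a minimal illustration, not the paper's construction: the mapping stable_reparam is a hypothetical sigmoid-based choice that keeps every recurrent eigenvalue in (-1, 0), and the impulse response it produces exhibits the geometric memory decay that the theorem identifies.

```python
import numpy as np

# Toy diagonal recurrence h_{t+1} = lam * h_t + x_t. With |lam_i| < 1,
# the influence of an input at t = 0 on the state at time t decays
# geometrically: the "exponentially decaying memory" structure.

def stable_reparam(theta):
    """Hypothetical reparameterization: map unconstrained theta to
    eigenvalues in (-1, 0), so the recurrence is stable for any theta.
    (The paper's exact parameterization may differ.)"""
    return -np.exp(theta) / (1.0 + np.exp(theta))

rng = np.random.default_rng(0)
theta = rng.normal(size=8)          # unconstrained trainable parameters
lam = stable_reparam(theta)         # recurrent eigenvalues, all in (-1, 0)

h = np.zeros(8)
impulse_response = []
for t in range(50):
    x_t = 1.0 if t == 0 else 0.0    # unit impulse at t = 0
    h = lam * h + x_t               # diagonal state update
    impulse_response.append(np.abs(h).max())

# |h_t| <= max_i |lam_i|**t, so the trace decays geometrically no matter
# how theta is chosen: the parameterization itself enforces stability.
print(impulse_response[::10])
```

Because the parameterization maps every value of theta into the stable region, gradient-based training can never produce an unstable recurrence; this is the spirit of the principled reparameterization the abstract mentions, though the exact mapping used in the paper may differ.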

Related research

09/16/2020 · On the Curse of Memory in Recurrent Neural Networks: Approximation and Optimization Analysis
We study the approximation properties and optimization dynamics of recur...

05/29/2023 · Approximation theory of transformer networks for sequence modeling
The transformer is a widely applied architecture in sequence modeling ap...

05/29/2023 · Forward and Inverse Approximation Theory for Linear Temporal Convolutional Networks
We present a theoretical analysis of the approximation properties of con...

05/10/2023 · Frequency-Supported Neural Networks for Nonlinear Dynamical System Identification
Neural networks are a very general type of model capable of learning var...

06/13/2016 · Neural Associative Memory for Dual-Sequence Modeling
Many important NLP problems can be posed as dual-sequence or sequence-to...

10/25/2018 · Reversible Recurrent Neural Networks
Recurrent neural networks (RNNs) provide state-of-the-art performance in...

01/19/2021 · Implicit Bias of Linear RNNs
Contemporary wisdom based on empirical studies suggests that standard re...
