Universal Approximation of Input-Output Maps by Temporal Convolutional Nets

06/21/2019
by   Joshua Hanson, et al.

There has been a recent shift in sequence-to-sequence modeling from recurrent network architectures to convolutional network architectures, driven by computational advantages in training and operation while still achieving competitive performance. For systems with limited long-term temporal dependencies, the approximation capability of recurrent networks is essentially equivalent to that of temporal convolutional nets (TCNs). We prove that TCNs can approximate a large class of input-output maps having approximately finite memory to arbitrary error tolerance. Furthermore, we derive quantitative approximation rates for deep ReLU TCNs in terms of the width and depth of the network and the modulus of continuity of the original input-output map, and we apply these results to input-output maps of systems that admit finite-dimensional state-space realizations (i.e., recurrent models).
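The building block behind a TCN is a stack of causal, dilated 1-D convolutions with ReLU activations, whose receptive field grows with depth. The following is a minimal numpy sketch of that mechanism; the kernel values, layer configuration, and function names are illustrative and not taken from the paper.

```python
import numpy as np

def causal_dilated_conv(x, w, dilation):
    """Causal 1-D convolution: the output at time t depends only on
    x[t], x[t-d], x[t-2d], ... (no future samples).
    x: (T,) input sequence; w: (k,) kernel taps, w[0] on the most recent sample."""
    T, k = len(x), len(w)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), x])  # left-pad so the output stays causal
    y = np.zeros(T)
    for t in range(T):
        # taps look back in steps of `dilation` samples
        y[t] = sum(w[j] * xp[pad + t - j * dilation] for j in range(k))
    return y

def relu(x):
    return np.maximum(x, 0.0)

def tcn(x, layers):
    """Stack of causal dilated conv layers with ReLU activations.
    `layers` is a list of (kernel, dilation) pairs; doubling the dilation
    per layer makes the receptive field grow exponentially with depth."""
    h = x
    for w, d in layers:
        h = relu(causal_dilated_conv(h, w, d))
    return h
```

Because every layer is causal, perturbing a future input sample cannot change earlier outputs, which is the sense in which a TCN realizes a causal input-output map with (approximately) finite memory.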


Related research

10/30/2019 · Input-Output Equivalence of Unitary and Contractive RNNs
Unitary recurrent neural networks (URNNs) have been proposed as a method...

09/23/2022 · Achieve the Minimum Width of Neural Networks for Universal Approximation
The universal approximation property (UAP) of neural networks is fundame...

01/07/2019 · Learning Nonlinear Input-Output Maps with Dissipative Quantum Systems
In this paper, we develop a theory of learning nonlinear input-output ma...

05/24/2022 · Realization Theory Of Recurrent Neural ODEs Using Polynomial System Embeddings
In this paper we show that neural ODE analogs of recurrent (ODE-RNN) and...

07/19/2021 · Sequence-to-Sequence Piano Transcription with Transformers
Automatic Music Transcription has seen significant progress in recent ye...

05/03/2023 · Input-Output Feedback Linearization Preserving Task Priority for Multivariate Nonlinear Systems Having Singular Input Gain Matrix
We propose an extension of the input-output feedback linearization for a...

07/23/2020 · Dimension reduction in recurrent networks by canonicalization
Many recurrent neural network machine learning paradigms can be formulat...
