Discrete Function Bases and Convolutional Neural Networks

03/09/2021
by Andreas Stöckel, et al.

We discuss the notion of "discrete function bases" with a particular focus on the discrete basis derived from the Legendre Delay Network (LDN). We characterize the performance of these bases in a delay computation task and as fixed temporal convolutions in neural networks. Networks using fixed temporal convolutions are conceptually simple and yield state-of-the-art results in tasks such as psMNIST.

Main Results:

(1) We present a numerically stable algorithm for constructing a matrix L of discrete Legendre orthogonal polynomials (DLOPs) in O(qN) operations (see the first sketch after this list).

(2) The Legendre Delay Network (LDN) can be used to form a discrete function basis with a basis transformation matrix H in O(qN) operations.

(3) If q < 300, convolving with the LDN basis online has a lower run-time complexity than convolving with arbitrary FIR filters (see the LDN sketch below).

(4) Sliding-window transformations exist for some bases (Haar, cosine, Fourier); they require O(q) operations per sample and O(N) memory (see the sliding DFT sketch below).

(5) LTI systems similar to the LDN can be constructed for many discrete function bases; the LDN system is superior in that it has a finite impulse response.

(6) We compare discrete function bases by linearly decoding delays from signals represented with respect to these bases (see the decoding sketch below). Results are depicted in Figure 20. Overall, the decoding errors are similar; the LDN basis has the highest errors, while the Fourier and cosine bases have the smallest.

(7) The Fourier and cosine bases feature a uniform decoding error across all delays. These bases should be used if the signal can be represented well in the Fourier domain.

(8) Neural network experiments suggest that fixed temporal convolutions can outperform learned convolutions. The choice of basis is not critical; we observe roughly the same performance trends as in the delay task.

(9) The LDN is the right choice for small q, if the O(q) Euler update is feasible, and if the low O(q) memory requirement is important.
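
As an illustration of result (1), the following is a minimal sketch of how a q × N DLOP matrix can be built in O(qN) via a three-term recurrence. It assumes a Neuman–Schonbach-style recurrence on the grid x = 0, …, N−1 (with q ≤ N) and unit-norm rows; the function name is ours, and the paper's exact normalization and stabilization scheme may differ.

```python
import numpy as np

def dlop_basis(q, N):
    """Sketch: q x N matrix of discrete Legendre orthogonal polynomials
    (DLOPs), built row by row with a three-term recurrence in O(qN)."""
    x = np.arange(N)
    L = np.empty((q, N))
    L[0] = 1.0
    if q > 1:
        L[1] = 1.0 - 2.0 * x / (N - 1)
    for n in range(1, q - 1):
        # Neuman-Schonbach-style recurrence on the grid x = 0, ..., N - 1
        L[n + 1] = ((2 * n + 1) * (N - 1 - 2 * x) * L[n]
                    - n * (N + n) * L[n - 1]) / ((n + 1) * (N - 1 - n))
    # normalising each row keeps the basis orthonormal over the grid
    return L / np.linalg.norm(L, axis=1, keepdims=True)
```

For the tiny case dlop_basis(3, 3), the rows come out proportional to (1, 1, 1), (1, 0, −1), and (1, −2, 1), which are mutually orthogonal, as expected.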
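
For results (2), (3), and (9), the sketch below constructs the standard LDN/LMU state-space matrices (following Voelker et al.) and performs a per-sample Euler update. A dense update costs O(q²) per sample; the `ldn_matvec` helper is our own illustration of why O(q) is possible, exploiting the sign structure a_ij = −(2i+1)(−1)^(i+j) for i ≥ j and a_ij = −(2i+1) for i < j via suffix and alternating prefix sums. It is not necessarily the paper's exact scheme.

```python
import numpy as np

def ldn_system(q, theta=1.0):
    """Standard LDN/LMU state-space matrices (Voelker et al.)."""
    k = np.arange(q)
    i, j = k[:, None], k[None, :]
    A = (2 * i + 1) * np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) / theta
    B = (2 * k + 1) * (-1.0) ** k / theta
    return A, B

def ldn_matvec(x, theta=1.0):
    """O(q) evaluation of A @ x using suffix sums and alternating
    prefix sums; a sketch, not necessarily the paper's scheme."""
    q = len(x)
    i = np.arange(q)
    suffix = np.cumsum(x[::-1])[::-1]                    # sum_{j >= i} x_j
    tail = np.concatenate((suffix[1:], [0.0]))           # sum_{j > i} x_j
    alt = np.cumsum((-1.0) ** i * x)                     # sum_{j <= i} (-1)^j x_j
    return (2 * i + 1) * (-tail - (-1.0) ** i * alt) / theta

def euler_step(x, u, dt, theta=1.0):
    """One forward-Euler step of the LDN state in O(q)."""
    k = np.arange(len(x))
    B = (2 * k + 1) * (-1.0) ** k / theta
    return x + dt * (ldn_matvec(x, theta) + B * u)
```

Feeding samples through `euler_step` makes the state track the coefficients of the sliding input window with respect to the LDN basis, which is what makes the online convolution in result (3) cheap for small q.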
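
Result (4) refers to sliding-window transformations. As a concrete example of the O(q)-per-sample, O(N)-memory pattern, here is a minimal sliding DFT sketch: each incoming sample updates all q retained Fourier coefficients with one multiply-add each, while a ring buffer of length N supplies the sample leaving the window. The Haar and cosine variants follow the same pattern with different update rules; the function name is ours.

```python
import numpy as np
from collections import deque

def sliding_dft(signal, N, q):
    """Sliding DFT: O(q) work per sample, O(N) memory for the window.
    Yields the first q DFT coefficients of the most recent N samples."""
    coeffs = np.zeros(q, dtype=complex)
    window = deque([0.0] * N, maxlen=N)  # ring buffer over the window
    twiddle = np.exp(2j * np.pi * np.arange(q) / N)
    for sample in signal:
        oldest = window[0]       # sample about to leave the window
        window.append(sample)    # deque with maxlen drops the oldest
        coeffs = (coeffs + sample - oldest) * twiddle
        yield coeffs
```

In floating point this recurrence accumulates rounding error over long runs; practical implementations often damp the twiddle factors slightly or periodically recompute the transform from the buffered window.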
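
The delay-decoding comparison in result (6) reduces to a simple linear readout once a basis is fixed: if T is a q × N matrix with orthonormal rows and m = T w are the coefficients of the current window w, then the least-squares estimate of the sample delayed by d steps is the inner product with the d-th basis column. A minimal sketch under these assumptions (variable names are ours):

```python
import numpy as np

def decode_delay(m, T, d):
    """Approximate the input from d samples ago, given the coefficients
    m = T @ w of the window w under a basis T with orthonormal rows."""
    return T[:, d] @ m
```

For example, with T = dlop_basis(q, N) from the first sketch, sweeping d over 0, …, N−1 traces out the per-delay decoding error compared in Figure 20.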
