SITHCon: A neural network robust to variations in input scaling on the time dimension

07/09/2021
by   Brandon G. Jacques, et al.

In machine learning, convolutional neural networks (CNNs) have been extremely influential both in computer vision and in recognizing patterns extended over time. In computer vision, part of this flexibility arises from the use of max-pooling operations over the convolutions to attain translation invariance. In the mammalian brain, neural representations of time use a set of temporal basis functions. Critically, these basis functions appear to be arranged in a geometric series, such that the basis set is evenly distributed over logarithmic time. This paper introduces a Scale-Invariant Temporal History Convolution network (SITHCon) that uses a logarithmically-distributed temporal memory. A max-pool over a logarithmically-distributed temporal memory results in scale-invariance in time. We compare the performance of SITHCon to a Temporal Convolutional Network (TCN) and demonstrate that, although both networks can learn classification and regression problems on both univariate and multivariate time series f(t), only SITHCon has the property that it generalizes without retraining to rescaled versions of the input f(at). This property, inspired by findings from neuroscience and psychology, could lead to large-scale networks with dramatically different capabilities, including faster training and greater generalizability, even with significantly fewer free parameters.
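The core mechanism can be illustrated with a minimal NumPy sketch (this is an idealized toy, not the paper's SITHCon implementation): sampling a signal's history at lags arranged in a geometric series makes a rescaling of time, f(t) to f(at), appear as a simple shift along the log-time axis, so a max-pool over that axis is unchanged.

```python
import numpy as np

# Geometric grid of time lags: tau_i = tau0 * g**i, evenly spaced in log time.
g = 2.0
tau = 1.0 * g ** np.arange(12)  # lags 1, 2, 4, ..., 2048

def log_time_rep(f, scale=1.0):
    """Sample f(scale * tau_i) on the geometric lag grid (a toy log-time memory)."""
    return f(scale * tau)

# A toy signal: a bump centered (in log time) at t = 64.
f = lambda t: np.exp(-(np.log(t) - np.log(64.0)) ** 2)

r1 = log_time_rep(f, scale=1.0)   # representation of f(t)
r2 = log_time_rep(f, scale=4.0)   # representation of the rescaled input f(4t)

# Rescaling time by g**k shifts the log-time representation by k indices:
k = int(np.log(4.0) / np.log(g))  # k = 2 here
assert np.allclose(r2[:-k], r1[k:])

# A max-pool over the log-time axis is shift-invariant, hence scale-invariant
# (as long as the relevant feature stays inside the covered range of lags):
assert np.isclose(r1.max(), r2.max())
```

A linear (evenly spaced) lag grid would not have this property: rescaling time stretches the representation instead of shifting it, so a max-pool trained at one scale need not respond the same way at another.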


Related research

- DeepSITH: Efficient Learning via Decomposition of What and When Across Time Scales (04/09/2021). Extracting temporal relationships over a range of scales is a hallmark o...
- Convolutional Neural Networks Are Not Invariant to Translation, but They Can Learn to Be (10/12/2021). When seeing a new object, humans can immediately recognize it across dif...
- Patch Reordering: a Novel Way to Achieve Rotation and Translation Invariance in Convolutional Neural Networks (11/28/2019). Convolutional Neural Networks (CNNs) have demonstrated state-of-the-art ...
- Sorted Convolutional Network for Achieving Continuous Rotational Invariance (05/23/2023). The topic of achieving rotational invariance in convolutional neural net...
- Imaging Time-Series to Improve Classification and Imputation (06/01/2015). Inspired by recent successes of deep learning in computer vision, we pro...
- SPIN: Simplifying Polar Invariance for Neural networks Application to vision-based irradiance forecasting (11/29/2021). Translational invariance induced by pooling operations is an inherent pr...
- Convolutional Neural Networks: Basic Concepts and Applications in Manufacturing (10/14/2022). We discuss basic concepts of convolutional neural networks (CNNs) and ou...
