Learning Representations from Temporally Smooth Data

12/12/2020
by Shima Rahimi Moghaddam, et al.

Events in the real world are correlated across nearby points in time, and we must learn from this temporally smooth data. However, when neural networks are trained to categorize or reconstruct single items, the common practice is to randomize the order of training items. What are the effects of temporally smooth training data on the efficiency of learning? We first tested the effects of smoothness in training data on incremental learning in feedforward nets and found that smoother data slowed learning. Moreover, sampling so as to minimize temporal smoothness produced more efficient learning than sampling randomly. If smoothness generally impairs incremental learning, then how can networks be modified to benefit from smoothness in the training data? We hypothesized that two simple brain-inspired mechanisms, leaky memory in activation units and memory-gating, could enable networks to rapidly extract useful representations from smooth data. Across all levels of data smoothness, these brain-inspired architectures achieved more efficient category learning than feedforward networks. This advantage persisted even when leaky memory networks with gating were trained on smooth data and tested on randomly ordered data. Finally, we investigated how these brain-inspired mechanisms altered the internal representations learned by the networks. We found that networks with multi-scale leaky memory and memory-gating could learn internal representations that un-mixed data sources varying on fast and slow timescales across training samples. Altogether, we identified simple mechanisms enabling neural networks to learn more quickly from temporally smooth data, and to generate internal representations that separate timescales in the training signal.
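
The two brain-inspired mechanisms named in the abstract reduce to a simple state update: a leaky-memory unit maintains a running blend of its past activation and the current input, h_t = lam * h_{t-1} + (1 - lam) * x_t, and a memory gate flushes that state at event boundaries. The Python sketch below is purely illustrative and is not the authors' released code; the decay constant `lam`, the change-detection reset rule, and the function name are assumptions made for exposition.

```python
import numpy as np

def leaky_memory_with_gate(inputs, lam=0.9, reset_threshold=1.0):
    """Illustrative leaky-memory unit with a simple memory gate.

    inputs: array of shape (T, D), one training sample per timestep.
    lam: leak (decay) constant; values near 1 give slower timescales.
    reset_threshold: the gate flushes memory when the input shifts
        abruptly (a stand-in for gating at category boundaries).
    """
    T, D = inputs.shape
    h = np.zeros(D)
    states = np.empty_like(inputs, dtype=float)
    prev_x = inputs[0]
    for t in range(T):
        x = inputs[t]
        # Memory gate: reset the leaky state at abrupt input changes.
        if np.linalg.norm(x - prev_x) > reset_threshold:
            h = np.zeros(D)
        # Leaky integration: blend past state with the current input.
        h = lam * h + (1.0 - lam) * x
        states[t] = h
        prev_x = x
    return states
```

Under this reading, the multi-scale leaky memory of the paper's final experiments would correspond to running units with several different `lam` values in parallel, so that fast- and slow-varying components of the input stream are carried by different units.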

Related research

11/28/2017

FearNet: Brain-Inspired Model for Incremental Learning

Incremental class learning involves sequentially learning classes in bur...
09/26/2019

Mildly Overparametrized Neural Nets can Memorize Training Data Efficiently

It has been observed (Zhang et al., 2016) that deep neural networks ca...
10/29/2020

Collaborative Method for Incremental Learning on Classification and Generation

Although well-trained deep neural networks have shown remarkable perform...
03/05/2019

Learning a smooth kernel regularizer for convolutional neural networks

Modern deep neural networks require a tremendous amount of data to train...
05/28/2018

Lifelong Learning of Spatiotemporal Representations with Dual-Memory Recurrent Self-Organization

Humans excel at continually acquiring and fine-tuning knowledge over sus...
06/13/2017

Temporally Efficient Deep Learning with Spikes

The vast majority of natural sensory data is temporally redundant. Video...
12/11/2021

Smooth-Swap: A Simple Enhancement for Face-Swapping with Smoothness

In recent years, face-swapping models have progressed in generation qual...