A general approach to progressive learning

04/27/2020
by Joshua T. Vogelstein, et al.

In biological learning, data is used to improve performance on the task at hand, while simultaneously improving performance on both previously encountered tasks and as yet unconsidered future tasks. In contrast, classical machine learning starts from a blank slate, or tabula rasa, using data only for the single task at hand. While typical transfer learning algorithms can improve performance on future tasks, their performance degrades upon learning new tasks. Many recent approaches have attempted to mitigate this issue, called catastrophic forgetting, to maintain performance given new tasks. But striving to avoid forgetting sets the goal unnecessarily low: the goal of progressive learning, whether biological or artificial, is to improve performance on all tasks (including past and future) with any new data. We propose a general approach to progressive learning that ensembles representations, rather than learners. We show that ensembling representations—including representations learned by decision forests or neural networks—enables both forward and backward transfer on a variety of simulated and real data tasks, including vision, language, and adversarial tasks. This work suggests that further improvements in progressive learning may follow from a deeper understanding of how biological learning achieves such high degrees of efficiency.
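The core idea, ensembling representations rather than learners, can be sketched in a few lines. The sketch below is an illustrative reconstruction, not the authors' ProgLearn implementation: each task contributes a transformer (here, a random forest whose leaves partition the input space), and a decision for task t averages votes from every task's partition, with per-leaf class posteriors estimated from task-t labels. All names here (`ProgressiveEnsemble`, `add_task`) are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class ProgressiveEnsemble:
    """Illustrative sketch of ensembling representations across tasks.

    Each task learns a transformer: a random forest whose leaves define
    a partition of the input space. To decide for task t, we push
    task-t data through *every* task's forest, estimate per-leaf class
    posteriors from task-t labels, and average the votes. New tasks
    thus add representations that past tasks can also vote through
    (backward transfer), and old representations serve new tasks
    (forward transfer).
    """

    def __init__(self, n_estimators=10):
        self.n_estimators = n_estimators
        self.forests = []    # one transformer (forest) per task
        self.task_data = []  # (X, y) per task, used to fit voters

    def add_task(self, X, y):
        forest = RandomForestClassifier(
            n_estimators=self.n_estimators
        ).fit(X, y)
        self.forests.append(forest)
        self.task_data.append((np.asarray(X), np.asarray(y)))

    def predict(self, X, task_id):
        X = np.asarray(X)
        X_train, y_train = self.task_data[task_id]
        classes = np.unique(y_train)
        votes = np.zeros((len(X), len(classes)))
        # Every task's representation votes on this task's inputs.
        for forest in self.forests:
            train_leaves = forest.apply(X_train)  # (n_train, n_trees)
            test_leaves = forest.apply(X)         # (n_test, n_trees)
            for tree in range(train_leaves.shape[1]):
                for i, leaf in enumerate(test_leaves[:, tree]):
                    mask = train_leaves[:, tree] == leaf
                    if mask.any():
                        for j, c in enumerate(classes):
                            votes[i, j] += np.mean(y_train[mask] == c)
        return classes[votes.argmax(axis=1)]
```

The key design point, matching the abstract, is that `predict` for a given task never relies only on that task's own forest: all stored representations contribute votes, so adding a new task can improve decisions on old ones rather than merely avoiding forgetting.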




Code Repositories

ProgLearn

NeuroData's package for exploring and using progressive learning algorithms

