Incremental Sequence Learning

11/09/2016
by Edwin D. de Jong, et al.

Deep learning research over the past years has shown that by increasing the scope or difficulty of the learning problem over time, increasingly complex learning problems can be addressed. We study incremental learning in the context of sequence learning, using generative RNNs in the form of multi-layer recurrent Mixture Density Networks. While the potential of incremental or curriculum learning to enhance learning is known, indiscriminate application of the principle does not necessarily lead to improvement, and it is therefore essential to know which forms of incremental or curriculum learning have a positive effect. This research contributes to that aim by comparing three instantiations of incremental or curriculum learning. We introduce Incremental Sequence Learning, a simple incremental approach to sequence learning. Incremental Sequence Learning starts out by using only the first few steps of each sequence as training data. Each time a performance criterion has been reached, the length of the parts of the sequences used for training is increased. We introduce and make available a novel sequence learning task and data set: predicting and classifying MNIST pen stroke sequences. We find that Incremental Sequence Learning greatly speeds up sequence learning, reaches the best test performance level of regular sequence learning 20 times faster, reduces the test error by 74%, and achieves sustained progress after all three comparison methods have stopped improving. The other instantiations of curriculum learning do not result in any noticeable improvement. A trained sequence prediction model is also used in transfer learning to the task of sequence classification, where it is found that transfer learning realizes improved classification performance compared to methods that learn to classify from scratch.
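The curriculum described above can be sketched as a simple training loop: truncate every sequence to a current maximum length, train at that length, and grow the length each time the performance criterion is met. The sketch below is illustrative only; the names `initial_len`, `growth`, and `threshold`, and the pluggable `train_and_eval` callback, are assumptions for exposition and not parameters from the paper.

```python
def truncate(sequences, max_len):
    """Keep only the first `max_len` steps of each training sequence."""
    return [seq[:max_len] for seq in sequences]


def incremental_sequence_learning(sequences, train_and_eval,
                                  initial_len=2, growth=2, threshold=0.1):
    """Illustrative Incremental Sequence Learning schedule (assumed names).

    `train_and_eval(data)` stands in for one round of training on the
    truncated sequences and must return a validation error. Whenever that
    error drops below `threshold` (the performance criterion), the usable
    sequence length grows by `growth` steps.
    """
    max_len = initial_len
    full_len = max(len(s) for s in sequences)
    schedule = []  # record (length, error) pairs for inspection
    while max_len <= full_len:
        error = train_and_eval(truncate(sequences, max_len))
        schedule.append((max_len, error))
        if error < threshold:   # criterion reached: extend the sequences
            max_len += growth
        # otherwise keep training at the current length (loop again)
    return schedule


# Toy usage: a mock trainer whose error shrinks each call, so the
# curriculum advances from length 2 up to the full length of 8.
errors = iter([0.5, 0.05, 0.04, 0.03, 0.02])
schedule = incremental_sequence_learning(
    [[0] * 8, [0] * 6], train_and_eval=lambda data: next(errors))
```

In a real setting `train_and_eval` would run gradient steps on the recurrent Mixture Density Network and report held-out error; the point of the sketch is only the length-growing schedule itself.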

