Continuous Learning in a Hierarchical Multiscale Neural Network

05/15/2018
by Thomas Wolf, et al.

We reformulate the problem of encoding a multi-scale representation of a sequence in a language model by casting it in a continuous learning framework. We propose a hierarchical multi-scale language model in which short time-scale dependencies are encoded in the hidden state of a lower-level recurrent neural network, while longer time-scale dependencies are encoded in the dynamics of the lower-level network by having a meta-learner update the weights of the lower-level network in an online meta-learning fashion. We use elastic weight consolidation as a higher-level mechanism to prevent catastrophic forgetting in our continuous learning framework.
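The elastic weight consolidation (EWC) mechanism mentioned in the abstract penalizes movement of weights that were important for earlier data, with per-weight importance estimated from the Fisher information. A minimal sketch of that penalty, with illustrative names (`theta_star`, `fisher`, `lam`) that are not from the paper:

```python
# Hedged sketch of an EWC-style penalty: anchor the current weights
# `theta` to the weights `theta_star` learned on earlier data, scaled
# per-weight by a Fisher-information estimate `fisher`.
# All names here are illustrative, not taken from the paper.

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Quadratic EWC penalty: 0.5 * lam * sum_i F_i * (theta_i - theta*_i)^2."""
    return 0.5 * lam * sum(
        f * (t - ts) ** 2 for t, ts, f in zip(theta, theta_star, fisher)
    )

def total_loss(task_loss, theta, theta_star, fisher, lam=1.0):
    # A higher-level learner would minimize task loss plus this penalty,
    # so weights with large Fisher values (important for past data) move less.
    return task_loss + ewc_penalty(theta, theta_star, fisher, lam)
```

In this sketch, weights unimportant for earlier data (small `fisher` entries) remain free to track new short time-scale structure, while important ones are held near their consolidated values.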


Related research

- Meta-Learning a Dynamical Language Model (03/28/2018)
  We consider the task of word-level language modeling and study the possi...
- Bayesian Online Meta-Learning with Laplace Approximation (04/30/2020)
  Neural networks are known to suffer from catastrophic forgetting when tr...
- Hierarchical Reinforcement Learning for Deep Goal Reasoning: An Expressiveness Analysis (06/21/2020)
  Hierarchical DQN (h-DQN) is a two-level architecture of feedforward neur...
- Frosting Weights for Better Continual Training (01/07/2020)
  Training a neural network model can be a lifelong learning process and i...
- Online Continual Learning via the Meta-learning Update with Multi-scale Knowledge Distillation and Data Augmentation (09/12/2022)
  Continual learning aims to rapidly and continually learn the current tas...
- Incremental Training of a Recurrent Neural Network Exploiting a Multi-Scale Dynamic Memory (06/29/2020)
  The effectiveness of recurrent neural networks can be largely influenced...
- Selecting Informative Contexts Improves Language Model Finetuning (05/01/2020)
  We present a general finetuning meta-method that we call information gai...
