Task Agnostic Continual Learning via Meta Learning

06/12/2019
by Xu He, et al.

While neural networks are powerful function approximators, they suffer from catastrophic forgetting when the data distribution is not stationary. One formalism that studies learning under non-stationary distributions is continual learning, where the non-stationarity is imposed by a sequence of distinct tasks. Most methods in this space, however, assume knowledge of task boundaries and focus on alleviating catastrophic forgetting. In this work, we depart from this view and shift the focus towards faster remembering, i.e., measuring how quickly the network recovers performance rather than measuring its performance without any adaptation. We argue that in many settings this can be more effective, and that it opens the door to combining meta-learning and continual learning techniques, leveraging their complementary advantages. We propose a framework specific to the scenario where no information about task boundaries or task identity is given. It relies on a separation of concerns between what task is being solved and how the task should be solved. This framework is implemented by distinguishing task-specific parameters from task-agnostic parameters, where the latter are optimized in a continual meta-learning fashion, without access to multiple tasks at the same time. We showcase this framework in a supervised learning scenario and discuss the implications of the proposed formalism.
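
The abstract only sketches the mechanism at a high level. The snippet below is a minimal, first-order illustration (in PyTorch) of the separation it describes: a fast inner loop adapts task-specific parameters (a classifier head) on the incoming stream without any task-boundary signal, while a slow outer update trains the task-agnostic parameters (a shared encoder) to make that recovery quick. All names, hyperparameters, and the toy non-stationary stream are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only: task-agnostic parameters are meta-learned slowly,
# task-specific parameters are adapted quickly on a non-stationary stream,
# with no task-boundary or task-identity information.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Model(nn.Module):
    def __init__(self, in_dim=32, hidden=64, out_dim=10):
        super().__init__()
        # Task-agnostic ("how to solve") parameters, updated by the outer loop.
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        # Task-specific ("what is being solved") parameters, adapted quickly.
        self.head = nn.Linear(hidden, out_dim)

    def forward(self, x):
        return self.head(self.encoder(x))


def continual_meta_step(model, x, y, inner_lr=0.1, inner_steps=3, outer_lr=1e-3):
    """One update on a single batch drawn from the (possibly shifting) stream."""
    # Inner loop: fast adaptation of the task-specific head only.
    inner_opt = torch.optim.SGD(model.head.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        inner_opt.zero_grad()
        F.cross_entropy(model(x), y).backward()
        inner_opt.step()

    # Outer loop (first-order approximation): update the task-agnostic encoder
    # so that fast adaptation recovers performance quickly after a shift.
    outer_opt = torch.optim.SGD(model.encoder.parameters(), lr=outer_lr)
    outer_opt.zero_grad()
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    outer_opt.step()
    return loss.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = Model()
    # Toy non-stationary stream: the label mapping changes without warning.
    for step in range(200):
        x = torch.randn(16, 32)
        shift = 0 if step < 100 else 5          # unannounced "task" change
        y = (x[:, 0] > 0).long() + shift
        loss = continual_meta_step(model, x, y)
        if step % 50 == 0:
            print(f"step {step:3d}  loss {loss:.3f}")
```

In this sketch the outer update is taken after the inner adaptation rather than through it (no second-order gradients), which keeps the example short; the point is only the division of parameters into a fast, task-specific part and a slow, task-agnostic part.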


Related research

04/08/2022
Learning to modulate random weights can induce task-specific contexts for economical meta and continual learning
Neural networks are vulnerable to catastrophic forgetting when data dist...

03/12/2020
Online Fast Adaptation and Knowledge Accumulation: a New Approach to Continual Learning
Learning from non-stationary data remains a great challenge for machine ...

10/01/2020
Task Agnostic Continual Learning Using Online Variational Bayes with Fixed-Point Updates
Background: Catastrophic forgetting is the notorious vulnerability of ne...

11/12/2019
Learning from the Past: Continual Meta-Learning via Bayesian Graph Modeling
Meta-learning for few-shot learning allows a machine to leverage previou...

03/01/2021
Posterior Meta-Replay for Continual Learning
Continual Learning (CL) algorithms have recently received a lot of atten...

03/06/2021
Learning to Continually Learn Rapidly from Few and Noisy Data
Neural networks suffer from catastrophic forgetting and are unable to se...

11/25/2020
Continual learning with direction-constrained optimization
This paper studies a new design of the optimization algorithm for traini...
