Learning where to learn: Gradient sparsity in meta and continual learning

10/27/2021
by Johannes von Oswald, et al.

Finding neural network weights that generalize well from small datasets is difficult. A promising approach is to learn a weight initialization such that a small number of weight changes results in low generalization error. We show that this form of meta-learning can be improved by letting the learning algorithm decide which weights to change, i.e., by learning where to learn. We find that patterned sparsity emerges from this process, with the pattern of sparsity varying on a problem-by-problem basis. This selective sparsity results in better generalization and less interference in a range of few-shot and continual learning problems. Moreover, we find that sparse learning also emerges in a more expressive model where learning rates are meta-learned. Our results shed light on an ongoing debate on whether meta-learning can discover adaptable features and suggest that learning by sparse gradient descent is a powerful inductive bias for meta-learning systems.
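
Below is a minimal sketch, in JAX, of the idea the abstract describes: meta-learning both a weight initialization and per-parameter learning rates, so that the outer loop can drive individual rates to zero and thereby decide which weights the inner loop is allowed to change. The toy linear model, the ReLU clipping of the rates, and all names (task_loss, inner_update, meta_loss, alpha) are illustrative assumptions, not the authors' implementation.

import jax
import jax.numpy as jnp

def task_loss(theta, x, y):
    # Toy few-shot regression loss for a linear model (assumption).
    return jnp.mean((x @ theta - y) ** 2)

def inner_update(theta, alpha, x, y):
    # One inner-loop step. Each weight has its own meta-learned rate;
    # rates that the outer loop pushes to (or below) zero are clipped
    # by the ReLU and freeze their weight, which is one way sparse
    # adaptation ("learning where to learn") can arise.
    grads = jax.grad(task_loss)(theta, x, y)
    return theta - jax.nn.relu(alpha) * grads

def meta_loss(params, support, query):
    # Outer objective: loss on the query set after adapting on the
    # support set. The meta-gradient flows into both the
    # initialization theta and the learning rates alpha.
    theta, alpha = params
    theta_adapted = inner_update(theta, alpha, *support)
    return task_loss(theta_adapted, *query)

# Example meta-gradient step on one synthetic task.
key = jax.random.PRNGKey(0)
d = 8
theta, alpha = jnp.zeros(d), jnp.full(d, 0.1)
x = jax.random.normal(key, (10, d))
y = x @ jnp.ones(d)
g_theta, g_alpha = jax.grad(meta_loss)((theta, alpha), (x, y), (x, y))
theta, alpha = theta - 0.01 * g_theta, alpha - 0.01 * g_alpha

Reusing the same data for support and query above only keeps the example short; in practice the two sets differ, and it is the query-set performance after adaptation that determines which learning rates are worth keeping non-zero.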
