Meta-Learning with Network Pruning

07/07/2020
by Hongduan Tian, et al.

Meta-learning is a powerful paradigm for few-shot learning. Despite its remarkable success in many applications, existing optimization-based meta-learning models with over-parameterized neural networks have been shown to overfit on training tasks. To remedy this deficiency, we propose a network pruning based meta-learning approach that reduces overfitting by explicitly controlling network capacity. A uniform concentration analysis reveals the benefit of the capacity constraint for narrowing the generalization gap of the proposed meta-learner. We implement our approach on top of Reptile, combined with two network pruning routines: Dense-Sparse-Dense (DSD) and Iterative Hard Thresholding (IHT). Extensive experimental results on benchmark datasets with different over-parameterized deep networks demonstrate that our method not only effectively alleviates meta-overfitting but also, in many cases, improves overall generalization performance on few-shot classification tasks.
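The abstract does not include code, but the combination it describes, a Reptile-style outer update interleaved with a hard-thresholding pruning step, can be illustrated with a short sketch. The snippet below is a minimal illustration assuming PyTorch; the function names (`hard_threshold`, `reptile_with_iht`, `sample_task`) and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of a Reptile-style outer loop
# interleaved with Iterative Hard Thresholding (IHT) pruning.
# All names and hyperparameters here are illustrative assumptions.
import copy
import torch


def hard_threshold(model, keep_ratio=0.5):
    """Zero out all but the largest-magnitude weights in each tensor (IHT step)."""
    for p in model.parameters():
        flat = p.detach().abs().flatten()
        k = max(1, int(keep_ratio * flat.numel()))
        threshold = torch.topk(flat, k).values.min()
        p.data.mul_((p.detach().abs() >= threshold).float())


def reptile_with_iht(model, sample_task, inner_steps=5, inner_lr=1e-2,
                     meta_lr=0.1, meta_iters=1000, keep_ratio=0.5):
    """Reptile outer loop with periodic IHT pruning to cap network capacity."""
    for it in range(meta_iters):
        task_loader = sample_task()        # assumed: returns batches (x, y) for one task
        fast_model = copy.deepcopy(model)  # task-specific copy of the meta-parameters
        opt = torch.optim.SGD(fast_model.parameters(), lr=inner_lr)

        # Inner loop: a few SGD steps on the sampled task.
        for _ in range(inner_steps):
            x, y = next(iter(task_loader))
            loss = torch.nn.functional.cross_entropy(fast_model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()

        # Reptile outer update: move meta-parameters toward the adapted ones.
        with torch.no_grad():
            for p, q in zip(model.parameters(), fast_model.parameters()):
                p.add_(meta_lr * (q - p))

        # Periodically enforce the sparsity constraint (hard-thresholding routine).
        if it % 10 == 0:
            hard_threshold(model, keep_ratio)
    return model
```

The pruning frequency and per-tensor `keep_ratio` are design choices in this sketch; the paper's actual DSD and IHT schedules may differ.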


