Pruning Convolutional Neural Networks with Self-Supervision

01/10/2020
by Mathilde Caron, et al.

Convolutional neural networks trained without supervision now come close to matching the performance of supervised pre-training, but sometimes at the cost of an even larger number of parameters. Extracting subnetworks from these large unsupervised convnets while preserving performance is therefore of particular interest for making them less computationally intensive. Typical pruning methods operate during training on a given task and try to maintain the performance of the pruned network on that same task. In self-supervised feature learning, however, the training objective is agnostic to how well the learned representations transfer to downstream tasks. Preserving performance on this objective therefore does not guarantee that the pruned subnetwork remains effective for solving downstream tasks. In this work, we investigate the use of standard pruning methods, developed primarily for supervised learning, on networks trained without labels (i.e., on self-supervised tasks). We show that pruning masks obtained with or without labels reach comparable performance when re-trained on labels, suggesting that pruning operates similarly for self-supervised and supervised learning. Interestingly, we also find that pruning preserves the transfer performance of self-supervised subnetwork representations.
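
For concreteness, below is a minimal sketch of one standard pruning method this line of work builds on: global magnitude pruning, which zeroes the smallest-magnitude convolutional weights and keeps the resulting binary masks as the definition of the subnetwork. The PyTorch backbone and sparsity level are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch (assuming PyTorch/torchvision): global magnitude pruning
# of a convnet, producing binary masks that define the pruned subnetwork.
# The backbone and sparsity level are placeholders, not the paper's setup.
import torch
import torchvision.models as models

model = models.resnet18()   # stand-in for a self-supervised backbone
sparsity = 0.9              # fraction of conv weights to prune (assumed)

# Gather every convolutional weight tensor in the network.
conv_weights = [m.weight.data for m in model.modules()
                if isinstance(m, torch.nn.Conv2d)]

# Find the global magnitude threshold: the k-th smallest |w| overall.
all_magnitudes = torch.cat([w.abs().flatten() for w in conv_weights])
k = int(sparsity * all_magnitudes.numel())
threshold = all_magnitudes.kthvalue(k).values

# Build binary masks (1 = keep, 0 = prune) and apply them in place.
masks = [(w.abs() > threshold).float() for w in conv_weights]
for w, mask in zip(conv_weights, masks):
    w.mul_(mask)

kept = sum(m.sum().item() for m in masks)
total = sum(m.numel() for m in masks)
print(f"kept {kept / total:.1%} of conv weights")
```

In the evaluation the paper describes, masks like these, obtained from networks trained with or without labels, are then re-trained with supervision to compare the resulting subnetworks.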

Related research

07/25/2022 - Dynamic Channel Selection in Self-Supervised Learning
Whilst computer vision models built using self-supervised approaches are...

06/28/2019 - Using Self-Supervised Learning Can Improve Model Robustness and Uncertainty
Self-supervision provides effective representations for downstream tasks...

11/17/2022 - Compressing Transformer-based self-supervised models for speech processing
Despite the success of Transformers in self-supervised learning with app...

10/30/2022 - DyG2Vec: Representation Learning for Dynamic Graphs with Self-Supervision
The challenge in learning from dynamic graphs for predictive tasks lies ...

03/31/2022 - PADA: Pruning Assisted Domain Adaptation for Self-Supervised Speech Representations
While self-supervised speech representation learning (SSL) models serve ...

12/03/2021 - Self-Supervised Material and Texture Representation Learning for Remote Sensing Tasks
Self-supervised learning aims to learn image feature representations wit...

02/23/2022 - Reconstruction Task Finds Universal Winning Tickets
Pruning well-trained neural networks is effective to achieve a promising...
