SWAMP: Sparse Weight Averaging with Multiple Particles for Iterative Magnitude Pruning

05/24/2023
by Moonseok Choi, et al.

Given the ever-increasing size of modern neural networks, the significance of sparse architectures has surged because of their accelerated inference speeds and minimal memory demands. Among global pruning techniques, Iterative Magnitude Pruning (IMP) remains a state-of-the-art algorithm despite its simplicity, particularly in extremely sparse regimes. In light of the recent finding that two successive matching IMP solutions are linearly connected without a loss barrier, we propose Sparse Weight Averaging with Multiple Particles (SWAMP), a straightforward modification of IMP that achieves performance comparable to an ensemble of two IMP solutions. During each iteration, we concurrently train multiple sparse models, referred to as particles, using different batch orders yet the same matching ticket, and then weight-average these models to produce a single mask. Through extensive experiments on various datasets and network architectures, we demonstrate that our method consistently outperforms existing baselines across a range of sparsities.
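The loop the abstract describes (train several particles from one matching ticket, weight-average, prune) is simple enough to sketch. Below is a minimal PyTorch sketch of a single SWAMP iteration, reconstructed only from the description above; the helper `train_one_particle`, the particle count, and the pruning fraction are illustrative assumptions, not the authors' reference implementation.

```python
import copy
import torch


def magnitude_mask(state_dict, prune_frac):
    """Global magnitude pruning: zero out the smallest-|w| fraction of weights."""
    scores = torch.cat([p.abs().flatten()
                        for name, p in state_dict.items()
                        if name.endswith("weight")])
    k = int(prune_frac * scores.numel())
    threshold = torch.kthvalue(scores, k).values if k > 0 else scores.new_tensor(-1.0)
    return {name: (p.abs() > threshold).float() if name.endswith("weight")
            else torch.ones_like(p)
            for name, p in state_dict.items()}


def apply_mask(model, mask):
    """Zero the pruned entries of every masked parameter in place."""
    with torch.no_grad():
        for name, param in model.named_parameters():
            if name in mask:
                param.mul_(mask[name])


def swamp_iteration(ticket, mask, train_one_particle, n_particles=4, prune_frac=0.2):
    """One SWAMP round: train `n_particles` sparse models from the same matching
    ticket with different batch orders, weight-average them, then derive the
    next (sparser) magnitude mask from the averaged weights."""
    particle_states = []
    for seed in range(n_particles):
        particle = copy.deepcopy(ticket)         # identical start: the matching ticket
        apply_mask(particle, mask)               # respect the current sparsity pattern
        train_one_particle(particle, seed=seed)  # seed changes only the batch order
        particle_states.append(particle.state_dict())

    # Weight averaging: a simple per-parameter mean across particles.
    averaged = {name: torch.stack([s[name].float() for s in particle_states]).mean(0)
                for name in particle_states[0]}

    return averaged, magnitude_mask(averaged, prune_frac)
```

Averaging before pruning means the next mask reflects the consensus of all particles rather than any single training run; following the usual IMP recipe, the surviving weights would then be rewound to the matching ticket and the cycle repeated at a higher sparsity.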

Related research:

- Sparse Model Soups: A Recipe for Improved Pruning via Model Averaging (06/29/2023)
- Lottery Jackpots Exist in Pre-trained Models (04/18/2021)
- A Deeper Look at the Layerwise Sparsity of Magnitude-based Pruning (10/15/2020)
- Unmasking the Lottery Ticket Hypothesis: What's Encoded in a Winning Ticket's Mask? (10/06/2022)
- Network Pruning Spaces (04/19/2023)
- Lottery Pools: Winning More by Interpolating Tickets without Increasing Training or Inference Cost (08/23/2022)
- Randomly Initialized Subnetworks with Iterative Weight Recycling (03/28/2023)
