Structured Pruning for Multi-Task Deep Neural Networks

04/13/2023
by Siddhant Garg, et al.

Although multi-task deep neural network (DNN) models have computation and storage benefits over individual single-task DNN models, they can be further optimized via model compression. Numerous structured pruning methods have already been developed that readily achieve speedups in single-task models, but pruning of multi-task networks has not yet been extensively studied. In this work, we investigate the effectiveness of structured pruning on multi-task models. We use an existing single-task filter pruning criterion and also introduce an MTL-based filter pruning criterion for estimating filter importance scores, and we prune the model with an iterative pruning strategy under both criteria. We show that, with careful hyper-parameter tuning, architectures obtained from different pruning methods do not differ significantly in performance across tasks when their parameter counts are similar. We also show that iterative structured pruning may not be the best way to obtain a well-performing pruned model, because at extreme pruning levels performance drops sharply across all tasks; however, when the same pruned architectures are randomly re-initialized and trained from scratch, they achieve better results.
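As an illustration of the kind of structured (filter) pruning studied here, the sketch below shows one pruning round on a toy multi-task network using an L1-norm filter importance score, a common single-task criterion. The network, task heads, pruning ratio, and the use of soft masking (zeroing filters rather than removing them) are illustrative assumptions, not the authors' implementation or their MTL-based criterion.

# Illustrative sketch only: one round of magnitude-based filter scoring and
# masking on a small multi-task CNN with a shared trunk and two task heads.
import torch
import torch.nn as nn

class TinyMultiTaskNet(nn.Module):
    """Shared convolutional trunk with two task-specific heads (toy example)."""
    def __init__(self, num_classes_a=10, num_classes_b=5):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head_a = nn.Linear(64, num_classes_a)
        self.head_b = nn.Linear(64, num_classes_b)

    def forward(self, x):
        feat = self.trunk(x).flatten(1)
        return self.head_a(feat), self.head_b(feat)

def filter_l1_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Importance score per output filter: L1 norm of that filter's weights."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

@torch.no_grad()
def mask_lowest_filters(conv: nn.Conv2d, prune_ratio: float) -> None:
    """Zero out the lowest-scoring fraction of filters (soft structured pruning)."""
    scores = filter_l1_scores(conv)
    n_prune = int(prune_ratio * scores.numel())
    if n_prune == 0:
        return
    idx = torch.argsort(scores)[:n_prune]
    conv.weight[idx] = 0.0
    if conv.bias is not None:
        conv.bias[idx] = 0.0

# One pruning iteration over the shared trunk; an iterative schedule would
# alternate such pruning rounds with fine-tuning on all task losses.
model = TinyMultiTaskNet()
for module in model.trunk:
    if isinstance(module, nn.Conv2d):
        mask_lowest_filters(module, prune_ratio=0.25)

out_a, out_b = model(torch.randn(2, 3, 32, 32))
print(out_a.shape, out_b.shape)  # torch.Size([2, 10]) torch.Size([2, 5])

In an MTL-based criterion, the per-filter scores above would instead be estimated from each task's loss or gradients so that filters important to any task are retained; the L1-norm score is used here only because it is a standard single-task baseline.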


