Channel Pruning in Quantization-Aware Training: An Adaptive Projection-Gradient Descent-Shrinkage-Splitting Method

04/09/2022
by Zhijian Li, et al.

We propose an adaptive projection-gradient descent-shrinkage-splitting method (APGDSSM) to integrate penalty-based channel pruning into quantization-aware training (QAT). APGDSSM concurrently searches for weights in both the quantized subspace and the sparse subspace. It uses a shrinkage operator and a splitting technique to create sparse weights, together with the Group Lasso penalty to push unstructured weight sparsity into channel sparsity. In addition, we propose a novel complementary transformed ℓ1 penalty to stabilize training under extreme compression.
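The abstract names the method's building blocks but the page carries no implementation details. Below is a minimal NumPy sketch of those blocks under stated assumptions: soft_threshold is the standard shrinkage operator, group_shrink_channels is the Group Lasso proximal step that zeroes whole channels, project_quantized projects onto a uniform symmetric grid (a common QAT choice, not necessarily the paper's quantizer), and transformed_l1 uses the usual (a+1)|w|/(a+|w|) form of the transformed ℓ1 penalty; the paper's "complementary" variant and its splitting term (coupling a float weight to its sparse/quantized copy) are not specified here and are omitted. All function names and the toy update at the end are illustrative, not the authors' code.

```python
import numpy as np

def soft_threshold(w, lam):
    # Shrinkage (soft-thresholding) operator: prox of lam * ||w||_1.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def group_shrink_channels(W, lam):
    # Group Lasso proximal step over output channels of a conv weight
    # tensor W of shape (out_channels, in_channels, kh, kw). Channels
    # whose l2 norm falls below lam are zeroed entirely, which is how
    # unstructured weight sparsity becomes channel sparsity.
    out = np.empty_like(W)
    for c in range(W.shape[0]):
        norm = np.linalg.norm(W[c])
        scale = max(1.0 - lam / norm, 0.0) if norm > 0 else 0.0
        out[c] = scale * W[c]
    return out

def project_quantized(w, bits=4):
    # Projection onto a uniform symmetric quantization grid; the paper's
    # exact quantizer is not specified on this page (assumption).
    levels = 2 ** (bits - 1) - 1
    m = np.max(np.abs(w))
    scale = m / levels if m > 0 else 1.0
    return np.round(w / scale).clip(-levels, levels) * scale

def transformed_l1(w, a=1.0):
    # Transformed l1 penalty in its usual form (a + 1)|w| / (a + |w|),
    # interpolating between l0-like (small a) and l1-like (large a) behavior.
    aw = np.abs(w)
    return float(np.sum((a + 1.0) * aw / (a + aw)))

# Toy cycle: gradient step, shrinkage, group shrinkage, projection.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4, 3, 3))
grad = rng.normal(size=W.shape)   # stand-in for a loss gradient
W = W - 0.1 * grad                # plain gradient descent step
W_sparse = group_shrink_channels(soft_threshold(W, 0.05), 0.5)
W_quant = project_quantized(W_sparse, bits=4)
print("zeroed channels:", int(np.sum(np.all(W_sparse == 0, axis=(1, 2, 3)))))
print("TL1 penalty:", round(transformed_l1(W_quant), 3))
```

The ordering above (descend, shrink elementwise, shrink per group, project) is one plausible reading of "projection-gradient descent-shrinkage-splitting"; the paper may interleave these steps differently.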

Related research

07/23/2021 · Pruning Ternary Quantization
We propose pruning ternary quantization (PTQ), a simple, yet effective, ...

08/24/2020 · Hierarchical Adaptive Lasso: Learning Sparse Neural Networks with Shrinkage via Single Stage Training
Deep neural networks achieve state-of-the-art performance in a variety o...

10/16/2018 · Study of Sparsity-Aware Subband Adaptive Filtering Algorithms with Adjustable Penalties
We propose two sparsity-aware normalized subband adaptive filter (NSAF) ...

07/22/2022 · Quantized Sparse Weight Decomposition for Neural Network Compression
In this paper, we introduce a novel method of neural network weight comp...

09/12/2019 · A Channel-Pruned and Weight-Binarized Convolutional Neural Network for Keyword Spotting
We study channel number reduction in combination with weight binarizatio...

02/12/2023 · Removing splitting/modeling error in projection/penalty methods for Navier-Stokes simulations with continuous data assimilation
We study continuous data assimilation (CDA) applied to projection and pe...

05/22/2020 · Position-based Scaled Gradient for Model Quantization and Sparse Training
We propose the position-based scaled gradient (PSG) that scales the grad...
