Deep Model Compression via Filter Auto-sampling

07/12/2019
by Daquan Zhou et al.

The recent WSNet [1] is a new model compression method that samples filter weights from a compact set and has been demonstrated to be effective for 1D convolutional neural networks (CNNs). However, the weight sampling strategy of WSNet is handcrafted and fixed, which may severely limit the expressive ability of the resulting CNNs and weaken its compression ability. In this work, we present a novel auto-sampling method that is applicable to both 1D and 2D CNNs, with significant performance improvement over WSNet. Specifically, our proposed auto-sampling method learns the sampling rules end-to-end instead of fixing them independently of the network architecture design. With such differentiable weight sampling rule learning, the sampling stride and channel selection from the compact set are optimized to achieve a better trade-off between model compression rate and performance. We demonstrate that at the same compression ratio, our method outperforms WSNet by 6.5%. Moreover, our method outperforms the MobileNetV2 full model by 1.47% with 25% fewer FLOPs; our method even outperforms some neural architecture search (NAS) based methods such as AMC [2] and MNasNet [3].
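The core idea of sampling filter weights from a shared compact set can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's implementation: the function name and the circular-indexing scheme are assumptions. It shows how a fixed sampling stride slides a window over one compact weight vector to generate an entire 1D filter bank; the auto-sampling method described above would instead learn the stride and channel selection end-to-end.

```python
import numpy as np

def sample_filters(compact_weights, num_filters, filter_len, stride):
    """Build a 1D filter bank by sliding a window of length `filter_len`
    over a shared compact weight vector, advancing `stride` positions
    per filter (circular indexing keeps every window fully defined).

    Illustrative sketch of WSNet-style weight sampling; the stride here
    is fixed by hand, whereas auto-sampling would learn it.
    """
    n = len(compact_weights)
    filters = np.empty((num_filters, filter_len))
    for i in range(num_filters):
        # Overlapping windows let filters share weights with neighbors.
        idx = (i * stride + np.arange(filter_len)) % n
        filters[i] = compact_weights[idx]
    return filters

# A compact set of 8 weights yields 4 filters of length 3:
compact = np.arange(8, dtype=float)
bank = sample_filters(compact, num_filters=4, filter_len=3, stride=2)
# bank[1] starts at index 2 of the compact set: [2., 3., 4.]
```

Independent filters of this shape would need 4 × 3 = 12 parameters, while the shared compact set stores only 8; larger strides reduce weight overlap (more expressive, less compression), which is exactly the trade-off the learned sampling rule is meant to optimize.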


