MicroNet: Improving Image Recognition with Extremely Low FLOPs

08/12/2021
by Yunsheng Li et al.

This paper addresses the problem of substantial performance degradation under extremely low computational budgets (e.g., 5M FLOPs on ImageNet classification). We find that two factors, sparse connectivity and dynamic activation functions, are effective in improving accuracy. The former avoids a significant reduction of network width, while the latter mitigates the detriment of reduced network depth. Technically, we propose micro-factorized convolution, which factorizes a convolution matrix into low-rank matrices, to integrate sparse connectivity into convolution. We also present a new dynamic activation function, named Dynamic Shift Max, which improves non-linearity by maxing out multiple dynamic fusions between an input feature map and its circular channel shift. Building upon these two new operators, we arrive at a family of networks, named MicroNet, that achieves significant performance gains over the state of the art in the low-FLOP regime. For instance, under the constraint of 12M FLOPs, MicroNet achieves 59.4% top-1 accuracy on ImageNet classification, outperforming MobileNetV3 by 9.6%. Source code is at https://github.com/liyunsheng13/micronet.
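The abstract describes the two operators only at a high level. The sketch below illustrates one plausible reading of each in PyTorch; the class names, the squeeze-and-excitation-style coefficient generator, the channel-shuffle step, and all hyper-parameters (groups, branches, rank, reduction) are illustrative assumptions, not the authors' implementation, which is available in the linked repository.

```python
import torch
import torch.nn as nn


class MicroFactorizedPointwise(nn.Module):
    """Sketch of a micro-factorized pointwise convolution (hypothetical simplification).

    Replaces one dense C_in x C_out pointwise convolution with two group-wise,
    low-rank pointwise convolutions (C_in -> rank -> C_out) and a channel shuffle
    in between, so connectivity stays sparse while channels still mix across groups.
    Assumes in_channels, rank, and out_channels are divisible by `groups`.
    """

    def __init__(self, in_channels, out_channels, rank, groups):
        super().__init__()
        self.groups = groups
        self.reduce = nn.Conv2d(in_channels, rank, kernel_size=1, groups=groups, bias=False)
        self.expand = nn.Conv2d(rank, out_channels, kernel_size=1, groups=groups, bias=False)

    def forward(self, x):
        x = self.reduce(x)
        # channel shuffle so the second group-wise conv sees every group of the first
        n, r, h, w = x.shape
        x = x.view(n, self.groups, r // self.groups, h, w).transpose(1, 2).reshape(n, r, h, w)
        return self.expand(x)


class DynamicShiftMax(nn.Module):
    """Sketch of a Dynamic Shift Max style activation (hypothetical simplification).

    Fuses the input with a circular shift of its channels using input-dependent
    coefficients, then takes the element-wise max over the fused branches.
    """

    def __init__(self, channels, groups=4, branches=2, reduction=4):
        super().__init__()
        self.shift = channels // groups      # circular shift step of one channel group
        self.branches = branches
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze step for the coefficient generator
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels * branches * 2),
            nn.Sigmoid(),
        )

    def forward(self, x):
        n, c, h, w = x.shape
        # input-dependent fusion coefficients: one (a, b) pair per branch per channel
        coeff = self.fc(self.pool(x).flatten(1)).view(n, self.branches, 2, c, 1, 1)
        # circular shift of the channel dimension by one group
        x_shift = torch.roll(x, shifts=self.shift, dims=1)
        # each branch is a dynamic fusion of the input and its shifted copy
        fused = [coeff[:, k, 0] * x + coeff[:, k, 1] * x_shift
                 for k in range(self.branches)]
        # max out over the fused branches
        return torch.stack(fused, dim=0).max(dim=0).values
```

As a usage illustration, `DynamicShiftMax(channels=32, groups=4, branches=2)` could stand in for a ReLU after a micro-factorized block; the group and branch counts here are placeholders rather than the paper's settings.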


Related research

11/24/2020
MicroNet: Towards Image Recognition with Extremely Low FLOPs
In this paper, we present MicroNet, which is an efficient convolutional ...

01/13/2023
Efficient Activation Function Optimization through Surrogate Modeling
Carefully designed activation functions can improve the performance of n...

03/15/2021
Revisiting Dynamic Convolution via Matrix Decomposition
Recent research in dynamic convolution shows substantial performance boo...

06/20/2020
Pyramidal Convolution: Rethinking Convolutional Neural Networks for Visual Recognition
This work introduces pyramidal convolution (PyConv), which is capable of...

03/11/2022
QDrop: Randomly Dropping Quantization for Extremely Low-bit Post-Training Quantization
Recently, post-training quantization (PTQ) has driven much attention to ...

08/17/2022
Restructurable Activation Networks
Is it possible to restructure the non-linear activation functions in a d...

03/30/2023
Invertible Convolution with Symmetric Paddings
We show that symmetrically padded convolution can be analytically invert...
