GhostNet: More Features from Cheap Operations

11/27/2019
by Kai Han, et al.

Deploying convolutional neural networks (CNNs) on embedded devices is difficult due to their limited memory and computation resources. Redundancy in feature maps is an important characteristic of successful CNNs, but it has rarely been investigated in neural architecture design. This paper proposes a novel Ghost module to generate more feature maps from cheap operations. Based on a set of intrinsic feature maps, we apply a series of cheap linear transformations to generate many ghost feature maps that fully reveal the information underlying the intrinsic features. The proposed Ghost module can be used as a plug-and-play component to upgrade existing convolutional neural networks. Ghost bottlenecks are designed by stacking Ghost modules, from which the lightweight GhostNet can be easily established. Experiments conducted on benchmarks demonstrate that the proposed Ghost module is an impressive alternative to convolution layers in baseline models, and our GhostNet achieves higher recognition performance (75.7% top-1 accuracy) than MobileNetV3 at similar computational cost on the ImageNet ILSVRC-2012 classification dataset. Code is available at https://github.com/iamhankai/ghostnet.
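The abstract's core idea (an ordinary convolution producing a few intrinsic feature maps, followed by cheap linear transformations that generate the remaining "ghost" maps) can be sketched as follows. This is a minimal illustration, assuming PyTorch and using a depthwise convolution as the cheap transformation; the class name `GhostModule` and the `ratio` parameter here are illustrative, and the authors' reference implementation lives in the linked repository.

```python
import torch
import torch.nn as nn

class GhostModule(nn.Module):
    """Sketch of a Ghost module: a standard convolution produces a small
    set of intrinsic feature maps, then a cheap depthwise convolution
    derives additional ghost maps; the two sets are concatenated."""

    def __init__(self, in_ch, out_ch, ratio=2, kernel=1, dw_kernel=3):
        super().__init__()
        intrinsic = out_ch // ratio      # maps from the ordinary convolution
        ghost = out_ch - intrinsic       # maps produced by the cheap operation
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, intrinsic, kernel, padding=kernel // 2, bias=False),
            nn.BatchNorm2d(intrinsic),
            nn.ReLU(inplace=True),
        )
        # Depthwise convolution plays the role of the cheap linear transformation.
        self.cheap = nn.Sequential(
            nn.Conv2d(intrinsic, ghost, dw_kernel, padding=dw_kernel // 2,
                      groups=intrinsic, bias=False),
            nn.BatchNorm2d(ghost),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        y = self.primary(x)                        # intrinsic feature maps
        return torch.cat([y, self.cheap(y)], dim=1)  # intrinsic + ghost maps

x = torch.randn(1, 16, 32, 32)
out = GhostModule(16, 32)(x)
print(out.shape)  # torch.Size([1, 32, 32, 32])
```

With `ratio=2`, half of the output channels come from the full convolution and half from the depthwise pass, which is the source of the FLOP savings relative to a plain convolution with the same output width.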


Related research:

- GhostNets on Heterogeneous Devices via Cheap Operations (01/10/2022) — Deploying convolutional neural networks (CNNs) on mobile devices is diff...
- SlimConv: Reducing Channel Redundancy in Convolutional Neural Networks by Weights Flipping (03/16/2020) — The channel redundancy in feature maps of convolutional neural networks ...
- Ghost-dil-NetVLAD: A Lightweight Neural Network for Visual Place Recognition (12/22/2021) — Visual place recognition (VPR) is a challenging task with the unbalance ...
- RepBNN: towards a precise Binary Neural Network with Enhanced Feature Map via Repeating (07/19/2022) — Binary neural network (BNN) is an extreme quantization version of convol...
- RepGhost: A Hardware-Efficient Ghost Module via Re-parameterization (11/11/2022) — Feature reuse has been a key technique in light-weight convolutional neu...
- Combining 3D Image and Tabular Data via the Dynamic Affine Feature Map Transform (07/13/2021) — Prior work on diagnosing Alzheimer's disease from magnetic resonance ima...
- Trainable Spectrally Initializable Matrix Transformations in Convolutional Neural Networks (11/12/2019) — In this work, we investigate the application of trainable and spectrally...
