Bias Loss for Mobile Neural Networks

07/23/2021
by   Lusine Abrahamyan, et al.

Compact convolutional neural networks (CNNs) have seen exceptional performance improvements in recent years. However, they still cannot match the predictive power of CNNs with a large number of parameters. The diverse and even abundant features captured by their layers are an important characteristic of these successful large CNNs, yet differences in this characteristic between large CNNs and their compact counterparts have rarely been investigated. In compact CNNs, the limited number of parameters makes abundant features unlikely to be obtained, so feature diversity becomes an essential characteristic. Diverse features present in the activation maps derived from a data point during model inference may indicate the presence of a set of unique descriptors necessary to distinguish between objects of different classes. In contrast, data points with low feature diversity may not provide a sufficient number of unique descriptors to make a valid prediction; we refer to such predictions as random predictions. Random predictions can negatively impact the optimization process and harm the final performance. This paper addresses the problem raised by random predictions by reshaping the standard cross-entropy to make it biased toward data points with a limited number of unique descriptive features. Our novel bias loss focuses the training on a set of valuable data points and prevents the vast number of samples with poor learning features from misleading the optimization process. Furthermore, to show the importance of diversity, we present a family of SkipNet models whose architectures are designed to increase the number of unique descriptors in the last layers. Our SkipNet-M can achieve 1% higher classification accuracy than MobileNetV3 Large.
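The core idea of a diversity-weighted cross-entropy can be illustrated with a minimal NumPy sketch. This is not the paper's exact formulation: here the diversity score is assumed to be the per-sample variance of pooled activations, min-max normalized over the batch, and the weighting function `exp(alpha * v) - beta` with hyperparameters `alpha` and `beta` is a hypothetical choice that up-weights high-diversity samples and down-weights low-diversity ("random prediction") samples.

```python
import numpy as np

def softmax_cross_entropy(logits, targets):
    """Numerically stable per-sample cross-entropy (no reduction)."""
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets]

def diversity_scores(features):
    """Per-sample activation variance, min-max normalized over the batch."""
    v = features.var(axis=1)
    return (v - v.min()) / (v.max() - v.min() + 1e-8)

def bias_loss(logits, targets, features, alpha=0.3, beta=0.3):
    """Cross-entropy weighted by feature diversity (illustrative sketch).

    Samples whose activations show higher variance (more diverse features)
    receive a larger weight, so low-diversity samples contribute less to
    the optimization signal.
    """
    v = diversity_scores(features)
    w = np.exp(alpha * v) - beta  # hypothetical weighting function
    return float(np.mean(w * softmax_cross_entropy(logits, targets)))
```

For example, in a batch where one sample's feature vector is high-variance and another's is nearly constant, `diversity_scores` assigns the first a score near 1 and the second a score near 0, so the first sample dominates the averaged loss.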

