Learning Discriminative Features Via Weights-biased Softmax Loss

04/25/2019
by   XiaoBin Li, et al.
Loss functions play a key role in training superior deep neural networks. In convolutional neural networks (CNNs), the popular cross-entropy loss combined with softmax does not explicitly guarantee minimization of intra-class variance or maximization of inter-class variance. Moreover, earlier studies provide no theoretical analysis or experiments explicitly indicating how to choose the number of units in the fully connected (FC) layer. To help CNNs learn features that are more discriminative, and to learn them faster, this paper makes two contributions. First, we determine the minimum number of units in the FC layer through rigorous theoretical analysis and extensive experiments, which reduces CNNs' parameter memory and training time. Second, we propose a negative-focused weights-biased softmax (W-Softmax) loss that helps CNNs learn more discriminative features. The proposed W-Softmax loss not only theoretically formulates intra-class compactness and inter-class separability, but also avoids overfitting by enlarging decision margins. Moreover, the size of the decision margins can be flexibly controlled by adjusting a hyperparameter α. Extensive experimental results on several benchmark datasets show the superiority of W-Softmax in image classification tasks.
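The abstract does not give the exact formula, but the idea of a negative-focused, weights-biased softmax can be sketched as follows. This is an illustrative assumption, not the paper's verbatim definition: for each negative class j ≠ y, the (unit-normalized) weight vector is biased toward the target class weight, w'_j = normalize(w_j + α·w_y), which raises the negative logits during training and thereby enforces a larger decision margin; α = 0 recovers an ordinary normalized softmax.

```python
import numpy as np

def w_softmax_loss(x, W, y, alpha=0.5):
    """Illustrative weights-biased softmax loss (hedged sketch, not the
    paper's exact formulation).

    x     : feature vector of one sample
    W     : weight matrix with one column per class
    y     : index of the target class
    alpha : margin hyperparameter; alpha = 0 gives plain normalized softmax
    """
    # Normalize the feature and the class weight vectors.
    x = x / np.linalg.norm(x)
    W = W / np.linalg.norm(W, axis=0, keepdims=True)

    logits = np.empty(W.shape[1])
    for j in range(W.shape[1]):
        if j == y:
            w = W[:, j]  # target logit uses the original weight
        else:
            # Bias the negative-class weight toward the target weight,
            # making the negative class "harder" and enlarging the margin.
            b = W[:, j] + alpha * W[:, y]
            w = b / np.linalg.norm(b)
        logits[j] = w @ x

    # Standard cross-entropy over the biased logits.
    logits -= logits.max()
    p = np.exp(logits) / np.exp(logits).sum()
    return -np.log(p[y])
```

Under this sketch, a correctly classified sample incurs a larger loss for α > 0 than for α = 0, since the biased negative weights align more with the feature; training against this stricter objective pushes features toward larger inter-class separation.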

Related research:

- Git Loss for Deep Face Recognition (07/23/2018)
- Deep Convolutional Decision Jungle for Image Classification (06/06/2017)
- Anchor-based Nearest Class Mean Loss for Convolutional Neural Networks (04/22/2018)
- Angular Learning: Toward Discriminative Embedded Features (12/17/2019)
- Maximally Compact and Separated Features with Regular Polytope Networks (01/15/2023)
- OSLNet: Deep Small-Sample Classification with an Orthogonal Softmax Layer (04/20/2020)
- Learning to Find Correlated Features by Maximizing Information Flow in Convolutional Neural Networks (06/30/2019)
