OLÉ: Orthogonal Low-rank Embedding, A Plug and Play Geometric Loss for Deep Learning

12/05/2017
by José Lezama, et al.

Deep neural networks trained with a softmax output layer and the cross-entropy loss are ubiquitous tools for image classification. Yet this setup does not naturally enforce intra-class similarity or an inter-class margin in the learned deep representations. Different solutions have been proposed in the literature to achieve these two goals simultaneously, such as the pairwise or triplet losses. However, such solutions carry the extra task of selecting pairs or triplets, and the extra computational burden of evaluating and learning from the many resulting combinations. In this paper, we propose a plug-and-play loss term for deep networks that explicitly reduces intra-class variance and enforces an inter-class margin simultaneously, in a simple and elegant geometric manner. For each class, the deep features are collapsed into a learned linear subspace, or a union of them, and the subspaces of different classes are pushed to be as orthogonal as possible. Our proposed Orthogonal Low-rank Embedding (OLÉ) does not require carefully crafted pairs or triplets of samples for training, and works standalone as a classification loss, being the first reported deep metric learning framework of its kind. Because of the improved margin between features of different classes, the resulting deep networks generalize better, are more discriminative, and are more robust. We demonstrate improved classification performance in general object recognition by plugging the proposed loss term into existing off-the-shelf architectures. In particular, we show the advantage of the proposed loss in the small-data/small-model scenario, and we significantly advance the state of the art on the Stanford STL-10 benchmark.
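The geometric objective described above, collapsing each class onto a low-rank subspace while pushing different classes toward orthogonal subspaces, can be expressed with matrix nuclear norms: the nuclear norm of a per-class feature matrix is small when the class is low-rank, while the nuclear norm of the full feature matrix is largest when class subspaces do not overlap. The NumPy sketch below is an illustrative rendering of that idea under these assumptions (the function name `ole_style_loss`, the floor parameter `delta`, and the normalization are choices made here for illustration); it is not the authors' implementation.

```python
import numpy as np

def ole_style_loss(features, labels, delta=1.0):
    """Illustrative OLE-style loss: sum of per-class nuclear norms minus
    the nuclear norm of the full feature matrix.

    features: (N, d) array of deep features
    labels:   (N,)  integer class labels
    delta:    floor on each per-class nuclear norm, so minimizing the loss
              cannot simply collapse a class's features to zero
    """
    total = 0.0
    for c in np.unique(labels):
        Xc = features[labels == c]
        # nuclear norm = sum of singular values; small when Xc is low-rank,
        # i.e. when the class features lie near a low-dimensional subspace
        total += max(delta, np.linalg.norm(Xc, ord="nuc"))
    # subtracting the global nuclear norm rewards inter-class orthogonality:
    # when class subspaces are mutually orthogonal, the global norm equals
    # the sum of the per-class norms and the loss reaches its minimum
    total -= np.linalg.norm(features, ord="nuc")
    return total / len(labels)
```

For example, two classes whose features lie on orthogonal axes yield a loss of zero, while two classes sharing the same direction yield a strictly positive loss, matching the intuition that orthogonal class subspaces are the optimum.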

