Role of Orthogonality Constraints in Improving Properties of Deep Networks for Image Classification

09/22/2020
by Hongjun Choi, et al.

Standard deep learning models trained with the categorical cross-entropy loss are known to perform well at image classification tasks. However, the models thus obtained often exhibit issues like feature redundancy, low interpretability, and poor calibration. A body of recent work has tried to address some of these challenges by proposing new regularization functions used in addition to the cross-entropy loss. In this paper, we present some surprising findings that emerge from exploring the role of simple orthogonality constraints as a means of imposing physics-motivated constraints common in imaging. We propose an Orthogonal Sphere (OS) regularizer that emerges from physics-based latent representations under simplifying assumptions. Under further simplifying assumptions, the OS constraint can be written in closed form as a simple orthonormality term and used along with the cross-entropy loss function. The findings indicate that the orthonormality loss results in a) rich and diverse feature representations, b) robustness to feature sub-selection, c) better semantic localization in the class activation maps, and d) reduced model calibration error. We demonstrate the effectiveness of the proposed OS regularization with quantitative and qualitative results on four benchmark datasets: CIFAR10, CIFAR100, SVHN, and Tiny ImageNet.
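The closed-form orthonormality term is not spelled out on this page, so the following is only a minimal PyTorch sketch of the general idea: it assumes the OS penalty is the squared Frobenius distance between the Gram matrix of l2-normalized mini-batch features and the identity, added to the cross-entropy loss with a weighting coefficient. The names orthogonal_sphere_penalty, training_loss, and lam are illustrative, and the paper's exact formulation may differ (for example, it may be applied along feature dimensions rather than across batch samples).

    import torch
    import torch.nn.functional as F

    def orthogonal_sphere_penalty(features):
        # features: (batch_size, feature_dim) activations, e.g. from the penultimate layer
        z = F.normalize(features, dim=1)          # map each feature vector onto the unit sphere
        gram = z @ z.t()                          # pairwise cosine similarities, shape (batch, batch)
        eye = torch.eye(z.size(0), device=z.device)
        return ((gram - eye) ** 2).sum()          # squared Frobenius norm of (Z Z^T - I)

    def training_loss(logits, features, targets, lam=0.1):
        # cross-entropy plus the orthonormality term; lam is a hypothetical weight
        return F.cross_entropy(logits, targets) + lam * orthogonal_sphere_penalty(features)

In a training loop, logits and features would come from the same forward pass over a mini-batch, so the penalty encourages the normalized representations of different samples to be mutually orthogonal while the cross-entropy term drives classification accuracy.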
