
Role of Orthogonality Constraints in Improving Properties of Deep Networks for Image Classification

by Hongjun Choi et al.

Standard deep learning models that employ the categorical cross-entropy loss are known to perform well at image classification tasks. However, the models thus obtained often exhibit issues such as feature redundancy, low interpretability, and poor calibration. A body of recent work has tried to address some of these challenges by proposing new regularization functions to be used in addition to the cross-entropy loss. In this paper, we present some surprising findings that emerge from exploring the role of simple orthogonality constraints as a means of imposing physics-motivated constraints common in imaging. We propose an Orthogonal Sphere (OS) regularizer that emerges from physics-based latent representations under simplifying assumptions. Under further simplifying assumptions, the OS constraint can be written in closed form as a simple orthonormality term and used along with the cross-entropy loss function. The findings indicate that the orthonormality loss results in a) rich and diverse feature representations, b) robustness to feature sub-selection, c) better semantic localization in the class activation maps, and d) reduced model calibration error. We demonstrate the effectiveness of the proposed OS regularization with quantitative and qualitative results on four benchmark datasets: CIFAR10, CIFAR100, SVHN, and tiny ImageNet.
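The abstract describes adding a closed-form orthonormality term to the cross-entropy loss. The paper's exact OS formulation is not given here, so the following is a minimal sketch of one common soft-orthonormality penalty, ||FFᵀ − I||²_F on row-normalized features, combined with a hypothetical cross-entropy value via a weighting coefficient `lam` (both the penalty form and the weight are illustrative assumptions, not the authors' exact method):

```python
import numpy as np

def orthogonality_penalty(features, eps=1e-8):
    """Soft orthonormality penalty on a batch of latent features.

    features: (batch, dim) array of latent representations.
    Rows are normalized to unit length, then the penalty is
    ||F F^T - I||_F^2, which is zero iff the normalized features
    are mutually orthogonal. This is a generic sketch, not the
    paper's exact OS regularizer.
    """
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    f = features / (norms + eps)          # unit-norm rows
    gram = f @ f.T                        # pairwise cosine similarities
    identity = np.eye(f.shape[0])
    return float(np.sum((gram - identity) ** 2))

def total_loss(cross_entropy, features, lam=0.1):
    # Combined objective: standard CE plus the weighted orthonormality term.
    # `lam` is an assumed hyperparameter balancing the two terms.
    return cross_entropy + lam * orthogonality_penalty(features)
```

Orthogonal features incur (near-)zero penalty, while redundant (parallel) features are penalized, which matches the paper's stated goal of encouraging rich, diverse representations.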




Related papers

- Taming the Cross Entropy Loss
  We present the Tamed Cross Entropy (TCE) loss function, a robust derivat...

- AMC-Loss: Angular Margin Contrastive Loss for Improved Explainability in Image Classification
  Deep-learning architectures for classification problems involve the cros...

- What's in a Loss Function for Image Classification?
  It is common to use the softmax cross-entropy loss to train neural netwo...

- Loss Functions for Classification using Structured Entropy
  Cross-entropy loss is the standard metric used to train classification m...

- SimLoss: Class Similarities in Cross Entropy
  One common loss function in neural network classification tasks is Categ...

- Forced Spatial Attention for Driver Foot Activity Classification
  This paper provides a simple solution for reliably solving image classif...

- Understanding Square Loss in Training Overparametrized Neural Network Classifiers
  Deep learning has achieved many breakthroughs in modern classification t...