Neural Collapse with Cross-Entropy Loss

12/15/2020
by Jianfeng Lu et al.

We consider the variational problem of minimizing the cross-entropy loss over n feature vectors constrained to the unit hypersphere in ℝ^d. We prove that when d ≥ n − 1, the global minimum is attained by the simplex equiangular tight frame, which justifies the neural collapse behavior. We also show a connection with the frame potential of Benedetto and Fickus.
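As an illustrative sketch (not taken from the paper), the simplex equiangular tight frame for n points can be written as the columns of √(n/(n−1))·(I − (1/n)·11ᵀ); the snippet below constructs it with NumPy and checks its defining properties numerically — unit-norm vectors with all pairwise inner products equal to −1/(n−1). The function name `simplex_etf` is our own choice for illustration.

```python
import numpy as np

def simplex_etf(n):
    """Columns are the n vertices of the simplex equiangular tight frame.

    The vectors live in an (n-1)-dimensional subspace of R^n, so the
    construction is valid whenever the ambient dimension d >= n - 1.
    """
    I = np.eye(n)
    J = np.ones((n, n)) / n
    return np.sqrt(n / (n - 1)) * (I - J)

n = 5
M = simplex_etf(n)

# Each column has unit norm.
norms = np.linalg.norm(M, axis=0)

# The Gram matrix has 1 on the diagonal and -1/(n-1) off the diagonal,
# i.e. the frame is equiangular with the largest possible pairwise angle.
G = M.T @ M
```

Checking the Gram matrix this way also makes the connection to the frame potential concrete: the simplex ETF minimizes the sum of squared inner products among unit-norm configurations.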


research
10/11/2018

Taming the Cross Entropy Loss

We present the Tamed Cross Entropy (TCE) loss function, a robust derivat...
research
07/29/2019

Multi-Frame Cross-Entropy Training for Convolutional Neural Networks in Speech Recognition

We introduce Multi-Frame Cross-Entropy training (MFCE) for convolutional...
research
12/25/2020

Adaptively Solving the Local-Minimum Problem for Deep Neural Networks

This paper aims to overcome a fundamental problem in the theory and appl...
research
04/16/2022

The Tree Loss: Improving Generalization with Many Classes

Multi-class classification problems often have many semantically similar...
research
11/23/2022

Using Focal Loss to Fight Shallow Heuristics: An Empirical Analysis of Modulated Cross-Entropy in Natural Language Inference

There is no such thing as a perfect dataset. In some datasets, deep neur...
research
06/02/2020

Cross entropy as objective function for music generative models

The election of the function to optimize when training a machine learnin...
research
03/11/2023

Generalizing and Decoupling Neural Collapse via Hyperspherical Uniformity Gap

The neural collapse (NC) phenomenon describes an underlying geometric sy...
