Evaluation of Neural Architectures Trained with Square Loss vs Cross-Entropy in Classification Tasks

06/12/2020
by Like Hui, et al.

Modern neural architectures for classification tasks are trained using the cross-entropy loss, which is widely believed to be empirically superior to the square loss. In this work we provide evidence that this belief may not be well-founded. We explore several major neural architectures and a range of standard benchmark datasets for NLP, automatic speech recognition (ASR), and computer vision tasks, and show that these architectures, with the same hyper-parameter settings as reported in the literature, perform comparably or better when trained with the square loss, even after equalizing computational resources. Indeed, the square loss produces better results in the large majority of NLP and ASR experiments, while cross-entropy appears to have a slight edge on computer vision tasks. We argue that there is little compelling empirical or theoretical evidence of a clear-cut advantage for the cross-entropy loss; in our experiments, performance on nearly all non-vision tasks can be improved, sometimes significantly, by switching to the square loss. We posit that training with the square loss for classification should be part of the best practices of modern deep learning, on an equal footing with cross-entropy.
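To make the switch the abstract describes concrete, here is a minimal sketch (not the authors' code) of training the same classifier under either loss in PyTorch. The only change is replacing cross-entropy with the mean squared error between the network's logits and one-hot labels; the model, data, and hyper-parameters below are toy placeholders, not settings from the paper.

```python
# Minimal sketch: training one classifier with either cross-entropy or the
# square loss on one-hot targets. Model and data are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

def square_loss(logits, targets, num_classes):
    """Mean squared error between raw logits and one-hot labels."""
    one_hot = F.one_hot(targets, num_classes).float()
    return F.mse_loss(logits, one_hot)

# Toy setup: a linear classifier on random data, purely for illustration.
num_classes, dim = 10, 32
model = nn.Linear(dim, num_classes)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(256, dim)
y = torch.randint(0, num_classes, (256,))

use_square_loss = True  # flip to False for the cross-entropy baseline
for step in range(100):
    opt.zero_grad()
    logits = model(x)
    if use_square_loss:
        loss = square_loss(logits, y, num_classes)
    else:
        loss = F.cross_entropy(logits, y)  # softmax + NLL on the same logits
    loss.backward()
    opt.step()
```

Note that the square loss here is applied directly to the logits, with no softmax; everything else (architecture, optimizer, schedule) is held fixed, which is the comparison the paper advocates.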
