Robustness of different loss functions and their impact on networks' learning capability

10/15/2021
by Vishal Rajput, et al.

Recent developments in AI have made it ubiquitous: nearly every industry is trying to adopt some form of intelligent processing of its data. Despite many advances in the field, AI's full capability is yet to be exploited by industry. Industries that involve risk remain cautious about deploying AI because of a lack of trust in such autonomous systems. Present-day AI may be very good at many tasks, but it is poor at reasoning, and this weakness can lead to catastrophic outcomes: an autonomous car crashing into a pedestrian or a drone getting stuck in a tree are examples where AI decisions have gone badly wrong. To develop insight into, and generate explanations of, the learning capability of AI, we analyze the working of loss functions. We consider two sets of loss functions: generalized loss functions such as binary cross-entropy (BCE), and specialized loss functions such as Dice loss and focal loss. Through a series of experiments, we establish whether combining different loss functions is better than using a single loss function and, if so, why. To establish the difference between generalized and specialized losses, we train several models using the above-mentioned losses and compare their robustness on adversarial examples. In particular, we examine how fast the accuracy of different models decreases when we perturb the pixels corresponding to the most salient gradients.
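The generalized and specialized losses the abstract contrasts can be sketched in plain NumPy. This is a minimal illustration, not the paper's exact formulation: the function names, the (1 - p_t)^gamma focal weighting with gamma = 2, and the equal-weight BCE+Dice combination are all assumptions made here for clarity.

```python
import numpy as np

def bce_loss(p, y, eps=1e-7):
    # Binary cross-entropy: a "generalized" loss averaged over all pixels.
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def dice_loss(p, y, eps=1e-7):
    # Dice loss: 1 minus the (soft) Dice coefficient; a "specialized"
    # overlap-based loss that is insensitive to a large background class.
    inter = np.sum(p * y)
    return 1.0 - (2.0 * inter + eps) / (np.sum(p) + np.sum(y) + eps)

def focal_loss(p, y, gamma=2.0, eps=1e-7):
    # Focal loss: BCE reweighted by (1 - p_t)^gamma so easy, already
    # well-classified pixels contribute less and hard ones dominate.
    p = np.clip(p, eps, 1 - eps)
    pt = np.where(y == 1, p, 1 - p)
    return -np.mean((1 - pt) ** gamma * np.log(pt))

def combined_loss(p, y, alpha=0.5):
    # One common way to combine losses: a weighted sum of BCE and Dice.
    return alpha * bce_loss(p, y) + (1 - alpha) * dice_loss(p, y)
```

Note how the focal term downweights confident predictions: for a pixel predicted at p_t = 0.9, the BCE contribution is scaled by (1 - 0.9)^2 = 0.01.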
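The robustness test the abstract describes, perturbing the inputs with the most salient gradients, can likewise be sketched. Here a one-parameter-layer logistic model stands in for a trained network so the input gradient is analytic; `saliency_attack`, the top-k selection, and the step size are hypothetical choices for illustration, not the paper's actual attack.

```python
import numpy as np

def saliency_attack(x, w, b, y, k=2, step=0.5):
    # Toy logistic model: p = sigmoid(w.x + b). For BCE, the gradient of
    # the loss with respect to the input is analytic: dL/dx = (p - y) * w.
    p = 1.0 / (1.0 + np.exp(-(x @ w + b)))
    grad = (p - y) * w
    # Saliency = |gradient|. Perturb only the k most salient inputs,
    # each in the sign direction that increases the loss.
    idx = np.argsort(-np.abs(grad))[:k]
    x_adv = x.copy()
    x_adv[idx] += step * np.sign(grad[idx])
    return x_adv
```

Repeating this for increasing k and plotting accuracy against the number of perturbed pixels gives the kind of degradation curve the experiments compare across losses.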


