Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels

05/10/2021
by Erik Englesson, et al.

We propose two novel loss functions based on Jensen-Shannon divergence for learning under label noise. Following the work of Ghosh et al. (2017), we argue for their theoretical robustness. Furthermore, we reveal several other desirable properties by drawing informative connections to various loss functions, e.g., cross entropy, mean absolute error, generalized cross entropy, symmetric cross entropy, label smoothing, and, most importantly, consistency regularization. We conduct extensive and systematic experiments using both synthetic (CIFAR) and real (WebVision) noise and demonstrate significant and consistent improvements over other loss functions. We also report several informative side experiments that highlight the different theoretical properties.
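As a rough illustration of the underlying idea, the sketch below implements a weighted two-distribution Jensen-Shannon divergence loss in PyTorch. The function name, the pi weighting parameter, and the epsilon smoothing are illustrative choices not taken from the paper; the paper's generalized version extends this to more than two distributions (e.g., predictions on multiple augmentations, tying it to consistency regularization) and uses its own normalization.

```python
import torch
import torch.nn.functional as F

def js_divergence_loss(logits, targets, pi=0.5, eps=1e-7):
    """Weighted Jensen-Shannon divergence between the one-hot label
    distribution and the model's predictive distribution.

    logits:  (batch, classes) raw model outputs
    targets: (batch,) integer class labels
    pi:      weight on the label distribution (1 - pi on the prediction)
    """
    num_classes = logits.size(1)
    p = F.one_hot(targets, num_classes).float()   # label distribution
    q = F.softmax(logits, dim=1)                  # predicted distribution
    m = pi * p + (1.0 - pi) * q                   # weighted mixture

    # Weighted sum of KL(p || m) and KL(q || m).
    kl_pm = (p * (torch.log(p + eps) - torch.log(m + eps))).sum(dim=1)
    kl_qm = (q * (torch.log(q + eps) - torch.log(m + eps))).sum(dim=1)
    js = pi * kl_pm + (1.0 - pi) * kl_qm
    return js.mean()
```

In this two-distribution form, shifting the weight between the label distribution and the prediction is what underlies the connections to cross entropy and mean absolute error mentioned in the abstract; the exact limits and normalization constants are worked out in the paper.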
