COLAM: Co-Learning of Deep Neural Networks and Soft Labels via Alternating Minimization

04/26/2020
by Xingjian Li, et al.

Softening the labels of training datasets with respect to data representations has been frequently used to improve the training of deep neural networks (DNNs). While this practice has been studied as a way to leverage privileged information about the distribution of the data, a well-trained learner with soft classification outputs must first be obtained as a prior to generate such privileged information. To solve this chicken-and-egg problem, we propose COLAM, a framework that Co-Learns DNNs and soft labels through Alternating Minimization of two objectives - (a) the training loss subject to the soft labels and (b) the objective for learning improved soft labels - in a single end-to-end training procedure. We conducted extensive experiments comparing the proposed method with a series of baselines. The results show that COLAM achieves better test classification accuracy on many tasks. We also provide both qualitative and quantitative analyses that explain why COLAM works well.
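As a rough illustration of the alternating-minimization scheme described above, here is a minimal PyTorch-style sketch for a classification setting. All names and update rules (colam_style_epoch, alpha, lr_soft, the convex-combination soft-label refinement, and a data loader that yields sample indices) are assumptions for illustration only, not the paper's actual objectives or implementation.

```python
# Minimal sketch of alternating minimization between (a) network weights
# trained against current soft labels and (b) the soft labels themselves.
# Hypothetical names and update rules; not the paper's exact method.
import torch
import torch.nn.functional as F

def colam_style_epoch(model, optimizer, loader, soft_labels, alpha=0.9, lr_soft=0.1):
    """One epoch: alternately update the DNN weights and the soft labels.

    soft_labels: tensor of shape [num_samples, num_classes] holding the
    current per-sample soft targets; loader yields (inputs, hard_targets, idx).
    """
    model.train()
    for inputs, hard_targets, idx in loader:
        # (a) minimize the training loss subject to the current soft labels
        logits = model(inputs)
        log_probs = F.log_softmax(logits, dim=1)
        loss_soft = F.kl_div(log_probs, soft_labels[idx], reduction="batchmean")
        loss_hard = F.cross_entropy(logits, hard_targets)
        loss = alpha * loss_soft + (1 - alpha) * loss_hard
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # (b) refine the soft labels from the (detached) model predictions,
        # while keeping them anchored to the ground-truth class
        with torch.no_grad():
            probs = F.softmax(model(inputs), dim=1)
            one_hot = F.one_hot(hard_targets, probs.size(1)).float()
            soft_labels[idx] = (1 - lr_soft) * soft_labels[idx] + lr_soft * (
                0.5 * probs + 0.5 * one_hot
            )
    return soft_labels
```

In this sketch the two objectives are alternated within each mini-batch; the same structure could instead alternate per epoch, with the soft-label update performed in a separate pass over the data.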
