Lower-bounded proper losses for weakly supervised classification

03/04/2021
by Shuhei M. Yoshida, et al.

This paper addresses weakly supervised classification, in which instances are given weak labels produced by some label-corruption process. The goal is to derive conditions under which loss functions for weak-label learning are proper and lower-bounded, two essential requirements for losses used in class-probability estimation. To this end, we derive a representation theorem for proper losses in supervised learning that dualizes the Savage representation. We use this theorem to characterize proper weak-label losses and find a condition under which they are lower-bounded. Based on these theoretical findings, we derive a novel regularization scheme called generalized logit squeezing, which makes any proper weak-label loss bounded from below without losing properness. Furthermore, we experimentally demonstrate the effectiveness of our proposed approach compared to improper or unbounded losses. These results highlight the importance of properness and lower-boundedness. The code is publicly available at https://github.com/yoshum/lower-bounded-proper-losses.
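To illustrate why lower-boundedness matters, the sketch below is a hypothetical toy example, not the paper's actual construction: it uses a backward-corrected cross-entropy (a standard proper weak-label loss that becomes unbounded below when the inverse noise matrix has negative entries) and adds a quadratic penalty on the logits in the spirit of logit squeezing. The correction matrix `T`, the penalty weight `lam`, and the function names are all assumptions made for the illustration.

```python
import numpy as np

def log_softmax(logits):
    """Numerically stable log-softmax for a 1-D logit vector."""
    z = logits - logits.max()
    return z - np.log(np.exp(z).sum())

def backward_corrected_ce(logits, weak_label, T_inv):
    """Backward-corrected cross-entropy: per-class log losses reweighted
    by a row of T^{-1}, whose entries can be negative."""
    return T_inv[weak_label] @ (-log_softmax(logits))

def squeezed_loss(logits, weak_label, T_inv, lam=0.05):
    """Corrected loss plus a quadratic logit penalty -- a hypothetical
    stand-in for the paper's generalized logit squeezing."""
    return backward_corrected_ce(logits, weak_label, T_inv) + lam * np.sum(logits ** 2)

# Symmetric binary label noise with flip probability 0.3;
# T^{-1} then has negative off-diagonal entries.
rho = 0.3
T = np.array([[1 - rho, rho], [rho, 1 - rho]])
T_inv = np.linalg.inv(T)

# Scaling the logits up drives the corrected loss toward -infinity,
# while the penalized loss stays bounded below.
for t in (1.0, 10.0, 100.0):
    logits = np.array([t, -t])
    print(t, backward_corrected_ce(logits, 0, T_inv), squeezed_loss(logits, 0, T_inv))
```

Minimizing the uncorrected part alone rewards arbitrarily large logits, which is exactly the failure mode a lower-bounded loss rules out; the quadratic penalty grows faster than the linear decrease, so the regularized objective has a finite minimum.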


