Latent Hinge-Minimax Risk Minimization for Inference from a Small Number of Training Samples

02/04/2017
by Dolev Raviv, et al.

Deep Learning (DL) methods perform very well when trained on large, balanced data sets. However, many practical problems involve imbalanced data sets and/or classes with a small number of training samples. In such settings the performance of DL methods, as well as of more traditional classifiers, drops significantly. Most existing solutions for imbalanced problems focus on customizing the data for training. A more principled solution is the mixed Hinge-Minimax risk [19], specifically designed to solve binary problems with imbalanced training sets. Here we propose a Latent Hinge-Minimax (LHM) risk and a training algorithm that generalize this paradigm to an ensemble of hyperplanes that can form arbitrarily complex, piecewise-linear decision boundaries. To extract good features, we combine the LHM model with a CNN via transfer learning. To solve multi-class problems, we map pre-trained category-specific LHM classifiers to a multi-class neural network and adjust the weights with very fast tuning. The LHM classifier enables the use of unlabeled data in its training, and the mapping allows for multi-class inference, resulting in a classifier that outperforms alternatives when trained on a small number of training samples.
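To give a feel for the hinge-minimax idea behind the abstract, the sketch below is a deliberately simplified, hypothetical illustration (not the paper's actual formulation or solver): it scores a single hyperplane by combining a hinge loss on the small positive class with a one-sided Chebyshev-style worst-case bound on the large negative class's false-positive rate, computed from the negatives' empirical mean and covariance. The data, the `lam` trade-off weight, and the crude random search are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced binary problem: few positives, many negatives (assumed data).
X_pos = rng.normal(loc=2.0, scale=1.0, size=(10, 2))    # small class
X_neg = rng.normal(loc=-1.0, scale=1.0, size=(500, 2))  # large class

def hinge_minimax_objective(w, b, X_pos, X_neg, lam=1.0):
    """Hinge loss on the small class plus a worst-case (minimax-style)
    bound on the large class's false-positive rate, derived from its
    empirical mean and covariance via a one-sided Chebyshev inequality."""
    # Hinge loss on positives: we want w.x - b >= 1.
    margins = X_pos @ w - b
    hinge = np.mean(np.maximum(0.0, 1.0 - margins))
    # Worst-case term: bound Pr(w.x >= b) over all distributions with the
    # negatives' mean mu and covariance Sigma: s^2 / (s^2 + (b - w.mu)^2).
    mu = X_neg.mean(axis=0)
    Sigma = np.cov(X_neg, rowvar=False)
    s2 = w @ Sigma @ w
    gap = b - w @ mu
    worst_fp = s2 / (s2 + gap**2) if gap > 0 else 1.0
    return hinge + lam * worst_fp

# Crude random search over hyperplanes, standing in for a real solver.
best = None
for _ in range(2000):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    b = rng.uniform(-3.0, 3.0)
    val = hinge_minimax_objective(w, b, X_pos, X_neg)
    if best is None or val < best[0]:
        best = (val, w, b)

val, w, b = best
pos_recall = np.mean(X_pos @ w - b > 0)
neg_accuracy = np.mean(X_neg @ w - b <= 0)
print(f"objective={val:.3f}  pos-recall={pos_recall:.2f}  neg-accuracy={neg_accuracy:.2f}")
```

The LHM model described in the abstract extends this single-hyperplane picture to an ensemble of such hyperplanes with latent assignments, which is what lets the decision boundary become piecewise linear; note also that the minimax term only needs the negatives' mean and covariance, which is why unlabeled data can contribute to training.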


