A Maximum Log-Likelihood Method for Imbalanced Few-Shot Learning Tasks

11/26/2022
by Samuel Hess, et al.

Few-shot learning is a rapidly evolving area of machine learning research in which the goal is to classify unlabeled data given only one or a few labeled exemplary samples. Neural networks are typically trained to minimize a distance metric between labeled exemplary samples and a query set. Early few-shot approaches use an episodic training process that sub-samples the training data into few-shot batches, matching the sub-sampling performed at evaluation. Recently, conventional supervised training coupled with a cosine distance has achieved superior few-shot performance. Despite the diversity of few-shot approaches over the past decade, most methods still rely on a cosine or Euclidean distance layer between the latent features of the trained network. In this work, we investigate the distributions of trained few-shot features and demonstrate that they can be roughly approximated as exponential distributions. Under this assumption, we propose a new maximum log-likelihood metric for few-shot architectures. We demonstrate that the proposed metric achieves superior accuracy over conventional similarity metrics (e.g., cosine and Euclidean distance) and achieves state-of-the-art inductive few-shot performance. Additional gains can be obtained by carefully combining multiple metrics, and neither of our methods requires the post-processing feature transformations that are common to many algorithms. Finally, we demonstrate a novel iterative algorithm designed around our maximum log-likelihood approach that achieves state-of-the-art transductive few-shot performance when the evaluation data is imbalanced. Our code is publicly available at https://github.com/samuelhess/MLL_FSL/.
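To make the exponential-distribution idea concrete, the sketch below shows one way a maximum log-likelihood score could replace a cosine or Euclidean distance layer. It is a minimal illustration, not the authors' implementation: it assumes non-negative (e.g., post-ReLU) features, fits an independent per-dimension exponential rate to each class's support features by maximum likelihood, and scores each query by its summed log-likelihood under each class model. All shapes and names here are hypothetical.

import numpy as np

def exponential_log_likelihood(query, support, eps=1e-8):
    """Score queries against classes under a per-dimension exponential model.

    query:   (n_query, d)         non-negative query features
    support: (n_way, n_shot, d)   non-negative support features per class
    returns: (n_query, n_way)     log-likelihood scores (higher = more likely)
    """
    # MLE of the exponential rate per class and dimension: lambda = 1 / mean
    mu = support.mean(axis=1) + eps          # (n_way, d)
    rate = 1.0 / mu                          # (n_way, d)

    # log p(z | class c) = sum_d [ log(rate_cd) - rate_cd * z_d ]
    log_rate = np.log(rate)                  # (n_way, d)
    scores = (log_rate[None, :, :]
              - query[:, None, :] * rate[None, :, :]).sum(axis=-1)
    return scores                            # (n_query, n_way)

# Toy usage with random non-negative features (5-way, 1-shot, 64-dim)
rng = np.random.default_rng(0)
support = rng.exponential(scale=1.0, size=(5, 1, 64))
query = rng.exponential(scale=1.0, size=(10, 64))
pred = exponential_log_likelihood(query, support).argmax(axis=1)
print(pred.shape)  # (10,)

The classification rule is then a simple argmax over the per-class scores, playing the same role that the nearest cosine or Euclidean prototype plays in conventional few-shot pipelines.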

