Sample Efficient Model Evaluation

09/24/2021
by Emine Yilmaz et al.

Labelling data is a major practical bottleneck in training and testing classifiers. Given a collection of unlabelled data points, we address how to select which subset to label so as to best estimate test metrics such as accuracy, F_1 score, or micro/macro F_1. We consider two sampling-based approaches: the well-known importance sampling, and a novel application of Poisson sampling. For both approaches we derive the minimal-error sampling distributions and show how to approximate and use them to form estimators and confidence intervals. We show that Poisson sampling outperforms importance sampling both theoretically and experimentally.
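As a concrete illustration of the two estimators the abstract compares, the sketch below estimates classifier accuracy from a small labelled subset using (i) importance sampling with a proposal distribution q and (ii) Poisson sampling with Horvitz-Thompson reweighting. The uniform choices of q and of the inclusion probabilities pi here are placeholder assumptions for illustration only; the paper's contribution is deriving the minimal-error versions of these distributions, which this sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ground truth: per-point correctness indicators a_i, which in
# practice are unknown until a point is labelled. Here we simulate a
# classifier that is ~80% accurate so the estimates can be sanity-checked.
N = 10_000
a = (rng.random(N) < 0.8).astype(float)
true_accuracy = a.mean()

# --- Importance sampling (with replacement) --------------------------
# q is a proposal distribution over the pool; uniform for illustration.
# The estimator averages a_i / (N * q_i) over the n sampled points and
# is unbiased for the pool accuracy (1/N) * sum_i a_i.
q = np.full(N, 1.0 / N)
n = 500
idx = rng.choice(N, size=n, replace=True, p=q)
is_estimate = np.mean(a[idx] / (N * q[idx]))

# --- Poisson sampling ------------------------------------------------
# Each point i is labelled independently with probability pi_i; the
# Horvitz-Thompson estimator reweights included points by 1 / pi_i,
# giving another unbiased estimate with a random labelling budget.
pi = np.full(N, n / N)  # uniform inclusion probabilities, ~n labels in expectation
included = rng.random(N) < pi
pois_estimate = np.sum(a[included] / pi[included]) / N
```

With uniform q and pi the two estimators reduce to simple sample means; the variance reduction the paper analyses comes from replacing these uniform distributions with the derived minimal-error ones, which concentrate labelling effort where it most reduces estimator error.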


Related research

- Importance Sampling with Unequal Support (11/10/2016)
- A general framework for label-efficient online evaluation with asymptotic guarantees (06/12/2020)
- Class-specific Poisson denoising by patch-based importance sampling (06/09/2017)
- Robust Importance Sampling for Error Estimation in the Context of Optimal Bayesian Transfer Learning (09/05/2021)
- Low-Shot Validation: Active Importance Sampling for Estimating Classifier Performance on Rare Categories (09/13/2021)
- DISCount: Counting in Large Image Collections with Detector-Based Importance Sampling (06/05/2023)
- Support Estimation with Sampling Artifacts and Errors (06/14/2020)
