Nearest Neighbor-based Importance Weighting

02/03/2021
by Marco Loog, et al.

Importance weighting is widely applicable in machine learning in general, and in techniques dealing with covariate shift problems in particular. We present a novel, direct approach to determining such importance weights. It relies on a nearest neighbor classification scheme and is relatively straightforward to implement. Comparative experiments on various classification tasks demonstrate the effectiveness of our so-called nearest neighbor weighting (NNeW) scheme. Given its performance, our procedure can serve as a simple and effective baseline method for importance weighting.
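The abstract does not spell out the weighting rule, but a natural nearest-neighbor scheme of this kind assigns each training (source) sample a weight proportional to the number of test (target) samples whose nearest source neighbor it is, i.e., the count of target points in its Voronoi cell. The sketch below is a hypothetical illustration of that idea, not the authors' exact procedure; the function name `nnew_weights` and the normalization choice are assumptions.

```python
import numpy as np

def nnew_weights(X_source, X_target):
    """Hypothetical sketch of nearest-neighbor importance weighting.

    Each source sample's weight is the number of target samples whose
    nearest source neighbor it is (its Voronoi cell count), rescaled so
    that the weights average to 1 over the source set.
    """
    # Pairwise squared Euclidean distances: target rows vs. source columns.
    d2 = ((X_target[:, None, :] - X_source[None, :, :]) ** 2).sum(axis=-1)
    # Index of the nearest source point for every target point.
    nearest = d2.argmin(axis=1)
    # Count how many target points fall into each source point's cell.
    counts = np.bincount(nearest, minlength=len(X_source)).astype(float)
    # Normalize so the mean weight equals 1.
    return counts * len(X_source) / counts.sum()
```

Under covariate shift, target samples cluster in regions the source distribution undersamples, so source points near those regions collect more target neighbors and receive proportionally larger weights, which can then be plugged into any weighted empirical risk minimizer.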

