
A Computationally Efficient Classification Algorithm in Posterior Drift Model: Phase Transition and Minimax Adaptivity

11/09/2020
by   Ruiqi Liu, et al.

In massive data analysis, training and testing data often come from very different sources, and their probability distributions are not necessarily identical. A representative example is nonparametric classification under the posterior drift model, where the conditional distributions of the label given the covariates may differ between the training and testing samples. In this paper, we derive the minimax rate of the excess risk for nonparametric classification under posterior drift in the setting where both the training and testing data have smooth distributions, extending recent work by Cai and Wei (2019), who impose a smoothness condition only on the distribution of the testing data. The minimax rate exhibits a phase transition characterized by the relationship between the smoothness orders of the training and testing data distributions. We also propose a computationally efficient and data-driven nearest neighbor classifier that achieves the minimax excess risk (up to a logarithmic factor). Simulation studies and a real-world application demonstrate our approach.
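The proposed classifier is nearest-neighbor based. As a rough illustration only, here is a vanilla k-nearest-neighbor rule in Python; this is not the paper's data-driven procedure, which additionally selects the number of neighbors adaptively to the unknown smoothness orders. All names and the toy data below are invented for the sketch.

```python
import numpy as np

def knn_classify(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest training points.

    A plain k-NN rule with Euclidean distance; the paper's classifier
    additionally chooses k in a data-driven way (not reproduced here).
    """
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote over the neighbors' binary labels
    return int(np.mean(y_train[nearest]) >= 0.5)

# Toy example: two well-separated Gaussian clusters, labels 0 and 1
rng = np.random.default_rng(0)
X0 = rng.normal(loc=0.0, scale=0.3, size=(20, 2))
X1 = rng.normal(loc=3.0, scale=0.3, size=(20, 2))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 20 + [1] * 20)

print(knn_classify(X_train, y_train, np.array([0.1, 0.0]), k=5))  # near cluster 0
print(knn_classify(X_train, y_train, np.array([2.9, 3.1]), k=5))  # near cluster 1
```

In the posterior drift setting, the training labels are drawn from a conditional distribution that may differ from the testing one, so the vote threshold and neighborhood size must be tuned to both smoothness orders; the sketch above ignores that and uses a fixed k and a 1/2 threshold.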
