Bayesian Model Selection Methods for Mutual and Symmetric k-Nearest Neighbor Classification

08/14/2016
by   Hyun-Chul Kim, et al.

The k-nearest neighbor classification method (k-NNC) is one of the simplest nonparametric classification methods. The mutual k-NN classification method (MkNNC) is a variant of k-NNC based on mutual neighborship. We propose another variant of k-NNC, the symmetric k-NN classification method (SkNNC), which is based on both mutual and one-sided neighborship. The performance of MkNNC and SkNNC depends on the parameter k, just as that of k-NNC does. We propose ways in which MkNN and SkNN classification can be performed based on Bayesian mutual and symmetric k-NN regression methods, together with selection schemes for the parameter k. The Bayesian mutual and symmetric k-NN regression methods are based on Gaussian process models, and it turns out that they can perform MkNN and SkNN classification with new encodings of target values (class labels). Simulation results show that the proposed methods are better than or comparable to k-NNC, MkNNC, and SkNNC with the parameter k selected by leave-one-out cross-validation, not only on an artificial data set but also on real-world data sets.
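To make the neighborship relations concrete, the sketch below is a minimal NumPy implementation of plain mutual and symmetric k-NN classification, not the Bayesian Gaussian-process formulation proposed in the paper. The definitions used here (mutual neighborship as the intersection, and symmetric neighborship as the union, of the forward and backward k-NN relations) are assumptions inferred from the abstract, and the fallback to ordinary k-NN when the mutual neighbor set is empty is a hypothetical choice rather than the authors' rule.

```python
import numpy as np

def mutual_symmetric_knn_predict(X_train, y_train, x_query, k, mode="mutual"):
    """Classify x_query by majority vote over its mutual or symmetric k-NNs.

    Assumed definitions (not taken verbatim from the paper):
      - forward neighbors: training points among the query's k nearest neighbors;
      - backward neighbors: training points that have the query among their own
        k nearest neighbors (computed over the training set plus the query);
      - mutual neighborship = intersection, symmetric neighborship = union.
    """
    n = len(X_train)
    X_aug = np.vstack([X_train, x_query[None, :]])  # query gets index n

    # Forward relation: query -> training points.
    d_q = np.linalg.norm(X_train - x_query, axis=1)
    forward = set(np.argsort(d_q)[:k])

    # Backward relation: training points -> query.
    backward = set()
    for i in range(n):
        d_i = np.linalg.norm(X_aug - X_train[i], axis=1)
        d_i[i] = np.inf  # a point is not its own neighbor
        if n in np.argsort(d_i)[:k]:
            backward.add(i)

    neighbors = forward & backward if mode == "mutual" else forward | backward
    if not neighbors:          # the mutual neighbor set can be empty;
        neighbors = forward    # fall back to ordinary k-NN (an assumption)

    labels = y_train[list(neighbors)]
    values, counts = np.unique(labels, return_counts=True)
    return values[np.argmax(counts)]
```

A usage example: with X_train of shape (n, d) and integer labels y_train, calling mutual_symmetric_knn_predict(X_train, y_train, x_query, k=5, mode="symmetric") returns the majority label among the query's symmetric 5-nearest neighbors; the paper instead obtains these predictions from Bayesian k-NN regression with re-encoded class labels and selects k by Bayesian model selection rather than leave-one-out cross-validation.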
