Multi-hypothesis classifier

08/20/2019
by Sayantan Sengupta, et al.

Accuracy is among the most important parameters defining the effectiveness of a machine learning algorithm, and higher accuracy is always desirable. A vast number of well-established learning algorithms already exist, each with its own merits and demerits, which are evaluated in terms of accuracy, speed of convergence, algorithmic complexity, generalization, and robustness, among other criteria. Learning algorithms are also data-distribution dependent: each is suited to a particular distribution of data. Unfortunately, no classifier dominates across all data distributions, and the distribution of the task at hand is usually unknown. Moreover, no single classifier can be discriminative enough when the number of classes is large. The underlying problem is therefore that a single classifier is not enough to classify the whole sample space correctly. This thesis explores different techniques for combining classifiers so as to obtain optimal accuracy. Three classifiers are implemented: a plain nearest-neighbor classifier on raw pixels, a nearest-neighbor classifier on extracted structural features, and a nearest-neighbor classifier on extracted Gabor features. Five different combination strategies are devised, tested on Tibetan character images, and analyzed.
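One natural combination strategy for several nearest-neighbor classifiers is majority voting over their individual predictions. The sketch below is a minimal illustration of that idea, not the thesis's actual method: the feature extractors passed in are hypothetical stand-ins for the raw-pixel, structural, and Gabor representations mentioned above.

```python
# Hypothetical sketch: combine several 1-NN classifiers, each operating on
# a different feature representation, via majority voting.
from collections import Counter

def nearest_neighbor(train, query):
    """1-NN with squared Euclidean distance; train is [(features, label), ...]."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda pair: sq_dist(pair[0], query))[1]

def majority_vote(predictions):
    """Pick the most frequent label; ties resolve to the first-seen label."""
    return Counter(predictions).most_common(1)[0][0]

def combined_predict(feature_extractors, train_samples, raw_query):
    """Run one 1-NN per feature space and vote on the results.

    train_samples is a list of (raw_sample, label) pairs; each extractor
    maps a raw sample to its feature vector.
    """
    votes = []
    for extract in feature_extractors:
        train = [(extract(x), y) for x, y in train_samples]
        votes.append(nearest_neighbor(train, extract(raw_query)))
    return majority_vote(votes)
```

Other strategies the abstract alludes to (e.g. cascading or confidence-weighted combination) would replace `majority_vote` with a different aggregation rule while keeping the per-feature-space classifiers unchanged.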

