MNIST Dataset Classification Utilizing k-NN Classifier with Modified Sliding Window Metric

09/18/2018
by Behrad Toghi, et al.

This paper evaluates the performance of the k-nearest neighbor (k-NN) classification algorithm on the MNIST dataset of handwritten digits. The baseline L2 (Euclidean) distance metric is compared against a modified metric that uses a sliding-window technique to avoid the performance degradation caused by slight spatial misalignments between images. Accuracy and the confusion matrix serve as performance indicators for comparing the baseline algorithm with the enhanced sliding-window method, and the results show a significant improvement from this simple modification.
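
The abstract describes the metric only at a high level; the short Python/NumPy sketch below illustrates one plausible reading, in which a training image is shifted by a few pixels in each direction and the smallest Euclidean distance over those shifts is fed to the k-NN vote. The function names, the max_shift parameter, and the wrap-around shift via np.roll are illustrative assumptions, not the authors' implementation.

import numpy as np

def sliding_window_l2(a, b, max_shift=2):
    """Minimum L2 distance between two 28x28 images over small spatial shifts."""
    a, b = a.reshape(28, 28), b.reshape(28, 28)
    best = np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # np.roll wraps pixels around the border; zero-padding is another option.
            shifted = np.roll(np.roll(b, dy, axis=0), dx, axis=1)
            best = min(best, np.linalg.norm(a - shifted))
    return best

def knn_predict(x, train_x, train_y, k=3):
    """Classify a query image by majority vote among its k nearest neighbors."""
    dists = np.array([sliding_window_l2(x, t) for t in train_x])
    nearest = np.asarray(train_y, dtype=int)[np.argsort(dists)[:k]]  # labels 0-9
    return np.bincount(nearest).argmax()

With max_shift=0 the sketch reduces to the plain L2 baseline; allowing small shifts can only lower the distance to a slightly translated copy of the same digit, which is the intuition behind the reported improvement.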

