CIFAR-10: KNN-based Ensemble of Classifiers

11/15/2016
by Yehya Abouelnaga, et al.

In this paper, we study the performance of different classifiers on the CIFAR-10 dataset and build an ensemble of classifiers to reach better performance. We show that, on some CIFAR-10 classes, K-Nearest Neighbors (KNN) and a Convolutional Neural Network (CNN) are mutually exclusive, and thus yield higher accuracy when combined. We reduce KNN overfitting using Principal Component Analysis (PCA) and ensemble it with a CNN to increase its accuracy. Our approach improves on our best CNN model, which reaches 93.33% accuracy on its own.
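As a rough illustration of the described pipeline (a sketch, not the authors' exact implementation), the code below projects flattened CIFAR-10 images onto their top principal components before fitting a KNN, then averages the KNN class probabilities with those of an already-trained CNN. The component count, the value of k, and the equal weighting are assumptions for illustration only.

# Sketch: PCA-reduced KNN ensembled with a CNN by averaging class probabilities.
# n_components, k, and knn_weight are illustrative choices, not the paper's settings.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def knn_cnn_ensemble(x_train, y_train, x_test, cnn_test_probs,
                     n_components=100, k=5, knn_weight=0.5):
    # Flatten images and project onto the top principal components
    # to reduce KNN overfitting in raw pixel space.
    pca = PCA(n_components=n_components)
    train_feats = pca.fit_transform(x_train.reshape(len(x_train), -1))
    test_feats = pca.transform(x_test.reshape(len(x_test), -1))

    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(train_feats, y_train)
    knn_test_probs = knn.predict_proba(test_feats)

    # cnn_test_probs is assumed to contain per-class probabilities from a
    # separately trained CNN, with columns in the same class order as knn.classes_.
    combined = knn_weight * knn_test_probs + (1.0 - knn_weight) * cnn_test_probs
    return combined.argmax(axis=1)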

