Ensemble Learning using Error Correcting Output Codes: New Classification Error Bounds

09/18/2021
by Hieu D. Nguyen, et al.

New bounds on classification error rates for the error-correcting output code (ECOC) approach in machine learning are presented. These bounds decay exponentially with respect to codeword length and theoretically validate the effectiveness of the ECOC approach. Bounds are derived for two different models: the first assumes that all base classifiers are independent, and the second assumes that all base classifiers are mutually correlated up to first order. Moreover, we perform ECOC classification on six datasets and compare the resulting error rates with our bounds to experimentally validate our work and show the effect of correlation on classification accuracy.
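For readers unfamiliar with the ECOC approach, the following minimal sketch (not the paper's code; the code matrix and helper names are illustrative) shows the core idea: each class is assigned a binary codeword, one base classifier is trained per bit, and a test point is assigned to the class whose codeword is nearest in Hamming distance to the vector of base-classifier outputs. Error-correcting redundancy lets the ensemble tolerate some misclassified bits.

```python
def ecoc_decode(code_matrix, classifier_outputs):
    """Return the index of the codeword in code_matrix closest in
    Hamming distance to the base classifiers' 0/1 output vector."""
    def hamming(codeword):
        # Count the bit positions where the codeword disagrees
        # with the base classifiers' outputs.
        return sum(a != b for a, b in zip(codeword, classifier_outputs))
    return min(range(len(code_matrix)), key=lambda i: hamming(code_matrix[i]))

# Illustrative code matrix: 4 classes encoded with 7-bit codewords,
# pairwise Hamming distance >= 4, so any single bit error is corrected.
codes = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

# Suppose the 7 base classifiers output class 2's codeword with one
# bit flipped (position 3); decoding still recovers class 2.
outputs = [1, 0, 1, 0, 0, 1, 0]
print(ecoc_decode(codes, outputs))  # -> 2
```

Because one flipped bit leaves the output vector closer to the true codeword than to any other, the ensemble's error rate depends on how many base classifiers err simultaneously, which is where the independence and first-order correlation assumptions in the paper's two models enter.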


