Binary output layer of feedforward neural networks for solving multi-class classification problems

01/22/2018
by Sibo Yang, et al.

Considered in this short note is the design of the output layer nodes of feedforward neural networks for solving multi-class classification problems with r (≥ 3) classes of samples. The common and conventional setting of the output layer, called the "one-to-one approach" in this paper, is as follows: the output layer contains r output nodes corresponding to the r classes, and for an input sample of the i-th class, the ideal output is 1 for the i-th output node and 0 for all the other output nodes. We propose in this paper a new "binary approach": suppose r ∈ (2^(q-1), 2^q] with q ≥ 2; then we let the output layer contain q output nodes, and let the ideal outputs for the r classes be designed in a binary manner. Numerical experiments carried out in this paper show that our binary approach does an equally good job as, but uses fewer output nodes than, the traditional one-to-one approach.
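To make the two output-layer designs concrete, the sketch below (not taken from the paper; the function names and the 5-class example are illustrative assumptions) constructs the ideal output vectors for the one-to-one approach and for a binary encoding of the kind described above, using q = ceil(log2(r)) output nodes so that r ∈ (2^(q-1), 2^q].

import math
import numpy as np

def one_hot_targets(r):
    """One-to-one approach: r output nodes; the target for class i is the i-th unit vector."""
    return np.eye(r)

def binary_targets(r):
    """Binary approach: q = ceil(log2(r)) output nodes; the target for class i
    (0-based) is the q-bit binary representation of i."""
    q = math.ceil(math.log2(r))  # the paper assumes r >= 3, hence q >= 2
    return np.array(
        [[(i >> b) & 1 for b in range(q - 1, -1, -1)] for i in range(r)],
        dtype=float,
    )

if __name__ == "__main__":
    r = 5                       # 2^2 < 5 <= 2^3, so q = 3
    print(one_hot_targets(r))   # 5 output nodes under the one-to-one approach
    print(binary_targets(r))    # only 3 output nodes under the binary approach

For r = 5, the one-to-one approach requires 5 output nodes while the binary encoding needs only q = 3, which is the node-count saving the abstract refers to.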
