A Multiple Classifier Approach for Concatenate-Designed Neural Networks

01/14/2021
by Ka-Hou Chan, et al.

This article introduces a multiple classifier method that improves the performance of concatenate-designed neural networks, such as ResNet and DenseNet, by alleviating the pressure on the final classifier. We present the design of the classifiers, which collect the features produced between the network sets, and describe their constituent layers and activation function, used to calculate a classification score for each classifier. We use L2 normalization instead of Softmax normalization to obtain the classifier scores, and we identify conditions that enhance convergence. As a result, the proposed classifiers significantly improve accuracy in the experimental cases, showing that the method not only outperforms the original models but also converges faster. Moreover, our classifiers are general and can be applied to any classification-oriented concatenate-designed network model.
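The two ideas in the abstract — auxiliary classifiers attached to the features between network sets, and L2 normalization of each classifier's output in place of Softmax — can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the feature dimensions, the linear classifier heads, and the averaging of per-classifier scores are all illustrative assumptions.

```python
import numpy as np

def l2_score(logits, eps=1e-12):
    # Score a classifier's output by L2 normalization (unit-length vector)
    # rather than Softmax, as the abstract describes.
    norm = np.linalg.norm(logits, axis=-1, keepdims=True)
    return logits / np.maximum(norm, eps)

def softmax(logits):
    # Conventional Softmax normalization, shown for comparison.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy setup: features collected after three network "sets", each fed to
# its own linear classifier head (dimensions and heads are hypothetical).
rng = np.random.default_rng(0)
dims, n_classes = (64, 128, 256), 10
features = [rng.standard_normal((1, d)) for d in dims]
heads = [rng.standard_normal((d, n_classes)) * 0.01 for d in dims]

scores = [l2_score(f @ w) for f, w in zip(features, heads)]
final = np.mean(scores, axis=0)  # combining by averaging is an assumption
```

Unlike Softmax, whose outputs shrink toward a uniform distribution when logits are small, the L2-normalized score vector always has unit length, so each auxiliary classifier contributes a comparably scaled signal regardless of where it sits in the network.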



Related research

03/22/2020 · TanhExp: A Smooth Activation Function with High Convergence Speed for Lightweight Neural Networks
Lightweight or mobile neural networks used for real-time computer vision...

10/09/2017 · Does Normalization Methods Play a Role for Hyperspectral Image Classification?
For Hyperspectral image (HSI) datasets, each class have their salient fe...

11/21/2019 · MSD: Multi-Self-Distillation Learning via Multi-classifiers within Deep Neural Networks
As the development of neural networks, more and more deep neural network...

07/13/2017 · Be Careful What You Backpropagate: A Case For Linear Output Activations & Gradient Boosting
In this work, we show that saturating output activation functions, such ...

05/29/2018 · On Robust Trimming of Bayesian Network Classifiers
This paper considers the problem of removing costly features from a Baye...

10/28/2019 · Generative Well-intentioned Networks
We propose Generative Well-intentioned Networks (GWINs), a novel framewo...

06/19/2020 · Analyzing the Real-World Applicability of DGA Classifiers
Separating benign domains from domains generated by DGAs with the help o...