To Boost or not to Boost: On the Limits of Boosted Neural Networks

07/28/2021
by   Sai Saketh Rambhatla, et al.

Boosting is a method for finding a highly accurate hypothesis by linearly combining many "weak" hypotheses, each of which may be only moderately accurate. Thus, boosting is a method for learning an ensemble of classifiers. While boosting has been shown to be very effective for decision trees, its impact on neural networks has not been extensively studied. We prove an important difference between sums of decision trees and sums of convolutional neural networks (CNNs): a sum of decision trees cannot be represented by a single decision tree with the same number of parameters, whereas a sum of CNNs can be represented by a single CNN. Next, using standard object recognition datasets, we verify experimentally the well-known result that a boosted ensemble of decision trees usually generalizes much better on testing data than a single decision tree with the same number of parameters. In contrast, using the same datasets and boosting algorithms, our experiments show the opposite to be true for neural networks (both CNNs and multilayer perceptrons (MLPs)). We find that a single neural network usually generalizes better than a boosted ensemble of smaller neural networks with the same total number of parameters.
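To make the "linear combination of weak hypotheses" concrete, here is a minimal sketch of AdaBoost with axis-aligned decision stumps as the weak learners. This is a generic illustration of the boosting scheme the abstract describes, not the paper's experimental setup; the toy one-dimensional dataset and all function names are illustrative.

```python
import numpy as np

def fit_stump(X, y, w):
    # Exhaustively pick the (feature, threshold, sign) stump
    # minimizing the weighted 0-1 error under weights w.
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] >= t, sign, -sign)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, t, sign)
    return best

def adaboost(X, y, rounds=10):
    # Labels y must be in {-1, +1}.
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial weights
    ensemble = []
    for _ in range(rounds):
        err, j, t, sign = fit_stump(X, y, w)
        err = max(err, 1e-10)        # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, j] >= t, sign, -sign)
        w *= np.exp(-alpha * y * pred)  # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, sign))
    return ensemble

def predict(ensemble, X):
    # Sign of the weighted sum of the weak hypotheses.
    score = np.zeros(len(X))
    for alpha, j, t, sign in ensemble:
        score += alpha * np.where(X[:, j] >= t, sign, -sign)
    return np.sign(score)
```

On an interval-labeled dataset (positive inside [0.4, 0.6], negative outside), no single stump can be perfect, but a three-round boosted combination of stumps classifies every point correctly — the ensemble is strictly more expressive than any one of its members, which is the asymmetry the paper's decision-tree result formalizes.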
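The representability claim for networks can be sketched with one-hidden-layer MLPs (the same block-diagonal construction extends to CNNs by concatenating channels): given two nets, concatenating their hidden units yields a single net, with parameter count equal to the sum, that computes exactly the sum of their outputs. This is an illustrative construction consistent with the abstract, not the paper's proof; all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(X, W1, b1, W2, b2):
    # One-hidden-layer ReLU network: X -> relu(X W1 + b1) W2 + b2.
    return np.maximum(X @ W1 + b1, 0) @ W2 + b2

d, h = 3, 4  # input dim, hidden units per small net
(W1a, b1a, W2a, b2a), (W1b, b1b, W2b, b2b) = [
    (rng.normal(size=(d, h)), rng.normal(size=h),
     rng.normal(size=(h, 1)), rng.normal(size=1))
    for _ in range(2)
]

# Merged net: stack the hidden units side by side. The hidden
# layers never interact, so the output is exactly the sum.
W1 = np.concatenate([W1a, W1b], axis=1)   # (d, 2h)
b1 = np.concatenate([b1a, b1b])           # (2h,)
W2 = np.concatenate([W2a, W2b], axis=0)   # (2h, 1)
b2 = b2a + b2b
```

For deeper nets the same trick works with block-diagonal weight matrices in the intermediate layers; for decision trees no analogous merge exists without growing the tree beyond the combined parameter budget, which is the asymmetry the paper proves.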


Related research

06/01/2011 · Popular Ensemble Methods: An Empirical Study
An ensemble consists of a set of individually trained classifiers (such ...

03/03/2016 · Decision Forests, Convolutional Networks and the Models in-Between
This paper investigates the connections between two state of the art cla...

08/18/2022 · A Scalable, Interpretable, Verifiable Differentiable Logic Gate Convolutional Neural Network Architecture From Truth Tables
We propose 𝒯ruth 𝒯able net (𝒯𝒯net), a novel Convolutional Neural Network...

01/02/2015 · An Empirical Study of the L2-Boost technique with Echo State Networks
A particular case of Recurrent Neural Network (RNN) was introduced at th...

06/21/2021 · Regularization is all you Need: Simple Neural Nets can Excel on Tabular Data
Tabular datasets are the last "unconquered castle" for deep learning, wi...

06/28/2014 · Exponentially Increasing the Capacity-to-Computation Ratio for Conditional Computation in Deep Learning
Many state-of-the-art results obtained with deep networks are achieved w...

03/08/2017 · Structural Data Recognition with Graph Model Boosting
This paper presents a novel method for structural data recognition using...
