
Eliminating Multicollinearity Issues in Neural Network Ensembles: Incremental, Negatively Correlated, Optimal Convex Blending

by   Pola Lydia Lagari, et al.

Given a dataset of features and targets, we introduce an incremental algorithm that constructs an aggregate regressor from an ensemble of neural networks. It is well known that ensemble methods suffer from multicollinearity, a manifestation of redundancy that arises mainly from the shared training dataset. In the present incremental approach, at each stage we optimally blend the aggregate regressor with a newly trained neural network under a convexity constraint which, if necessary, induces negative correlations. Under this framework, collinearity issues do not arise at all, rendering the method both accurate and robust.
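The incremental blending step described above can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: it assumes each stage solves a one-dimensional least-squares problem for the blending weight alpha (which has a closed form) and then clips alpha to [0, 1] to enforce the convexity constraint. The function names and the use of precomputed member predictions are assumptions for illustration.

```python
import numpy as np

def optimal_convex_blend(y, f_agg, f_new):
    """Blend aggregate predictions f_agg with new-member predictions f_new.

    Solves min_alpha ||y - (alpha * f_agg + (1 - alpha) * f_new)||^2,
    which has the closed form alpha = (r . d) / (d . d) with
    d = f_agg - f_new and r = y - f_new, then clips alpha to [0, 1]
    to enforce the convexity constraint.
    """
    d = f_agg - f_new
    r = y - f_new
    denom = d @ d
    alpha = (r @ d) / denom if denom > 0 else 0.0
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return alpha, alpha * f_agg + (1.0 - alpha) * f_new

def incremental_ensemble(y, member_preds):
    """Incrementally fold each ensemble member's predictions into the
    aggregate; returns the implied convex weights and final predictions."""
    agg = member_preds[0]
    weights = [1.0]
    for f_new in member_preds[1:]:
        alpha, agg = optimal_convex_blend(y, agg, f_new)
        # Rescale earlier weights by alpha; the new member gets 1 - alpha.
        weights = [w * alpha for w in weights] + [1.0 - alpha]
    return np.array(weights), agg

# Usage sketch with synthetic member predictions standing in for
# trained neural networks:
rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = [y + rng.normal(scale=s, size=200) for s in (1.0, 0.5, 0.8)]
weights, agg = incremental_ensemble(y, preds)
```

Because alpha = 0 and alpha = 1 are both feasible at every stage, the blended fit on the data used to choose alpha is never worse than either input, so the final aggregate's error is bounded by the best single member's error on that data.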




Neural network ensembles: Evaluation of aggregation algorithms

Ensembles of artificial neural networks show improved generalization cap...

Deep Incremental Boosting

This paper introduces Deep Incremental Boosting, a new technique derived...

Collaborative Method for Incremental Learning on Classification and Generation

Although well-trained deep neural networks have shown remarkable perform...

Incremental inference of collective graphical models

We consider incremental inference problems from aggregate data for colle...

Incremental Deep Neural Network Learning using Classification Confidence Thresholding

Most modern neural networks for classification fail to take into account...

DeGAN : Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier

In this era of digital information explosion, an abundance of data from ...

Efficient Facial Feature Learning with Wide Ensemble-based Convolutional Neural Networks

Ensemble methods, traditionally built with independently trained de-corr...