Twin Neural Network Regression

12/29/2020
by Sebastian J. Wetzel, et al.

We introduce twin neural network (TNN) regression. This method predicts differences between the target values of two different data points rather than the targets themselves. The solution to a traditional regression problem is then obtained by averaging over an ensemble of all predicted differences between the target of an unseen data point and the targets of all training data points. Whereas ensembles are normally costly to produce, TNN regression intrinsically creates an ensemble of predictions twice the size of the training set while training only a single neural network. Since ensembles have been shown to be more accurate than single models, this property naturally transfers to TNN regression. We show that, across several data sets, TNNs compete with or yield more accurate predictions than other state-of-the-art methods. Furthermore, TNN regression is constrained by self-consistency conditions, and we find that the violation of these conditions provides an estimate of the prediction uncertainty.
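
The procedure described in the abstract can be sketched in a few lines of Python with Keras. The network architecture, the helper names (build_twin_model, make_pairs, tnn_predict), and the synthetic data below are illustrative assumptions, not the authors' implementation: the model is trained on pairs of training points to predict their target difference, and an unseen point is scored by adding each predicted difference back to the corresponding training target and averaging.

```python
# Minimal sketch of twin neural network (TNN) regression.
# Assumptions: a small MLP on concatenated input pairs; layer sizes,
# optimizer, and pair-sampling strategy are illustrative only.
import numpy as np
import tensorflow as tf
from tensorflow import keras

def build_twin_model(n_features):
    # The twin network takes a pair (x1, x2) and predicts y1 - y2.
    x1 = keras.Input(shape=(n_features,))
    x2 = keras.Input(shape=(n_features,))
    h = keras.layers.Concatenate()([x1, x2])
    for units in (64, 64):
        h = keras.layers.Dense(units, activation="relu")(h)
    diff = keras.layers.Dense(1)(h)
    model = keras.Model([x1, x2], diff)
    model.compile(optimizer="adam", loss="mse")
    return model

def make_pairs(X, y, n_pairs, rng):
    # Sample random pairs of training points; the label is the target difference.
    i = rng.integers(0, len(X), n_pairs)
    j = rng.integers(0, len(X), n_pairs)
    return (X[i], X[j]), y[i] - y[j]

def tnn_predict(model, X_train, y_train, X_new):
    # For each unseen point x, predict y(x) - y_i against every training
    # anchor x_i (in both orderings), add back y_i, and average.
    # This yields the intrinsic ensemble of twice the training-set size.
    preds = []
    for x in X_new:
        x_rep = np.repeat(x[None, :], len(X_train), axis=0)
        d_fwd = model.predict([x_rep, X_train], verbose=0).ravel()  # y(x) - y_i
        d_bwd = model.predict([X_train, x_rep], verbose=0).ravel()  # y_i - y(x)
        estimates = np.concatenate([y_train + d_fwd, y_train - d_bwd])
        preds.append(estimates.mean())
    return np.array(preds)

# Toy usage on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = X.sum(axis=1) + 0.05 * rng.normal(size=200)
model = build_twin_model(3)
(Xa, Xb), dy = make_pairs(X, y, n_pairs=20000, rng=rng)
model.fit([Xa, Xb], dy, epochs=20, batch_size=64, verbose=0)
y_hat = tnn_predict(model, X, y, X[:5])
```

Averaging over both pair orderings gives the ensemble of twice the training-set size mentioned in the abstract. The self-consistency conditions referred to there constrain the predicted differences (for example, differences around a closed loop of points should sum to zero), and the degree to which the per-anchor estimates disagree can be read as an uncertainty signal.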


Related research

06/11/2021 - Twin Neural Network Regression is a Semi-Supervised Regression Algorithm
Twin neural network regression (TNNR) is a semi-supervised regression al...

01/03/2023 - How to get the most out of Twinned Regression Methods
Twinned regression methods are designed to solve the dual problem to the...

02/28/2023 - Toward Robust Uncertainty Estimation with Random Activation Functions
Deep neural networks are in the limelight of machine learning with their...

05/18/2023 - Uncertainty Quantification in Deep Neural Networks through Statistical Inference on Latent Space
Uncertainty-quantification methods are applied to estimate the confidenc...

03/28/2014 - Systematic Ensemble Learning for Regression
The motivation of this work is to improve the performance of standard st...

07/29/2017 - KNN Ensembles for Tweedie Regression: The Power of Multiscale Neighborhoods
Very few K-nearest-neighbor (KNN) ensembles exist, despite the efficacy ...

02/12/2021 - Learning Deep Neural Networks under Agnostic Corrupted Supervision
Training deep neural models in the presence of corrupted supervision is ...
