Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction

Multi-modal regression is important for forecasting nonstationary processes or processes with a complex mixture of distributions. It can be tackled with multiple-hypotheses frameworks, but combining the hypotheses efficiently within a learning model is difficult. A Structured Radial Basis Function Network is presented as an ensemble of multiple-hypotheses predictors for regression problems. The predictors are regression models of any type that can form centroidal Voronoi tessellations, which are a function of their losses during training. It is proved that this structured model can efficiently interpolate this tessellation and approximate the multiple-hypotheses target distribution, and that this is equivalent to interpolating the meta-loss of the predictors, the loss being a zero set of the interpolation error. The model admits a fixed-point iteration algorithm between the predictors and the centers of the basis functions. Diversity in learning can be controlled parametrically by truncating the tessellation formation with the losses of the individual predictors. A closed-form least-squares solution is presented, which, to the authors' knowledge, is the fastest solution in the literature for multiple hypotheses and structured predictions. Superior generalization performance and computational efficiency are achieved using only two-layer neural networks as predictors, with control of diversity as a key component of success. A gradient-descent approach is also introduced that is loss-agnostic with respect to the predictors. The expected loss of the structured model with Gaussian basis functions is computed, showing that correlation between predictors is not an appropriate tool for diversification. Experiments show that the model outperforms the top competitors in the literature.
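The closed-form solution mentioned in the abstract rests on a standard property of RBF networks: with fixed Gaussian centers, the output weights solve a linear least-squares problem. The sketch below illustrates only this generic mechanism, not the paper's structured model; the function names, the ridge term, and the `gamma` width parameter are illustrative assumptions.

```python
import numpy as np

def gaussian_design(X, centers, gamma):
    """Gaussian RBF design matrix: phi[i, j] = exp(-gamma * ||x_i - c_j||^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_rbf_least_squares(X, y, centers, gamma=1.0, ridge=1e-8):
    """Closed-form output weights via (ridge-regularized) least squares."""
    Phi = gaussian_design(X, centers, gamma)
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict_rbf(X, centers, w, gamma=1.0):
    """Evaluate the fitted RBF expansion at new inputs."""
    return gaussian_design(X, centers, gamma) @ w
```

Because the weights come from one linear solve rather than an iterative optimization, this step is cheap, which is presumably what makes the paper's overall solution fast once the centers are fixed by the tessellation.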

