
Bayesian hierarchical stacking

by Yuling Yao, et al.

Stacking is a widely used model averaging technique that yields asymptotically optimal prediction among all linear averages. We show that stacking is most effective when model predictive performance is heterogeneous across inputs, so that the stacked mixture can be further improved with a hierarchical model. With input-varying yet partially pooled model weights, hierarchical stacking improves both average and conditional predictions. Our Bayesian formulation includes constant-weight (complete-pooling) stacking as a special case. We generalize the approach to incorporate discrete and continuous inputs, other structured priors, and time-series and longitudinal data, and demonstrate it on several applied problems.
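To make the complete-pooling special case concrete, here is a minimal sketch of stacking as described in the abstract: given held-out log predictive densities from K models, find the simplex-constrained weight vector that maximizes the log score of the linear mixture. The function name and the softmax parameterization of the simplex are illustrative choices, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def stacking_weights(lpd):
    """Complete-pooling stacking weights.

    lpd : array of shape (n_points, n_models), held-out (e.g. LOO)
          log predictive densities, one column per candidate model.
    Returns a weight vector on the simplex maximizing the mixture log score.
    """
    n, K = lpd.shape

    def to_simplex(z):
        # Softmax maps unconstrained z to nonnegative weights summing to 1.
        w = np.exp(z - z.max())
        return w / w.sum()

    def neg_log_score(z):
        w = to_simplex(z)
        # Log density of the mixture sum_k w_k p_k(y_i), summed over points.
        return -np.sum(np.log(np.exp(lpd) @ w))

    res = minimize(neg_log_score, np.zeros(K), method="Nelder-Mead")
    return to_simplex(res.x)
```

Hierarchical stacking replaces the single weight vector with input-varying weights (e.g. a log-linear model in features of x) that are partially pooled toward the complete-pooling solution above via a hierarchical prior.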


Hierarchical Gaussian Process Priors for Bayesian Neural Network Weights

Probabilistic neural networks are typically modeled with independent wei...

Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition

Combining predictions from different models is a central problem in Baye...

Hierarchical Graph Pooling is an Effective Citywide Traffic Condition Prediction Model

Accurate traffic conditions prediction provides a solid foundation for v...

Local Prediction Pools

We propose local prediction pools as a method for combining the predicti...

Jackknife Partially Linear Model Averaging for the Conditional Quantile Prediction

Estimating the conditional quantile of the interested variable with resp...

Combining predictions from linear models when training and test inputs differ

Methods for combining predictions from different models in a supervised ...

Hierarchical spline for time series forecasting: An application to Naval ship engine failure rate

Predicting equipment failure is important because it could improve avail...

Code Repositories


code and demo for hierarchical stacking paper

view repo