
Scalable computation for Bayesian hierarchical models
This article concerns algorithms for learning Bayesian hierarchical models whose computational cost scales linearly with the number of observations and the number of parameters in the model. It focuses on crossed random effects and nested multilevel models, which are used ubiquitously in the applied sciences, and illustrates the methodology on two challenging real data analyses: predicting electoral results and real estate prices. The posterior dependence in both classes is sparse: in crossed random effects models it resembles a random graph, whereas in nested multilevel models it is tree-structured. For each class we develop a framework for scalable computation, based on collapsed Gibbs sampling and belief propagation respectively. We provide a number of negative (for crossed) and positive (for nested) results on the scalability, or lack thereof, of methods based on sparse linear algebra; these results are also relevant to Laplace approximation methods for such models. Our numerical experiments compare against off-the-shelf variational approximations and Hamiltonian Monte Carlo. Our theoretical results, although partial, are useful in suggesting interesting methodologies, and our numerics indicate that the resulting conclusions hold well beyond the scope of the underlying assumptions.
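To make the scalability claim concrete, here is a minimal illustrative sketch (not the paper's collapsed algorithm) of Gibbs sampling for a two-factor crossed random effects model y[i, j] = a[i] + b[j] + noise, with unit prior precision on each effect and known noise precision tau. Each sweep alternates the Gaussian full conditionals of the two effect blocks, at O(n) cost in the number of observations; the model, parameter values, and variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small fully crossed design: y[i, j] = a[i] + b[j] + noise.
I, J, tau = 20, 30, 4.0          # tau = noise precision, assumed known here
a_true = rng.normal(size=I)
b_true = rng.normal(size=J)
y = a_true[:, None] + b_true[None, :] + rng.normal(scale=tau**-0.5, size=(I, J))

# Plain Gibbs sampler: given one block, the other block's effects are
# conditionally independent Gaussians, so each block is updated jointly.
a, b = np.zeros(I), np.zeros(J)
a_draws = []
for it in range(2000):
    prec_a = 1.0 + tau * J       # prior precision 1 + J likelihood terms
    a = rng.normal((tau * (y - b[None, :]).sum(axis=1)) / prec_a, prec_a**-0.5)
    prec_b = 1.0 + tau * I
    b = rng.normal((tau * (y - a[:, None]).sum(axis=0)) / prec_b, prec_b**-0.5)
    if it >= 500:                # discard burn-in
        a_draws.append(a.copy())

a_post = np.mean(a_draws, axis=0)
print(np.corrcoef(a_post, a_true)[0, 1])  # posterior means track the truth
```

Note that each sweep touches every observation exactly once, so the per-iteration cost is O(IJ) = O(n). The paper's contribution concerns when such samplers also mix in O(1) iterations (hence total linear cost), which this toy example does not demonstrate.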