The Attraction Indian Buffet Distribution

06/09/2021
by Richard L. Warr, et al.

We propose the attraction Indian buffet distribution (AIBD), a distribution for binary feature matrices influenced by pairwise similarity information. Binary feature matrices are used in Bayesian models to uncover latent variables (i.e., features) that explain observed data. The Indian buffet process (IBP) is a popular exchangeable prior distribution for latent feature matrices. In the presence of additional information, however, the exchangeability assumption is not reasonable or desirable. The AIBD can incorporate pairwise similarity information, yet it preserves many properties of the IBP, including the distribution of the total number of features. Thus, much of the interpretation and intuition that one has for the IBP directly carries over to the AIBD. A temperature parameter controls the degree to which the similarity information affects feature-sharing between observations. Unlike other nonexchangeable distributions for feature allocations, the probability mass function of the AIBD has a tractable normalizing constant, making posterior inference on hyperparameters straightforward using standard MCMC methods. A novel posterior sampling algorithm is proposed for the IBP and the AIBD. We demonstrate the feasibility of the AIBD as a prior distribution in feature allocation models and compare the performance of competing methods in simulations and an application.
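For readers unfamiliar with the IBP baseline that the AIBD extends, the exchangeable generative process can be simulated directly with its culinary metaphor: customer i takes each previously sampled dish k with probability m_k/i, then orders Poisson(alpha/i) new dishes. The sketch below (a hypothetical helper, `sample_ibp`, not the authors' AIBD sampler) illustrates that standard process; the AIBD would additionally weight the m_k/i term by pairwise similarities.

```python
import numpy as np

def sample_ibp(n, alpha, rng=None):
    """Simulate a binary feature matrix Z (n x K) from the Indian buffet
    process with mass parameter alpha.

    Customer i takes each existing dish k with probability m_k / i, where
    m_k is the number of earlier customers who took dish k, and then
    samples Poisson(alpha / i) brand-new dishes.
    """
    rng = np.random.default_rng(rng)
    dishes = []  # one list per dish, holding 0/1 for each customer seen so far
    for i in range(1, n + 1):
        # revisit every dish sampled so far
        for dish in dishes:
            m_k = sum(dish)
            dish.append(int(rng.random() < m_k / i))
        # new dishes for customer i: zeros for earlier customers, 1 for i
        for _ in range(rng.poisson(alpha / i)):
            dishes.append([0] * (i - 1) + [1])
    if not dishes:
        return np.zeros((n, 0), dtype=int)
    return np.array(dishes, dtype=int).T  # rows = customers, cols = features
```

Under this process the expected total number of features is alpha times the n-th harmonic number, which is the feature-count distribution the abstract says the AIBD preserves.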

Related research

- 02/13/2023: Inference of multiple high-dimensional networks with the Graphical Horseshoe prior. "We develop a novel full-Bayesian approach for multiple correlated precis..."
- 07/27/2022: Comparison and Bayesian Estimation of Feature Allocations. "Feature allocation models postulate a sampling distribution whose parame..."
- 01/25/2020: Particle-Gibbs Sampling For Bayesian Feature Allocation Models. "Bayesian feature allocation models are a popular tool for modelling data..."
- 09/26/2013: The Supervised IBP: Neighbourhood Preserving Infinite Latent Feature Models. "We propose a probabilistic model to infer supervised latent variables in..."
- 09/13/2022: Unsupervised representational learning with recognition-parametrised probabilistic models. "We introduce a new approach to probabilistic unsupervised learning based..."
- 03/30/2020: Non-exchangeable feature allocation models with sublinear growth of the feature sizes. "Feature allocation models are popular models used in different applicati..."
- 04/11/2023: Bayesian Analysis of Generalized Hierarchical Indian Buffet Processes for Within and Across Group Sharing of Latent Features. "Bayesian nonparametric hierarchical priors provide flexible models for s..."
