Stein Points

03/27/2018
by Wilson Ye Chen, et al.

An important task in computational statistics and machine learning is to approximate a posterior distribution p(x) with an empirical measure supported on a set of representative points {x_i}_{i=1}^n. This paper focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when n is small. To this end, we present "Stein Points". The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and p(x). Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.
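To make the greedy construction concrete, the sketch below selects points for a one-dimensional target by exhaustive grid search. The standard normal target, the inverse multiquadric base kernel, and the grid search standing in for numerical optimisation are all illustrative assumptions for this sketch, not the authors' implementation.

    import numpy as np

    # Illustrative target: standard normal, whose score is d/dx log p(x) = -x.
    # The target and the IMQ kernel below are assumptions for this sketch.
    def score(x):
        return -x

    def stein_kernel(x, y, c=1.0, beta=-0.5):
        # Langevin Stein kernel k_0 built from the inverse multiquadric base
        # kernel k(x, y) = (c^2 + (x - y)^2)^beta, in one dimension:
        # k_0 = d2k/dxdy + dk/dx * score(y) + dk/dy * score(x)
        #       + k * score(x) * score(y).
        r = x - y
        s = c**2 + r**2
        k = s**beta
        dk_dx = 2.0 * beta * r * s**(beta - 1.0)
        dk_dy = -dk_dx
        d2k = (-2.0 * beta * s**(beta - 1.0)
               - 4.0 * beta * (beta - 1.0) * r**2 * s**(beta - 2.0))
        return d2k + dk_dx * score(y) + dk_dy * score(x) + k * score(x) * score(y)

    def greedy_stein_points(n, grid):
        # Greedy rule: the squared KSD of {x_1, ..., x_m, x} differs from
        # that of {x_1, ..., x_m} only through k_0(x, x) + 2 * sum_i k_0(x_i, x),
        # so each iteration picks the candidate minimising that quantity.
        points = []
        for _ in range(n):
            obj = stein_kernel(grid, grid)  # diagonal term k_0(x, x)
            for xi in points:
                obj = obj + 2.0 * stein_kernel(xi, grid)
            points.append(grid[np.argmin(obj)])
        return np.array(points)

    grid = np.linspace(-4.0, 4.0, 801)   # crude stand-in for numerical optimisation
    print(greedy_stein_points(8, grid))  # points spread over the bulk of N(0, 1)

Because the squared discrepancy of an augmented point set differs from that of the current set only through terms involving the candidate, each greedy step needs just the Stein kernel evaluations shown above, which is what keeps the cost of adding a point modest.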

Related research

05/09/2019  Stein Point Markov Chain Monte Carlo
An important task in machine learning and statistics is the approximatio...

10/22/2020  Measure Transport with Kernel Stein Discrepancy
Measure transport underpins several recent algorithms for posterior appr...

01/19/2021  Performance analysis of greedy algorithms for minimising a Maximum Mean Discrepancy
We analyse the performance of several iterative algorithms for the quant...

03/18/2021  SPOT: A framework for selection of prototypes using optimal transport
In this work, we develop an optimal transport (OT) based framework to se...

10/11/2018  A Riemannian-Stein Kernel Method
This paper presents a theoretical analysis of numerical integration base...

11/22/2021  Approximate Bayesian Computation via Classification
Approximate Bayesian Computation (ABC) enables statistical inference in ...

03/04/2020  The empirical Christoffel function with applications in data analysis
We illustrate the potential applications in machine learning of the Chri...
