Bayesian Regularization on Function Spaces via Q-Exponential Process

10/14/2022
by Shiwei Lan, et al.

Regularization is one of the most important topics in optimization, statistics, and machine learning. To obtain sparsity in estimating a parameter u ∈ ℝ^d, an ℓ_q penalty term, ‖u‖_q, is usually added to the objective function. What is the probability distribution corresponding to such an ℓ_q penalty? What is the correct stochastic process corresponding to ‖u‖_q when we model functions u ∈ L^q? This matters for statistically modeling high-dimensional objects, e.g. images, with a penalty that preserves certain properties, e.g. edges in the image. In this work, we generalize the q-exponential distribution (with density proportional to exp(−|u|^q)) to a stochastic process, named the Q-exponential (Q-EP) process, which corresponds to the L^q regularization of functions. The key step is to specify consistent multivariate q-exponential distributions by choosing from a large family of elliptically contoured distributions. The work is closely related to the Besov process, which is usually defined through a series expansion. Q-EP can be regarded as a definition of the Besov process with an explicit probabilistic formulation and direct control of the correlation length. From the Bayesian perspective, Q-EP provides a flexible prior on functions with a sharper penalty (q < 2) than the commonly used Gaussian process (GP). We compare GP, Besov, and Q-EP in modeling time series and reconstructing images, and demonstrate the advantage of the proposed methodology.
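To make the elliptic-contour construction concrete, here is a minimal sketch of drawing finite-dimensional marginals of a multivariate q-exponential distribution. It assumes the standard stochastic representation of an elliptically contoured distribution, u = μ + R L S, with L the Cholesky factor of a kernel matrix C and S uniform on the unit sphere; the radial law R^q ~ χ²_d used below is an assumption chosen so that q = 2 recovers the usual Gaussian draw, and may differ from the paper's exact parameterization.

```python
import numpy as np

def sample_q_exponential(mu, C, q, size=1, rng=None):
    """Sample from a multivariate q-exponential distribution q-ED(mu, C).

    Uses the elliptically contoured representation u = mu + R * L @ S:
      - S is uniform on the unit sphere in R^d,
      - L is the Cholesky factor of the kernel matrix C,
      - R^q ~ chi-squared with d degrees of freedom (an assumed radial law;
        at q = 2 this reduces to the standard Gaussian construction).
    """
    rng = np.random.default_rng(rng)
    d = len(mu)
    L = np.linalg.cholesky(C)
    # Directions: normalized Gaussian vectors are uniform on the sphere.
    Z = rng.standard_normal((size, d))
    S = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    # Radial component: R = (chi^2_d)^(1/q); q < 2 sharpens the density peak.
    R = rng.chisquare(df=d, size=(size, 1)) ** (1.0 / q)
    return mu + R * (S @ L.T)

# Example: Q-EP-style prior draws on a 1D grid with a squared-exponential kernel.
t = np.linspace(0, 1, 200)
C = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1**2) + 1e-8 * np.eye(200)
u = sample_q_exponential(np.zeros(200), C, q=1.0, size=5)  # q < 2: sparsity-promoting
```

With q < 2 the draws show the sharper, sparsity-promoting behavior described above, while setting q = 2 reproduces draws from a GP with kernel C.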


