Bayesian Regularization on Function Spaces via Q-Exponential Process

10/14/2022
by Shiwei Lan, et al.

Regularization is one of the most important topics in optimization, statistics and machine learning. To get sparsity in estimating a parameter u ∈ ℝ^d, an ℓ_q penalty term, ‖u‖_q, is usually added to the objective function. What is the probabilistic distribution corresponding to such an ℓ_q penalty? What is the correct stochastic process corresponding to ‖u‖_q when we model functions u ∈ L^q? This is important for statistically modeling high-dimensional objects, e.g. images, with a penalty that preserves certain properties, e.g. edges in the image. In this work, we generalize the q-exponential distribution (with density proportional to exp(−|u|^q)) to a stochastic process named the Q-exponential (Q-EP) process, which corresponds to the L_q regularization of functions. The key step is to specify consistent multivariate q-exponential distributions by choosing from a large family of elliptic contour distributions. The work is closely related to the Besov process, which is usually defined through a series expansion. Q-EP can be regarded as a definition of the Besov process with an explicit probabilistic formulation and direct control on the correlation length. From the Bayesian perspective, Q-EP provides a flexible prior on functions with a sharper penalty (q < 2) than the commonly used Gaussian process (GP). We compare GP, Besov and Q-EP in modeling time series and reconstructing images, and demonstrate the advantage of the proposed methodology.

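The key step named in the abstract, choosing a consistent multivariate q-exponential distribution from the family of elliptic contour distributions, can be illustrated with the generic radial-times-direction representation of an elliptic distribution, u = μ + R·L·S with C = LLᵀ and S uniform on the unit sphere. The sketch below is an assumption-laden illustration rather than the paper's exact construction: the radial law R^q ~ χ²_d is assumed here only so that q = 2 recovers an ordinary multivariate Gaussian (finite-dimensional GP) draw, and the kernel, length-scale and function `sample_q_exponential` are hypothetical choices made for the example.

```python
# Minimal sketch (not the paper's exact specification) of drawing from a
# multivariate q-exponential distribution via the elliptic-contour
# (radius x direction) representation u = mu + R * L @ S, where C = L L^T.
# Assumed radial law: R**q ~ chi-squared(d), so q = 2 reduces to N(mu, C).
import numpy as np

def sample_q_exponential(mu, C, q, size=1, rng=None):
    """Draw `size` samples from an elliptic-contour q-exponential sketch."""
    rng = np.random.default_rng(rng)
    d = mu.shape[0]
    L = np.linalg.cholesky(C)                      # C = L L^T
    # Direction: uniform on the unit sphere S^{d-1}
    S = rng.standard_normal((size, d))
    S /= np.linalg.norm(S, axis=1, keepdims=True)
    # Radius: R^q ~ chi-squared(d)  (assumption; q = 2 gives a Gaussian draw)
    R = rng.chisquare(d, size=size) ** (1.0 / q)
    return mu + R[:, None] * S @ L.T               # row-wise R_i * (L @ S_i)

if __name__ == "__main__":
    # Example: 5 function draws on a grid with a squared-exponential kernel
    d = 100
    t = np.linspace(0, 1, d)
    C = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1 ** 2)
    C += 1e-8 * np.eye(d)                          # jitter for Cholesky
    draws = sample_q_exponential(np.zeros(d), C, q=1.0, size=5)  # q < 2: sharper penalty
    print(draws.shape)                             # (5, 100)
```

With q = 1 the draws concentrate mass closer to zero than a GP with the same covariance, which is the kind of sharper, sparsity-inducing behavior the abstract attributes to Q-EP priors with q < 2.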