Local Kernels that Approximate Bayesian Regularization and Proximal Operators
In this work, we broadly connect kernel-based filtering (e.g., approaches such as the bilateral filter and nonlocal means, but also many others) with general variational formulations of Bayesian regularized least squares and the related concept of proximal operators. These variational/Bayesian/proximal formulations often result in optimization problems that lack closed-form solutions and therefore typically require global iterative solvers. Our main contribution here is to establish how the solution of the resulting global optimization problems can be approximated using locally adaptive filters with specific kernels. Our results are valid for small regularization strength, but the approach is powerful enough to be useful for a wide range of applications because we expose how to derive a "kernelized" solution to these problems that approximates the global solution in one shot, using only local operations. As a side benefit in the reverse direction, given a local data-adaptive filter constructed with a particular choice of kernel, we enable the interpretation of such filters in the variational/Bayesian/proximal framework.
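As a minimal sketch of this one-shot idea (not the paper's exact derivation), the Python snippet below compares the exact minimizer of a graph-Laplacian-regularized least-squares problem, which requires a global linear solve, with a single application of a data-adaptive kernel filter that is accurate to first order for small regularization strength. The kernel construction, function names, and parameter values are illustrative assumptions, not the authors' specific formulation.

```python
import numpy as np

def affinity_matrix(y, sigma=0.1):
    """NLM/bilateral-style affinities from intensity differences of a 1-D signal (illustrative)."""
    diff = y[:, None] - y[None, :]
    return np.exp(-diff**2 / (2.0 * sigma**2))

def normalized_laplacian(K):
    """Symmetric normalized graph Laplacian L = I - D^{-1/2} K D^{-1/2}."""
    d = K.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.eye(len(d)) - D_inv_sqrt @ K @ D_inv_sqrt

def global_solution(y, L, lam):
    """Exact minimizer of ||x - y||^2 + lam * x^T L x: requires a global solve."""
    return np.linalg.solve(np.eye(len(y)) + lam * L, y)

def one_shot_local_filter(y, L, lam):
    """First-order kernel approximation (I + lam*L)^{-1} y ~= (I - lam*L) y,
    i.e., one application of a locally adaptive filter built from the data."""
    return y - lam * (L @ y)

# Toy comparison: for small lam the two solutions agree to O(lam^2).
rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 2 * np.pi, 64)) + 0.05 * rng.standard_normal(64)
L = normalized_laplacian(affinity_matrix(y))
lam = 0.1  # small regularization strength, where the approximation is valid
print("max |global - one-shot|:",
      np.abs(global_solution(y, L, lam) - one_shot_local_filter(y, L, lam)).max())
```

In this sketch the filter weights are themselves computed from the noisy data, so the one-shot update is a locally adaptive (bilateral/NLM-like) filtering pass rather than a fixed linear smoother, which is the sense in which it stands in for the global regularized solution.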