Sampling in Constrained Domains with Orthogonal-Space Variational Gradient Descent

10/12/2022
by Ruqi Zhang, et al.

Sampling methods, as important inference and learning techniques, are typically designed for unconstrained domains. However, constraints are ubiquitous in machine learning problems, such as constraints on safety, fairness, robustness, and many other properties that must be satisfied before sampling results can be applied in real-life applications. Enforcing these constraints often leads to implicitly-defined manifolds, making efficient sampling with constraints very challenging. In this paper, we propose a new variational framework with a specially designed orthogonal-space gradient flow (O-Gradient) for sampling on a manifold 𝒢_0 defined by general equality constraints. O-Gradient decomposes the gradient into two parts: one decreases the distance to 𝒢_0 and the other decreases the KL divergence in the orthogonal space. While most existing manifold sampling methods require initialization on 𝒢_0, O-Gradient does not require such prior knowledge. We prove that O-Gradient converges to the target constrained distribution at rate O(1/T), where T is the number of iterations, under mild conditions. Our proof relies on a new Stein characterization of conditional measures, which could be of independent interest. We implement O-Gradient through both Langevin dynamics and Stein variational gradient descent and demonstrate its effectiveness in various experiments, including on Bayesian deep neural networks.
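The abstract describes the decomposition only at a high level, so the snippet below is a minimal, hypothetical sketch of what one orthogonal-space Langevin update could look like for a single scalar constraint g(x) = 0. It is not the paper's exact O-Langevin dynamics: the function name o_gradient_langevin_step, the step size, the coefficient beta, and the specific form of the drift toward the manifold (−β g(x)∇g(x)/‖∇g(x)‖²) are illustrative assumptions. The two drift terms mirror the decomposition described above: one pushes the sample toward 𝒢_0, the other moves it along the score projected onto the space orthogonal to ∇g, with noise injected only in that orthogonal space.

```python
import numpy as np

def o_gradient_langevin_step(x, grad_log_p, g, grad_g, step=1e-3, beta=1.0, rng=None):
    """One Euler-Maruyama step of a Langevin-style sampler built on an
    orthogonal-space gradient decomposition for a single equality constraint
    g(x) = 0 (illustrative sketch, not the paper's exact dynamics).

    The drift has two parts:
      1) a term that decreases the distance to the constraint set {g = 0},
      2) the score projected onto the space orthogonal to grad g, which
         drives the KL divergence down along the constraint level sets.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = grad_g(x)                                 # normal direction of the level set at x
    n_sq = n @ n + 1e-12                          # guard against a vanishing gradient
    P = np.eye(x.size) - np.outer(n, n) / n_sq    # projection onto the orthogonal space

    drift_constraint = -beta * g(x) * n / n_sq    # pushes g(x) toward 0
    drift_kl = P @ grad_log_p(x)                  # score restricted to the orthogonal space

    noise = P @ rng.standard_normal(x.size)       # diffuse only along the level set
    return x + step * (drift_constraint + drift_kl) + np.sqrt(2.0 * step) * noise


# Example (illustrative): a standard 2D Gaussian constrained to the unit
# circle g(x) = |x|^2 - 1 = 0. The chain can start off the manifold, since
# the constraint drift steers it toward 𝒢_0 over the course of sampling.
x = np.array([2.0, 0.5])
for _ in range(20_000):
    x = o_gradient_langevin_step(
        x,
        grad_log_p=lambda z: -z,        # score of N(0, I)
        g=lambda z: z @ z - 1.0,
        grad_g=lambda z: 2.0 * z,
        step=1e-3,
    )
```

Under these assumptions, the design choice is that the constraint-decreasing drift and the KL-decreasing drift act in complementary subspaces, so neither interferes with the other; the same decomposition could in principle be plugged into an SVGD-style particle update instead of Langevin noise.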


