Provable Constrained Stochastic Convex Optimization with XOR-Projected Gradient Descent

03/22/2022
by   Fan Ding, et al.

Provably solving stochastic convex optimization problems with constraints is essential for many problems in science, business, and statistics. The recently proposed XOR-Stochastic Gradient Descent (XOR-SGD) provides a convergence-rate guarantee for the constraint-free version of the problem by leveraging XOR-Sampling. The task becomes harder, however, when additional equality and inequality constraints must be satisfied. Here we propose XOR-PGD, a novel algorithm based on Projected Gradient Descent (PGD) coupled with the XOR sampler, which is guaranteed to solve the constrained stochastic convex optimization problem at a linear convergence rate given a properly chosen step size. On both synthetic stochastic inventory management and real-world road network design problems, we show that the constraint-satisfaction rate of solutions optimized by XOR-PGD is more than 10% higher than that of competing approaches in a very large search space. XOR-PGD is also shown to be more accurate and efficient than both XOR-SGD and SGD coupled with MCMC-based samplers, and experiments with large dimensions demonstrate that it scales better with respect to the number of samples and processor cores.
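The core loop of projected gradient descent alternates a (stochastic) gradient step with a projection back onto the feasible set. The sketch below illustrates that loop on a toy box-constrained problem; it is a minimal illustration only, in which a simple noisy gradient estimate stands in for the XOR sampler used by the actual XOR-PGD algorithm (the function names `projected_gradient_descent`, `noisy_grad`, and `box_project` are ours, not from the paper).

```python
import numpy as np

def projected_gradient_descent(grad_estimate, project, x0,
                               step_size=0.05, n_iters=200):
    """Generic projected (stochastic) gradient descent loop.

    grad_estimate(x): returns a possibly noisy gradient estimate; in
        XOR-PGD this estimate would come from the XOR sampler, which
        is not sketched here.
    project(x): maps a point back onto the feasible set.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad_estimate(x)
        x = project(x - step_size * g)  # gradient step, then projection
    return x

# Toy problem: minimize E[||x - z||^2] with z ~ N(c, 0.01 I),
# subject to the box constraint 0 <= x <= 1.
rng = np.random.default_rng(0)
c = np.array([0.3, 1.5])  # unconstrained optimum; 1.5 lies outside the box

def noisy_grad(x):
    z = c + 0.1 * rng.standard_normal(2)  # one stochastic sample of z
    return 2.0 * (x - z)

def box_project(x):
    return np.clip(x, 0.0, 1.0)  # Euclidean projection onto [0, 1]^2

x_star = projected_gradient_descent(noisy_grad, box_project, x0=np.zeros(2))
```

Here the iterate converges near (0.3, 1.0): the first coordinate reaches the unconstrained optimum, while the second is clipped to the constraint boundary by the projection, which is exactly the behavior the projection step enforces in PGD.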


