Stochastic Compositional Gradient Descent under Compositional Constraints

12/17/2020
by Srujan Teja Thomdapu, et al.

This work studies constrained stochastic optimization problems where the objective and constraint functions are convex and expressed as compositions of stochastic functions. The problem arises in fair classification, fair regression, and the design of queuing systems. Of particular interest is the large-scale setting where an oracle provides the stochastic gradients of the constituent functions, and the goal is to solve the problem with a minimal number of calls to the oracle. Owing to the compositional form, the stochastic gradients provided by the oracle do not yield unbiased estimates of the objective or constraint gradients. Instead, we construct approximate gradients by tracking the inner function evaluations, resulting in a quasi-gradient saddle-point algorithm. We prove that the proposed algorithm is guaranteed to find an optimal and feasible solution almost surely. We further establish that the proposed algorithm requires 𝒪(1/ϵ^4) data samples to obtain an ϵ-approximate optimal point while also ensuring zero constraint violation. This rate matches the sample complexity of the stochastic compositional gradient descent method for unconstrained problems and improves upon the best-known sample complexity results for constrained settings. The efficacy of the proposed algorithm is tested on both fair classification and fair regression problems, and the numerical results show that it outperforms state-of-the-art algorithms in terms of convergence rate.
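The core idea described above, tracking the inner expectation with a running average and feeding the tracked value into a primal-dual (saddle-point) update, can be sketched in a few lines. The toy problem below (a quadratic outer objective with a norm-ball constraint on the inner expectation), the step-size schedules, and the names such as `g_sample` and `h_and_grad` are illustrative assumptions, not the paper's actual algorithm parameters or experimental setup.

```python
import numpy as np

# Minimal sketch of a compositional quasi-gradient saddle-point update,
# assuming the toy problem
#     min_x  f(E[g(x; xi)])   s.t.   h(E[g(x; xi)]) <= 0,
# where only noisy evaluations/Jacobians of the inner map g are available.
# Problem instance, step sizes, and function names are illustrative only.

rng = np.random.default_rng(0)
d = 5                         # decision dimension
A = rng.standard_normal((d, d))
b = rng.standard_normal(d)

def g_sample(x):
    """Noisy inner map g(x; xi) = (A + noise) x and its Jacobian."""
    noise = 0.1 * rng.standard_normal((d, d))
    return (A + noise) @ x, (A + noise)

def f_grad(y):
    """Outer objective f(y) = 0.5 * ||y - b||^2, so grad f(y) = y - b."""
    return y - b

def h_and_grad(y):
    """Outer constraint h(y) = ||y||^2 - 1 <= 0 and its gradient."""
    return y @ y - 1.0, 2.0 * y

x = np.zeros(d)               # primal iterate
lam = 0.0                     # dual variable, kept nonnegative
y_track = np.zeros(d)         # running estimate of E[g(x; xi)]

T = 20000
for t in range(1, T + 1):
    alpha = 0.5 / np.sqrt(t)  # primal/dual step size
    beta = 1.0 / t**0.75      # tracking step size (decays more slowly)
    gx, Jg = g_sample(x)

    # Track the inner expectation instead of plugging in the noisy sample.
    y_track = (1 - beta) * y_track + beta * gx

    # Quasi-gradients of objective and constraint via the chain rule,
    # evaluated at the tracked inner value.
    grad_obj = Jg.T @ f_grad(y_track)
    h_val, h_grad = h_and_grad(y_track)
    grad_con = Jg.T @ h_grad

    # Saddle-point step: primal descent on the Lagrangian, dual ascent
    # with projection onto the nonnegative orthant.
    x = x - alpha * (grad_obj + lam * grad_con)
    lam = max(0.0, lam + alpha * h_val)

print("final x:", np.round(x, 3),
      "constraint value:", round(h_and_grad(A @ x)[0], 3))
```

The tracking step size is chosen to decay more slowly than the primal step size, a common two-time-scale choice in compositional stochastic methods; the exact schedules used by the paper may differ.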

Related research

07/20/2019  Optimal Design of Queuing Systems via Compositional Stochastic Programming
Well-designed queuing systems form the backbone of modern communications...

12/31/2019  Stochastic Recursive Variance Reduction for Efficient Smooth Non-Convex Compositional Optimization
Stochastic compositional optimization arises in many important machine l...

09/04/2018  Compositional Stochastic Average Gradient for Machine Learning and Related Applications
Many machine learning, statistical inference, and portfolio optimization...

10/24/2019  Katyusha Acceleration for Convex Finite-Sum Compositional Optimization
Structured problems arise in many applications. To solve these problems,...

08/13/2020  Conservative Stochastic Optimization with Expectation Constraints
This paper considers stochastic convex optimization problems where the o...

02/07/2018  Improved Oracle Complexity of Variance Reduced Methods for Nonsmooth Convex Stochastic Composition Optimization
We consider the nonsmooth convex composition optimization problem where ...

07/11/2023  Stochastic Nested Compositional Bi-level Optimization for Robust Feature Learning
We develop and analyze stochastic approximation algorithms for solving n...
