A Data Efficient and Feasible Level Set Method for Stochastic Convex Optimization with Expectation Constraints

08/07/2019
by   Qihang Lin, et al.

Stochastic convex optimization problems with expectation constraints (SOECs) are encountered in statistics and machine learning, business, and engineering. In data-rich environments, the SOEC objective and constraints contain expectations defined with respect to large datasets. Efficient algorithms for solving such SOECs therefore need to limit the fraction of data points they use, which we refer to as algorithmic data complexity. Recent stochastic first-order methods exhibit low data complexity when handling SOECs but guarantee near-feasibility and near-optimality only at convergence. Because their theoretical convergence criteria are highly conservative, these methods are often terminated heuristically and may then return highly infeasible solutions. This issue limits the use of first-order methods in applications where the SOEC constraints encode implementation requirements. We design a stochastic feasible level-set method (SFLS) for SOECs that has low data complexity and emphasizes feasibility before convergence. Specifically, our level-set method solves a root-finding problem by calling a novel first-order oracle that computes a stochastic upper bound on the level-set function by extending mirror descent and online validation techniques. We establish that SFLS maintains a high-probability feasible solution at each root-finding iteration and exhibits favorable iteration complexity compared to state-of-the-art deterministic feasible level-set and stochastic subgradient methods. Numerical experiments on three diverse applications validate the low data complexity of SFLS relative to the former approach and highlight how SFLS finds feasible solutions with small optimality gaps significantly faster than the latter method.
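The root-finding idea behind feasible level-set methods can be illustrated on a toy problem. The sketch below is an assumption-laden simplification, not the authors' SFLS: it uses a deterministic one-dimensional example (f(x) = x², constraint x ≥ 1, neither from the paper), plain subgradient descent in place of the paper's stochastic mirror-descent oracle, and bisection on the level parameter r applied to H(r) = min_x max{f(x) − r, g(x)}, whose root is the optimal value f*. As long as r stays at or above f*, the inner minimizer satisfies the constraint, which is the sense in which level-set methods maintain feasibility along the way.

```python
# Hedged sketch of a feasible level-set method via bisection on the
# level-set function H(r) = min_x max(f(x) - r, g(x)).
# Toy problem (not from the paper): min x^2  s.t.  1 - x <= 0, so f* = 1, x* = 1.

def f(x):  return x * x        # objective
def g(x):  return 1.0 - x      # constraint g(x) <= 0
def df(x): return 2.0 * x      # objective gradient
def dg(x): return -1.0         # constraint gradient

def H(r, x0=0.0, steps=4000, lr=0.01):
    """Approximate H(r) = min_x max(f(x) - r, g(x)) by subgradient descent,
    a deterministic stand-in for the paper's stochastic first-order oracle.
    Returns the best value found and the point achieving it."""
    x, best_val, best_x = x0, float("inf"), x0
    for _ in range(steps):
        a, b = f(x) - r, g(x)
        val = max(a, b)
        if val < best_val:
            best_val, best_x = val, x
        # Subgradient of the max: gradient of whichever piece is active.
        grad = df(x) if a >= b else dg(x)
        x -= lr * grad
    return best_val, best_x

# H is nonincreasing in r with H(f*) = 0, so bisect for the root.
lo, hi = 0.0, 10.0             # chosen so that H(lo) > 0 > H(hi)
x_mid = 0.0
for _ in range(40):
    mid = 0.5 * (lo + hi)
    val, x_mid = H(mid)
    if val > 0:                # mid is below f*: raise the lower bound
        lo = mid
    else:                      # mid is at/above f*: the point is feasible
        hi = mid
print(hi, x_mid)               # both approach 1 (f* and x* for this toy problem)
```

Each bisection step calls the inner oracle once; the paper's contribution is, roughly, replacing this exact inner solve with a cheap stochastic upper bound that still certifies feasibility with high probability.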


