Private Stochastic Convex Optimization: Efficient Algorithms for Non-smooth Objectives

02/22/2020
by Raman Arora, et al.

In this paper, we revisit the problem of private stochastic convex optimization. We propose an algorithm based on noisy mirror descent that achieves optimal rates, up to a logarithmic factor, in terms of both statistical complexity and the number of queries to a first-order stochastic oracle. Unlike prior work, we do not require Lipschitz continuity of the stochastic gradients to achieve optimal rates. Our algorithm generalizes beyond the Euclidean setting and yields anytime utility and privacy guarantees.
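To make the method concrete, the following is a minimal sketch of noisy mirror descent in the Euclidean special case, where the mirror step reduces to a projected (sub)gradient step and per-query Gaussian noise stands in for the privacy mechanism. This is an illustration under stated assumptions, not the authors' implementation: the function name noisy_mirror_descent, its parameters, and the choice of noise_std (which would have to be calibrated to a target (epsilon, delta) budget) are all hypothetical.

```python
import numpy as np

def noisy_mirror_descent(grad_oracle, x0, steps, lr, noise_std, radius):
    """Noisy mirror descent, Euclidean special case (assumed sketch).

    With the mirror map psi(x) = ||x||_2^2 / 2, each mirror step reduces
    to a projected (sub)gradient step; Gaussian noise added to every
    stochastic gradient plays the role of the privacy mechanism.
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(steps):
        g = grad_oracle(x)  # one query to the stochastic first-order oracle
        g = g + np.random.normal(scale=noise_std, size=g.shape)  # privacy noise
        x = x - lr * g  # gradient step (mirror step in the Euclidean case)
        norm = np.linalg.norm(x)
        if norm > radius:  # Bregman projection = l2-ball projection here
            x *= radius / norm
        avg += (x - avg) / (t + 1)  # running average of iterates
    return avg

# Toy usage on a non-smooth objective: f(x) = E|<a, x> - b|,
# whose stochastic subgradients need not be Lipschitz-continuous in x.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(1000, 5)), rng.normal(size=1000)

    def grad_oracle(x):
        i = rng.integers(len(b))  # sample one data point
        return np.sign(A[i] @ x - b[i]) * A[i]  # subgradient of |<a_i, x> - b_i|

    print(noisy_mirror_descent(grad_oracle, np.zeros(5),
                               steps=2000, lr=0.05,
                               noise_std=1.0, radius=1.0))
```

Averaging the iterates rather than returning the last one is the standard way to get the utility guarantee for non-smooth objectives; a non-Euclidean version would replace the gradient step and ball projection with the mirror map's dual update and Bregman projection.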

Related research:

07/18/2022 · Private Convex Optimization in General Norms
We propose a new framework for differentially private optimization of co...

01/01/2023 · ReSQueing Parallel and Private Stochastic Convex Optimization
We introduce a new tool for stochastic convex optimization (SCO): a Rewe...

07/16/2021 · Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements
We propose a new family of adaptive first-order methods for a class of c...

05/12/2022 · Optimal Methods for Higher-Order Smooth Monotone Variational Inequalities
In this work, we present new simple and optimal algorithms for solving t...

07/01/2019 · Open Problem: The Oracle Complexity of Convex Optimization with Limited Memory
We note that known methods achieving the optimal oracle complexity for f...

03/02/2021 · Private Stochastic Convex Optimization: Optimal Rates in ℓ_1 Geometry
Stochastic convex optimization over an ℓ_1-bounded domain is ubiquitous ...

02/11/2014 · On Zeroth-Order Stochastic Convex Optimization via Random Walks
We propose a method for zeroth order stochastic convex optimization that...
