Private optimization in the interpolation regime: faster rates and hardness results

10/31/2022
by Hilal Asi, et al.

In non-private stochastic convex optimization, stochastic gradient methods converge much faster on interpolation problems (problems where there exists a solution that simultaneously minimizes all of the sample losses) than on non-interpolating ones; we show that, in general, similar improvements are impossible in the private setting. However, when the functions exhibit quadratic growth around the optimum, we show (near) exponential improvements in the private sample complexity. In particular, we propose an adaptive algorithm that improves the sample complexity to achieve expected error α from d/(ε√α) to 1/α^ρ + (d/ε)log(1/α) for any fixed ρ > 0, while retaining the standard minimax-optimal sample complexity for non-interpolation problems. We prove a lower bound showing that the dimension-dependent term is tight. Furthermore, we provide a superefficiency result demonstrating the necessity of the polynomial term for adaptive algorithms: any algorithm with polylogarithmic sample complexity on interpolation problems cannot achieve the minimax-optimal rates on the family of non-interpolation problems.
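To see why the improvement is called "(near) exponential", it helps to write both dimension-dependent terms as functions of log(1/α). A minimal sketch in the abstract's notation, where n(α) denotes the sample complexity needed to reach expected error α (a label introduced here only for illustration):

\[
n(\alpha) \;\asymp\; \frac{d}{\varepsilon\sqrt{\alpha}}
\;=\; \frac{d}{\varepsilon}\, e^{\frac{1}{2}\log(1/\alpha)}
\qquad\text{versus}\qquad
n(\alpha) \;\asymp\; \frac{1}{\alpha^{\rho}} \;+\; \frac{d}{\varepsilon}\log\frac{1}{\alpha}.
\]

As α → 0, the privacy-driven term thus falls from exponential in log(1/α) to linear in log(1/α); the residual polynomial term 1/α^ρ, valid for any fixed ρ > 0, is exactly the term the superefficiency result shows cannot be removed by an adaptive algorithm.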


