Optimal Algorithms for Stochastic Bilevel Optimization under Relaxed Smoothness Conditions

06/21/2023
by   Xuxing Chen, et al.

Stochastic bilevel optimization typically involves minimizing an upper-level (UL) function that depends on the arg-min of a strongly convex lower-level (LL) function. Several algorithms use a Neumann series to approximate the matrix inverses involved in estimating the implicit gradient of the UL function (the hypergradient). The state-of-the-art StOchastic Bilevel Algorithm (SOBA) [16] instead uses stochastic gradient descent steps to solve the linear system associated with the explicit matrix inversion. This modification enables SOBA to match the lower bound of sample complexity for the single-level counterpart in non-convex settings. Unfortunately, the current analysis of SOBA relies on the assumption of higher-order smoothness of the UL and LL functions to achieve optimality. In this paper, we introduce a novel fully single-loop and Hessian-inversion-free algorithmic framework for stochastic bilevel optimization and present a tighter analysis under standard smoothness assumptions (first-order Lipschitzness of the UL function and second-order Lipschitzness of the LL function). Furthermore, we show that a slight modification of our approach handles a more general multi-objective robust bilevel optimization problem, for which we obtain state-of-the-art oracle-complexity results, demonstrating the generality of both the proposed algorithmic and analytic frameworks. Numerical experiments demonstrate the performance gain of the proposed algorithms over existing ones.
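To illustrate the core idea the abstract describes (avoiding explicit Hessian inversion by driving an auxiliary variable toward the solution of the associated linear system with gradient steps), here is a minimal sketch in JAX. It is not the paper's algorithm or its step-size schedule: the toy objectives f_upper and g_lower, the deterministic gradients, and the constant step sizes alpha, beta, rho are illustrative assumptions; the paper's method is stochastic and comes with its own analysis.

```python
# Sketch of a single-loop, Hessian-inversion-free bilevel update in the
# spirit of SOBA-type methods. Instead of inverting the lower-level
# Hessian, an auxiliary variable z is nudged toward the solution of
#   nabla^2_yy g(x, y) z = nabla_y f(x, y)
# by a gradient step, and the hypergradient is assembled from
# Hessian-vector and cross-derivative-vector products only.
import jax
import jax.numpy as jnp

def f_upper(x, y):          # toy upper-level objective f(x, y)  (placeholder)
    return jnp.sum((y - 1.0) ** 2) + 0.1 * jnp.sum(x ** 2)

def g_lower(x, y):          # toy strongly convex lower-level objective g(x, y)
    return 0.5 * jnp.sum((y - x) ** 2)

grad_x_f = jax.grad(f_upper, argnums=0)
grad_y_f = jax.grad(f_upper, argnums=1)
grad_x_g = jax.grad(g_lower, argnums=0)
grad_y_g = jax.grad(g_lower, argnums=1)

def hvp_yy(x, y, v):
    # Hessian-vector product nabla^2_yy g(x, y) v via forward-over-reverse AD
    return jax.jvp(lambda yy: grad_y_g(x, yy), (y,), (v,))[1]

def cross_xy(x, y, v):
    # cross term nabla^2_xy g(x, y) v, mapping a dim(y) vector to dim(x)
    return jax.jvp(lambda yy: grad_x_g(x, yy), (y,), (v,))[1]

def single_loop_step(x, y, z, alpha=0.1, beta=0.1, rho=0.1):
    """One fully single-loop update; alpha/beta/rho are illustrative constants."""
    y_new = y - beta * grad_y_g(x, y)                     # lower-level descent step
    z_new = z - rho * (hvp_yy(x, y, z) - grad_y_f(x, y))  # SGD step on the linear system
    hypergrad = grad_x_f(x, y) - cross_xy(x, y, z)        # hypergradient estimate
    x_new = x - alpha * hypergrad                         # upper-level step
    return x_new, y_new, z_new

x, y, z = jnp.ones(3), jnp.zeros(3), jnp.zeros(3)
for _ in range(200):
    x, y, z = single_loop_step(x, y, z)
print(x)  # approaches the toy problem's bilevel solution
```

The key point of the sketch is that every quantity is computed with matrix-free Hessian-vector products, so no d x d Hessian is ever formed or inverted, which is what makes fully single-loop updates of this kind cheap per iteration.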


