Faster Rates for the Frank-Wolfe Algorithm Using Jacobi Polynomials

10/19/2021
by Robin Francis, et al.

The Frank-Wolfe (FW) algorithm is a popular projection-free alternative for solving large-scale constrained optimization problems. However, the FW algorithm suffers from a sublinear convergence rate when minimizing a smooth convex function over a compact convex set. Thus, exploring techniques that yield a faster convergence rate becomes crucial. A classic approach to obtain faster rates is to combine previous iterates to obtain the next iterate. In this work, we extend this approach to the FW setting and show that the optimal way to combine the past iterates is using a set of orthogonal Jacobi polynomials. We also propose a polynomial-based acceleration technique, referred to as Jacobi polynomial accelerated FW, which combines the current iterate with the past iterate using combining weights related to the Jacobi recursion. By carefully choosing parameters of the Jacobi polynomials, we obtain a faster sublinear convergence rate. We provide numerical experiments on real datasets to demonstrate the efficacy of the proposed algorithm.
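The abstract does not spell out the update rule, but the general idea of combining the current FW iterate with past iterates via recursion-based weights can be sketched as follows. This is a minimal, hypothetical NumPy sketch: the test problem (least squares over the unit simplex), the combining-weight schedule c, and the way the Jacobi parameters (alpha, beta) enter the weights are illustrative assumptions, not the construction from the paper.

# Minimal sketch of a Frank-Wolfe iteration with a momentum-style
# combination of past iterates, in the spirit of the abstract above.
# The combining-weight schedule `c` and the role of (alpha, beta) are
# placeholder assumptions, not the paper's exact Jacobi-based scheme.
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the unit simplex:
    argmin_{s in simplex} <grad, s> puts all mass on the coordinate
    with the most negative gradient entry."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def jacobi_accelerated_fw(grad_f, x0, n_iters=200, alpha=0.5, beta=0.5):
    """Frank-Wolfe with an extra averaging step whose weights follow a
    schedule parameterized like a Jacobi recursion (alpha, beta).
    Hypothetical sketch of the idea, not the published algorithm."""
    x = x0.copy()          # standard FW iterate
    y = x0.copy()          # combined (accelerated) iterate
    for k in range(n_iters):
        g = grad_f(y)                        # gradient at the combined iterate
        s = lmo_simplex(g)                   # FW vertex from the LMO
        gamma = 2.0 / (k + 2)                # classic FW step size
        x = x + gamma * (s - x)              # FW update (stays in the simplex)
        # Placeholder combining weight shaped by (alpha, beta); it puts
        # increasing weight on the new iterate as k grows.
        c = (2 * k + alpha + beta + 2) / (2 * k + alpha + beta + 4)
        y = (1 - c) * y + c * x              # combine past and current iterates
    return y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    grad_f = lambda x: A.T @ (A @ x - b)     # gradient of 0.5*||Ax - b||^2
    x0 = np.full(20, 1.0 / 20)               # start at the simplex barycenter
    x_hat = jacobi_accelerated_fw(grad_f, x0)
    print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)

Setting c = 1 in the sketch recovers plain Frank-Wolfe, which makes it easy to compare objective values of the two variants over iterations.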


07/11/2016

Proximal Quasi-Newton Methods for Regularized Convex Optimization with Linear and Accelerated Sublinear Convergence Rates

In [19], a general, inexact, efficient proximal quasi-Newton algorithm f...
08/02/2020

On the optimal rates of convergence of Gegenbauer projections

In this paper we present a comprehensive convergence rate analysis of Ge...
06/02/2020

Convergence rates of spectral orthogonal projection approximation for functions of algebraic and logarithmic regularities

Based on the Hilb type formula between Jacobi polynomials and Bessel fun...
05/05/2020

Manifold Proximal Point Algorithms for Dual Principal Component Pursuit and Orthogonal Dictionary Learning

We consider the problem of maximizing the ℓ_1 norm of a linear map over ...
12/09/2020

Enhancing Parameter-Free Frank Wolfe with an Extra Subproblem

Aiming at convex optimization under structural constraints, this work in...
01/05/2021

Modified discrete Laguerre polynomials for efficient computation of exponentially bounded Matsubara sums

We develop a new type of orthogonal polynomial, the modified discrete La...
