A Deterministic Nonsmooth Frank Wolfe Algorithm with Coreset Guarantees

08/22/2017
by Sathya N. Ravi, et al.

We present a new Frank-Wolfe (FW) type algorithm that is applicable to minimization problems with a nonsmooth convex objective. We provide convergence bounds and show that the scheme yields so-called coreset results for various Machine Learning problems including 1-median, Balanced Development, Sparse PCA, Graph Cuts, and the ℓ_1-norm-regularized Support Vector Machine (SVM), among others. In other words, the algorithm provides approximate solutions to these problems with time complexity bounds that do not depend on the size of the input. Our framework, motivated by a growing body of work on sublinear algorithms for various data analysis problems, is entirely deterministic and makes no use of smoothing or proximal operators. Beyond these theoretical results, we show experimentally that the algorithm is quite practical and in some cases offers significant computational advantages on large problem instances. We provide an open-source implementation that can be adapted to other problems that fit the overall structure.
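The abstract does not spell out the algorithm itself, so the following is a purely illustrative sketch, not the paper's method: a generic Frank-Wolfe loop applied to the nonsmooth problem min ||Ax - b||_1 over an ℓ_1 ball, with a subgradient standing in for the gradient inside the linear minimization oracle. The problem instance, function names, ball radius tau, and the classic 2/(t+2) step size are all assumptions chosen for illustration.

```python
import numpy as np

def l1_ball_lmo(g, tau):
    """Linear minimization oracle over the l1 ball of radius tau:
    argmin_{||s||_1 <= tau} <g, s> is attained at a signed vertex
    along the coordinate where |g| is largest."""
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -tau * np.sign(g[i])
    return s

def nonsmooth_fw_sketch(A, b, tau, iters=500):
    """Illustrative, projection-free FW loop for min ||Ax - b||_1
    over the l1 ball, using a subgradient wherever the gradient is
    undefined. NOTE: this naive substitution carries no convergence
    guarantee for nonsmooth objectives in general; it only shows the
    structure of an FW-type iteration."""
    x = np.zeros(A.shape[1])
    for t in range(iters):
        g = A.T @ np.sign(A @ x - b)   # a subgradient of ||Ax - b||_1
        s = l1_ball_lmo(g, tau)        # atom (vertex) selected by the LMO
        gamma = 2.0 / (t + 2.0)        # standard FW step-size schedule
        x = (1 - gamma) * x + gamma * s
    return x
```

Two points the sketch does illustrate: after t iterations the iterate is a convex combination of at most t atoms, which is the sparsity property that FW-type coreset arguments exploit; and each step needs only a linear minimization oracle, with no projection, smoothing, or proximal operator. What the sketch does not capture is the paper's contribution, namely a deterministic scheme with actual convergence bounds in the nonsmooth setting.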

Related research

05/11/2018 · Randomized Smoothing SVRG for Large-scale Nonsmooth Convex Optimization
05/11/2020 · A Relational Gradient Descent Algorithm For Support Vector Machine Training
08/19/2019 · Quantum algorithms for Second-Order Cone Programming and Support Vector Machines
03/04/2022 · Sharper Bounds for Proximal Gradient Algorithms with Errors
02/06/2019 · Robust learning and complexity dependent bounds for regularized problems
09/18/2020 · Hybrid Stochastic-Deterministic Minibatch Proximal Gradient: Less-Than-Single-Pass Optimization with Nearly Optimal Generalization
