A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization

10/04/2016
by Hongzhou Lin, et al.

We propose a generic approach to accelerating gradient-based optimization algorithms with quasi-Newton principles. The proposed scheme, called QuickeNing, can be applied to incremental first-order methods such as stochastic variance-reduced gradient (SVRG) or incremental surrogate optimization (MISO). It is also compatible with composite objectives, meaning that it can produce exactly sparse solutions when the objective involves a sparsity-inducing regularization. QuickeNing relies on limited-memory BFGS rules, making it well suited to high-dimensional optimization problems. Moreover, it enjoys a worst-case linear convergence rate for strongly convex problems. We present experimental results in which QuickeNing gives significant improvements over competing methods on large-scale, high-dimensional machine learning problems.
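For context on the limited-memory BFGS rules mentioned above, here is a minimal sketch of the classical L-BFGS two-loop recursion, which computes a quasi-Newton direction in O(md) time from m stored curvature pairs. This is a generic illustration of the building block, not the QuickeNing algorithm itself, and the function name `lbfgs_direction` is our own.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion (illustrative sketch).

    Computes an approximate Newton direction -H_k * grad from the m most
    recent curvature pairs s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i,
    without ever forming the inverse Hessian approximation H_k.
    """
    q = grad.copy()
    stack = []
    # First loop: walk the curvature pairs from newest to oldest.
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / y.dot(s)
        alpha = rho * s.dot(q)
        q -= alpha * y
        stack.append((rho, alpha, s, y))
    # Scale by gamma = s'y / y'y, a common choice of initial matrix H_0.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= s.dot(y) / y.dot(y)
    # Second loop: walk the pairs back from oldest to newest.
    for rho, alpha, s, y in reversed(stack):
        beta = rho * y.dot(q)
        q += (alpha - beta) * s
    return -q  # quasi-Newton descent direction
```

Because the recursion stores only m vector pairs and never a d x d matrix, schemes built on it remain practical in high dimensions; QuickeNing combines rules of this kind with an inner incremental first-order solver such as SVRG or MISO.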


Related research

09/09/2019 · A Stochastic Quasi-Newton Method with Nesterov's Accelerated Gradient
Incorporating second order curvature information in gradient based metho...

02/08/2019 · A Smoother Way to Train Structured Prediction Models
We present a framework to train a structured prediction model by perform...

08/03/2016 · Fast and Simple Optimization for Poisson Likelihood Models
Poisson likelihood models have been prevalently used in imaging, social ...

09/09/2023 · A Gentle Introduction to Gradient-Based Optimization and Variational Inequalities for Machine Learning
The rapid progress in machine learning in recent years has been based on...

06/01/2023 · Improving Energy Conserving Descent for Machine Learning: Theory and Practice
We develop the theory of Energy Conserving Descent (ECD) and introduce E...

06/12/2018 · An Extension of Averaged-Operator-Based Algorithms
Many of the algorithms used to solve minimization problems with sparsity...

01/15/2022 · Quasi-Newton acceleration of EM and MM algorithms via Broyden's method
The principle of majorization-minimization (MM) provides a general frame...
