Fixed-point and coordinate descent algorithms for regularized kernel methods

08/30/2010
by   Francesco Dinuzzo, et al.

In this paper, we study two general classes of optimization algorithms for kernel methods with convex loss function and quadratic norm regularization, and analyze their convergence. The first approach, based on fixed-point iterations, is simple to implement and analyze, and can be easily parallelized. The second, based on coordinate descent, exploits the structure of additively separable loss functions to compute solutions of line searches in closed form. Instances of these general classes of algorithms are already incorporated into state-of-the-art machine learning software for large-scale problems. We start from a solution characterization of the regularized problem, obtained using sub-differential calculus and resolvents of monotone operators, that holds for general convex loss functions regardless of differentiability. The two methodologies described in the paper can be regarded as instances of nonlinear Jacobi and Gauss-Seidel algorithms, and are both well suited to solving large-scale problems.
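As an illustrative sketch (not the paper's general algorithms, which cover arbitrary convex losses via resolvents), the two methodologies can be seen on the simplest instance: for the squared loss, the regularized kernel problem reduces to the linear optimality condition (K + λI)c = y, which a parallel Jacobi-style fixed-point iteration and a sequential Gauss-Seidel-style coordinate descent both solve. All names below (`lam`, `gaussian_kernel`, iteration counts) are our own choices for the example.

```python
import numpy as np

def gaussian_kernel(X, gamma=2.0):
    # Gaussian (RBF) kernel matrix for rows of X.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def fixed_point(K, y, lam, n_iter=2000):
    # Damped fixed-point (Jacobi-like) iteration: every coefficient is
    # updated in parallel from the previous iterate. A = K + lam*I is
    # positive definite, so step size 1/L with L the largest eigenvalue
    # guarantees convergence.
    n = len(y)
    A = K + lam * np.eye(n)
    L = np.linalg.norm(A, 2)
    c = np.zeros(n)
    for _ in range(n_iter):
        c = c + (y - A @ c) / L
    return c

def coordinate_descent(K, y, lam, n_sweeps=500):
    # Gauss-Seidel-style coordinate descent: for the separable squared
    # loss, each one-dimensional line search has a closed-form solution.
    n = len(y)
    A = K + lam * np.eye(n)
    c = np.zeros(n)
    for _ in range(n_sweeps):
        for i in range(n):
            # Residual with coordinate i removed, then exact 1-D minimization.
            r_i = y[i] - A[i] @ c + A[i, i] * c[i]
            c[i] = r_i / A[i, i]
    return c

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
K = gaussian_kernel(X)
lam = 1.0

c_exact = np.linalg.solve(K + lam * np.eye(30), y)
c_fp = fixed_point(K, y, lam)
c_cd = coordinate_descent(K, y, lam)
print("fixed-point error:", np.max(np.abs(c_fp - c_exact)))
print("coord-descent error:", np.max(np.abs(c_cd - c_exact)))
```

For general (possibly non-differentiable) convex losses, the per-coordinate update is no longer a simple division but is expressed through the resolvent of the loss subdifferential, as developed in the paper.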


