Residual Expansion Algorithm: Fast and Effective Optimization for Nonconvex Least Squares Problems

05/26/2017
by Daiki Ikami, et al.

We propose the residual expansion (RE) algorithm: a global (or near-global) optimization method for nonconvex least squares problems. Unlike most existing nonconvex optimization techniques, the RE algorithm relies on neither stochastic nor multi-point search; it can therefore achieve global optimization quickly. Moreover, the RE algorithm is easy to implement and effective in high-dimensional optimization. It exhibits excellent empirical performance on k-means clustering, point-set registration, optimized product quantization, and blind image deblurring.
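
The abstract names the problem class but not the algorithm's mechanics, so as a minimal illustration here is k-means clustering, one of the benchmarks listed above, written as a nonconvex least squares problem and minimized by plain alternating updates (Lloyd's algorithm). This is a sketch of the problem setting only, not the RE algorithm; the function name and synthetic data are ours.

```python
import numpy as np

def kmeans_least_squares(X, k, n_iter=100, seed=0):
    # k-means objective: sum_i min_j ||x_i - c_j||^2, a nonconvex
    # least squares problem. Lloyd-style alternating minimization
    # converges only to a local optimum, which is exactly the
    # failure mode a global method such as RE targets.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        # Assignment step: each point goes to its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each center becomes the mean of its points.
        for j in range(k):
            members = X[labels == j]
            if len(members) > 0:
                centers[j] = members.mean(axis=0)
    residual = ((X - centers[labels]) ** 2).sum()  # final least squares residual
    return centers, labels, residual

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc=c, size=(50, 2)) for c in (0.0, 4.0, 8.0)])
    _, _, residual = kmeans_least_squares(X, k=3)
    print(f"local-optimum residual: {residual:.2f}")
```

Restarting this routine from many random seeds and keeping the lowest residual is the kind of stochastic, multi-point search the abstract contrasts the RE algorithm against.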
