Fast Large-Scale Discrete Optimization Based on Principal Coordinate Descent

09/16/2019
by Huan Xiong, et al.

Binary optimization, a representative subclass of discrete optimization, plays an important role in mathematical optimization and has numerous applications in computer vision and machine learning. Binary optimization problems are usually NP-hard and difficult to solve because of the binary constraints, especially when the number of variables is very large. Existing methods often suffer from high computational costs or large accumulated quantization errors, or are designed only for specific tasks. In this paper, we propose a fast algorithm that finds effective approximate solutions for general binary optimization problems. The proposed algorithm iteratively minimizes linear surrogates of the loss function, updating in each step the binary variables that most affect the loss value. Our method supports a wide class of empirical objective functions, with or without restrictions on the numbers of 1s and -1s among the binary variables. Furthermore, we prove the theoretical convergence of the algorithm and derive explicit convergence rates for objective functions with Lipschitz continuous gradients, which are commonly adopted in practice. Extensive experiments on several binary optimization tasks and large-scale datasets demonstrate that the proposed algorithm outperforms several state-of-the-art methods in both effectiveness and efficiency.
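The core idea described above, minimizing a linear surrogate of the loss at the current binary point and flipping only the coordinates that most reduce it, can be sketched as follows. This is a minimal illustration of that idea and not the authors' implementation: the function names, the choice of k coordinates per step, and the stopping rule are all assumptions, and the optional constraint on the numbers of 1s and -1s discussed in the paper is omitted.

```python
# Minimal sketch (assumptions noted above) of surrogate-guided binary coordinate descent.
import numpy as np

def pcd_minimize(loss, grad, x0, k=32, max_iter=100):
    """Greedy binary coordinate descent guided by a linear surrogate.

    loss : callable, f(x) for x in {-1, +1}^n
    grad : callable, gradient of a smooth extension of f
    x0   : initial point in {-1, +1}^n
    k    : number of coordinates updated per step
    """
    x = x0.copy()
    best = loss(x)
    for _ in range(max_iter):
        g = grad(x)
        # Flipping coordinate i changes the linear surrogate by -2 * x_i * g_i,
        # so a flip helps only when x_i * g_i > 0; pick the k largest such terms.
        gain = x * g
        idx = np.argsort(-gain)[:k]
        idx = idx[gain[idx] > 0]
        if idx.size == 0:
            break  # no coordinate flip decreases the linear surrogate
        x_new = x.copy()
        x_new[idx] = -x_new[idx]
        f_new = loss(x_new)
        if f_new >= best:
            break  # the surrogate step no longer improves the true loss
        x, best = x_new, f_new
    return x, best

# Example: binary least squares, min ||A x - b||^2 over x in {-1, +1}^n
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)
f = lambda x: np.sum((A @ x - b) ** 2)
g = lambda x: 2 * A.T @ (A @ x - b)
x0 = np.where(rng.standard_normal(50) >= 0, 1.0, -1.0)
x_star, f_star = pcd_minimize(f, g, x0, k=5)
```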


