Partial minimization of strict convex functions and tensor scaling

05/26/2019
by Shmuel Friedland, et al.

Assume that f is a strictly convex function on R^n with a unique minimum. We divide the vector of n variables into d groups of vector subvariables, with d at least two. We assume that the partial minimum of f with respect to each vector subvariable can be found while the other subvariables are held fixed. We then describe an algorithm that, at each step, partially minimizes over a specifically chosen vector subvariable. This algorithm converges geometrically to the unique minimum. The rate of convergence depends on uniform bounds on the eigenvalues of the Hessian of f on the compact sublevel set where f is at most f(x_0), with x_0 the starting point of the algorithm. In the case where f is a polynomial of degree two with a positive definite quadratic term and d = n, our method can be viewed as a generalization of the classical conjugate gradient method. The main result of this paper is the observation that the celebrated Sinkhorn diagonal scaling algorithm for matrices, and the corresponding diagonal scaling of tensors, can be viewed as partial minimization of certain log-convex functions.
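To make the connection concrete, the sketch below treats Sinkhorn matrix scaling as alternating exact partial minimization of a convex function of the logarithmic scaling factors, here f(x, y) = sum_{i,j} a_{ij} exp(x_i + y_j) - sum_i x_i - sum_j y_j with all-ones target row and column sums. This particular objective, the function name sinkhorn_partial_min, and the stopping rule are illustrative assumptions for a minimal sketch, not necessarily the exact formulation used in the paper.

    import numpy as np

    def sinkhorn_partial_min(A, num_iters=500, tol=1e-10):
        """Alternating exact partial minimization of
           f(x, y) = sum_ij A_ij * exp(x_i + y_j) - sum_i x_i - sum_j y_j
        over the two blocks x and y. At the minimum, exp(x_i) and exp(y_j)
        are diagonal scalings making A doubly stochastic (row/column sums 1)."""
        m, n = A.shape
        x = np.zeros(m)  # log of row scaling factors
        y = np.zeros(n)  # log of column scaling factors
        for _ in range(num_iters):
            # Partial minimum over x with y fixed:
            # d f / d x_i = sum_j A_ij exp(x_i + y_j) - 1 = 0
            x = -np.log(A @ np.exp(y))
            # Partial minimum over y with x fixed (symmetric update).
            y = -np.log(A.T @ np.exp(x))
            S = np.exp(x)[:, None] * A * np.exp(y)[None, :]
            if (np.max(np.abs(S.sum(axis=1) - 1)) < tol
                    and np.max(np.abs(S.sum(axis=0) - 1)) < tol):
                break
        return S, np.exp(x), np.exp(y)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.random((4, 4)) + 0.1       # strictly positive square matrix
        S, r, c = sinkhorn_partial_min(A)
        print(np.round(S.sum(axis=1), 6))  # row sums, approximately 1
        print(np.round(S.sum(axis=0), 6))  # column sums, approximately 1

Each block update is obtained by setting the block gradient to zero, and it coincides with the classical row- or column-normalization step of Sinkhorn's algorithm, which is the sense in which the scaling iteration is a partial-minimization method.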

