Minimizing Convex Functions with Integral Minimizers
Given a separation oracle 𝖲𝖮 for a convex function f that has an integral minimizer inside a box of radius R, we show how to efficiently find a minimizer of f using at most O(n (n + log(R))) calls to 𝖲𝖮. When the set of minimizers of f has integral extreme points, our algorithm outputs an integral minimizer of f. This improves upon the previously best oracle complexity of O(n^2 (n + log(R))), obtained over thirty years ago via an elegant application of simultaneous Diophantine approximation due to [Grötschel, Lovász and Schrijver, Prog. Comb. Opt. 1984, Springer 1988]. We conjecture that our oracle complexity is tight up to constant factors.

Our result immediately implies a strongly polynomial algorithm for the Submodular Function Minimization problem that makes at most O(n^3) calls to an evaluation oracle. This improves upon the previously best O(n^3 log^2(n)) oracle complexity for strongly polynomial algorithms given in [Lee, Sidford and Wong, FOCS 2015] and [Dadush, Végh and Zambelli, SODA 2018], and upon the O(n^3 log(n)) oracle complexity of the exponential-time algorithm in the former work.

Our result is achieved by an application of the LLL algorithm [Lenstra, Lenstra and Lovász, Math. Ann. 1982] for the shortest lattice vector problem. We show how an approximately shortest vector of a certain lattice can be used to reduce the dimension of the problem, and how the oracle complexity of such a procedure compares favorably with the Grötschel-Lovász-Schrijver approach based on simultaneous Diophantine approximation. Our analysis of the oracle complexity relies on a potential function that simultaneously captures the size of the search set and the density of the lattice. To achieve the O(n^2) term in the oracle complexity, we apply technical ingredients from convex geometry.
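To give a concrete feel for the lattice ingredient mentioned above, the following Python sketch implements a textbook LLL reduction (with parameter δ = 3/4) in exact rational arithmetic; it is an illustration under our own assumptions, not the paper's algorithm, and the function name lll_reduce and the example basis are hypothetical. The first vector of the reduced basis is a 2^((n-1)/2)-approximately shortest lattice vector, and the closing comment indicates how a short vector confines the integral points of a search box to few parallel hyperplanes, which is the kind of dimension-reduction step the abstract alludes to.

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def lll_reduce(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction of an integer lattice basis (list of row vectors).

    The first vector of the returned basis is a 2^((n-1)/2)-approximation to a
    shortest nonzero lattice vector. Exact rational arithmetic keeps the sketch
    simple; it is not meant to be efficient.
    """
    B = [[Fraction(x) for x in row] for row in basis]
    n = len(B)

    def gso():
        # Gram-Schmidt orthogonalization: vectors B*[i] and coefficients mu[i][j], j < i.
        Bs, mu = [], [[Fraction(0)] * n for _ in range(n)]
        for i in range(n):
            v = B[i][:]
            for j in range(i):
                mu[i][j] = dot(B[i], Bs[j]) / dot(Bs[j], Bs[j])
                v = [vi - mu[i][j] * bj for vi, bj in zip(v, Bs[j])]
            Bs.append(v)
        return Bs, mu

    k = 1
    while k < n:
        Bs, mu = gso()
        # Size-reduce B[k] against B[k-1], ..., B[0]; B*[k] is unchanged by this step.
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])
            if q:
                B[k] = [a - q * b for a, b in zip(B[k], B[j])]
                mu[k][j] -= q
                for l in range(j):
                    mu[k][l] -= q * mu[j][l]
        # Lovasz condition: advance if satisfied, otherwise swap and backtrack.
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1
        else:
            B[k - 1], B[k] = B[k], B[k - 1]
            k = max(k - 1, 1)
    return [[int(x) for x in row] for row in B]

if __name__ == "__main__":
    # Illustrative basis; v is an approximately shortest vector of its lattice.
    reduced = lll_reduce([[201, 37, 0], [1648, 297, 0], [0, 0, 5]])
    v = reduced[0]
    print(v)
    # If the search set sits in a box of radius R, then v^T x takes at most
    # 2 * R * ||v||_1 + 1 integer values over integral points x, so a short v
    # confines all integral candidates to few parallel hyperplanes {x : v^T x = c}.
```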