The Bit Complexity of Efficient Continuous Optimization

04/04/2023
by   Mehrdad Ghadiri, et al.

We analyze the bit complexity of efficient algorithms for fundamental optimization problems, such as linear regression, p-norm regression, and linear programming (LP). State-of-the-art algorithms are iterative, and in terms of the number of arithmetic operations, they match the current time complexity of multiplying two n-by-n matrices (up to polylogarithmic factors). However, previous work has typically assumed infinite-precision arithmetic, and because of complicated inverse maintenance techniques, the actual running times of these algorithms are unknown. To settle the running time and bit complexity of these algorithms, we demonstrate that a common core subroutine, known as inverse maintenance, is backward stable. Additionally, we show that iterative approaches for solving constrained weighted regression problems can be carried out with bounded-error preconditioners. Specifically, we prove that linear programs can be solved approximately in matrix multiplication time, up to polylogarithmic factors that depend on the condition number κ of the constraint matrix and the inner and outer radii of the LP; that p-norm regression can be solved approximately in matrix multiplication time, up to polylogarithmic factors in κ; and that linear regression can be solved approximately in input-sparsity time, up to polylogarithmic factors in κ. Furthermore, we present results for achieving below matrix multiplication time for p-norm regression by utilizing faster solvers for sparse linear systems.
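
To give a rough feel for the two ingredients named in the abstract, the sketch below (a NumPy toy, not the paper's algorithm) maintains the inverse of a weighted Gram matrix A^T W A under single-coordinate weight changes via Sherman-Morrison rank-one updates, and then reuses that maintained inverse as a preconditioner for a Richardson-style iterative solve of the weighted regression normal equations. The class name, update pattern, and iteration counts are illustrative assumptions.

```python
# Illustrative sketch only: inverse maintenance via Sherman-Morrison, plus a
# preconditioned iterative solve that tolerates an approximate inverse.
import numpy as np


class InverseMaintainer:
    """Maintains Minv ~= (A^T diag(w) A)^{-1} under single-coordinate weight updates."""

    def __init__(self, A, w):
        self.A = A
        self.w = w.copy()
        self.Minv = np.linalg.inv(A.T @ (w[:, None] * A))

    def update_weight(self, i, new_wi):
        # Rank-one (Sherman-Morrison) update: M <- M + (new_wi - w_i) * a_i a_i^T.
        a = self.A[i]
        delta = new_wi - self.w[i]
        u = self.Minv @ a
        denom = 1.0 + delta * (a @ u)
        self.Minv -= (delta / denom) * np.outer(u, u)
        self.w[i] = new_wi

    def solve(self, b, n_iter=20):
        # Preconditioned Richardson iteration: x <- x + Minv (b - M x).
        # It converges as long as Minv is a bounded-error approximation of M^{-1},
        # i.e. the spectral radius of (I - Minv M) is below 1.
        M = self.A.T @ (self.w[:, None] * self.A)  # recomputed here only for the toy demo
        x = self.Minv @ b
        for _ in range(n_iter):
            x += self.Minv @ (b - M @ x)
        return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 20))
    w = np.ones(200)
    im = InverseMaintainer(A, w)
    # Change a few weights (as an interior-point step might) without refactoring.
    for i in [3, 17, 42]:
        im.update_weight(i, 2.5)
    b = rng.standard_normal(20)
    x = im.solve(b)
    M = A.T @ (im.w[:, None] * A)
    print("residual norm:", np.linalg.norm(M @ x - b))
```

The point of the preconditioned loop is that an exact inverse is not needed: any approximation with spectral radius of (I - Minv M) below 1 still gives geometric convergence, which is the flavor of "bounded-error preconditioner" guarantee the abstract alludes to.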


