
Discrepancy Minimization via a Self-Balancing Walk

06/24/2020
by Ryan Alweiss, et al. (MIT, Princeton University, Stanford University)

We study discrepancy minimization for vectors in ℝ^n under various settings. The main result is the analysis of a new simple random process in multiple dimensions through a comparison argument. As corollaries, we obtain bounds which are tight up to logarithmic factors for several problems in online vector balancing posed by Bansal, Jiang, Singla, and Sinha (STOC 2020), as well as a linear-time algorithm achieving logarithmic bounds for the Komlós conjecture.
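
The random process at the heart of the result is a self-balancing walk: each arriving vector is signed with a bias that pushes the running prefix sum back toward the origin. The Python sketch below illustrates that idea under stated assumptions; the function name, the scale choice c = Θ(log(nT)), and the defensive clipping step are illustrative, not the paper's exact algorithm or constants.

import math
import random


def self_balancing_walk(vectors, n, c=None):
    """Online signing of vectors in R^n via a self-balancing random walk.

    Minimal sketch: each arriving vector v (assumed ||v||_inf <= 1) gets a
    sign chosen with a bias proportional to its correlation with the current
    prefix sum, so the walk is pulled back toward the origin.  The scale
    parameter c is an illustrative choice, not the paper's exact constant.
    """
    T = len(vectors)
    if c is None:
        c = 30.0 * math.log(max(n * T, 2))  # assumed Theta(log(nT)) scale
    w = [0.0] * n          # running signed sum of the vectors seen so far
    signs = []
    for v in vectors:
        corr = sum(wi * vi for wi, vi in zip(w, v))
        corr = max(-c, min(c, corr))        # clip defensively; |corr| <= c w.h.p.
        p_plus = 0.5 * (1.0 - corr / c)     # bias against the current drift
        eps = 1 if random.random() < p_plus else -1
        w = [wi + eps * vi for wi, vi in zip(w, v)]
        signs.append(eps)
    return signs, w


if __name__ == "__main__":
    # Toy run: 200 random vectors in R^10 with entries in [-1, 1].
    random.seed(0)
    n, T = 10, 200
    vecs = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(T)]
    signs, w = self_balancing_walk(vecs, n)
    print("final l_inf discrepancy:", max(abs(x) for x in w))

In this sketch the bias (1 - corr/c)/2 makes the sign more likely to oppose the current drift, which is the mechanism the comparison argument analyzes.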


Related research

04/14/2021 · A Gaussian fixed point random walk
In this note, we design a discrete random walk on the real line which ta...

11/13/2021 · Prefix Discrepancy, Smoothed Analysis, and Combinatorial Vector Balancing
A well-known result of Banaszczyk in discrepancy theory concerns the pre...

05/02/2022 · A Unified Approach to Discrepancy Minimization
We study a unified approach and algorithm for constructive discrepancy m...

02/04/2021 · Online Discrepancy Minimization via Persistent Self-Balancing Walks
We study the online discrepancy minimization problem for vectors in ℝ^d ...

11/10/2022 · Discrepancy Minimization via Regularization
We introduce a new algorithmic framework for discrepancy minimization ba...

11/11/2021 · Online Discrepancy with Recourse for Vectors and Graphs
The vector-balancing problem is a fundamental problem in discrepancy the...

03/16/2021 · Variational Quantum Algorithms for Euclidean Discrepancy and Covariate-Balancing
Algorithmic discrepancy theory seeks efficient algorithms to find those ...