Hardness Results for Minimizing the Covariance of Randomly Signed Sum of Vectors

11/26/2022
by Peng Zhang, et al.

Given vectors v_1, …, v_n ∈ ℝ^d with Euclidean norm at most 1 and x_0 ∈ [-1,1]^n, our goal is to sample a random signing x ∈ {±1}^n with 𝔼[x] = x_0 such that the operator norm of the covariance of the signed sum ∑_{i=1}^n x(i) v_i is as small as possible. This problem arises from algorithmic discrepancy theory and its application to the design of randomized experiments. It is known that one can sample a random signing with expectation x_0 whose covariance has operator norm at most 1. In this paper, we prove two hardness results for this problem. First, we show that it is NP-hard to distinguish a list of vectors for which there exists a random signing with expectation 0 such that the operator norm is 0 from those for which any signing with expectation 0 must have operator norm Ω(1). Second, we consider x_0 ∈ [-1,1]^n whose entries are all close to an arbitrarily fixed p ∈ [-1,1]. We show that it is NP-hard to distinguish a list of vectors for which there exists a random signing with expectation x_0 such that the operator norm is 0 from those for which any signing with expectation x_0 must have operator norm Ω((1-|p|)^2).
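To make the objective concrete, here is a minimal Python sketch (not from the paper) of the quantity being minimized, evaluated for the naive baseline that signs each coordinate independently with mean x_0(i). For independent signs, the covariance of the signed sum is ∑_i (1 - x_0(i)^2) v_i v_i^T, whose operator norm can be far larger than the bound of 1 achievable with correlated signings; all function names and parameters below are illustrative assumptions.

```python
import numpy as np

def independent_signing(x0, rng):
    """Sample x in {-1, +1}^n with E[x(i)] = x0(i), entries independent."""
    return np.where(rng.random(len(x0)) < (1 + x0) / 2, 1.0, -1.0)

def covariance_operator_norm(V, x0):
    """Operator norm of Cov(sum_i x(i) v_i) under the independent signing.

    For independent x(i) with mean x0(i), Var(x(i)) = 1 - x0(i)^2, so the
    covariance matrix is sum_i (1 - x0(i)^2) v_i v_i^T.
    """
    cov = (V * (1 - x0**2)[:, None]).T @ V   # d x d PSD matrix
    return np.linalg.eigvalsh(cov)[-1]       # largest eigenvalue

# Example: n unit vectors in R^d (rows of V) and a target mean x0 near p = 0.3.
rng = np.random.default_rng(0)
n, d = 20, 5
V = rng.normal(size=(n, d))
V /= np.linalg.norm(V, axis=1, keepdims=True)   # Euclidean norm at most 1
x0 = np.full(n, 0.3)

x = independent_signing(x0, rng)
print("sampled signing:", x[:5], "...")
print("covariance operator norm (independent signs):",
      covariance_operator_norm(V, x0))
```

This sketch only evaluates the objective for one simple signing distribution; it does not implement the correlated signings, referenced in the abstract, that achieve covariance operator norm at most 1.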


