Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection

10/24/2019
by David Gamarnik, et al.

We focus on the high-dimensional linear regression problem, where the algorithmic goal is to efficiently infer an unknown feature vector β^*∈R^p from its linear measurements, using a small number n of samples. Unlike most of the literature, we make no sparsity assumption on β^*, but instead adopt a different regularization: in the noiseless setting, we assume β^* consists of entries that are either rational numbers with a common denominator Q∈Z^+ (referred to as Q-rationality), or irrational numbers supported on a rationally independent set of bounded cardinality known to the learner; together, these are called the mixed-support assumption. Using a novel combination of the PSLQ integer relation detection algorithm and the LLL lattice basis reduction algorithm, we propose a polynomial-time algorithm that provably recovers a β^*∈R^p satisfying the mixed-support assumption from its linear measurements Y=Xβ^*∈R^n, for a large class of distributions of the random entries of X, even with a single measurement (n=1). In the noisy setting, we propose a polynomial-time, lattice-based algorithm that recovers a β^*∈R^p satisfying the Q-rationality assumption from its noisy measurements Y=Xβ^*+W∈R^n, again even with a single sample (n=1). We further establish that, for large Q and Gaussian noise, this algorithm tolerates an information-theoretically optimal level of noise. We then apply these ideas to develop a polynomial-time, single-sample algorithm for the phase retrieval problem. Our methods address the single-sample (n=1) regime, where sparsity-based methods such as LASSO and Basis Pursuit are known to fail. Furthermore, our results reveal an algorithmic connection between the high-dimensional linear regression problem and the integer relation detection, randomized subset-sum, and shortest vector problems.
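
To make the lattice-based approach concrete, the following is a minimal sketch of single-measurement recovery of a Q-rational β^* in the noiseless setting. It is not the authors' exact algorithm: the paper handles continuous distributions for X by truncating the data to finitely many digits, whereas this sketch assumes integer-valued entries of X so that the scaled measurement Q·y is an exact integer. It uses the LLL reduction from the fpylll library, and the function and variable names (recover_z, z_true, and so on) are hypothetical.

# A minimal sketch (not the paper's exact algorithm) of recovering
# z = Q*beta^* from one noiseless measurement via LLL, in the
# Lagarias-Odlyzko style. Simplifying assumption made here: the entries
# of x are large integers, so Q*y = <x, Q*beta^*> is an exact integer;
# the paper instead truncates real-valued data to finitely many digits.
# Requires fpylll (pip install fpylll).
import random
from fpylll import IntegerMatrix, LLL

def recover_z(x, Qy, M=2**64):
    """Recover z = Q*beta^* in Z^p from the single relation <x, z> = Q*y.

    Builds the (p+1) x (p+2) lattice basis
        [ I_p | 0 | M*x_i ]   (rows i = 1, ..., p)
        [ 0   | 1 | M*Q*y ]   (last row)
    whose integer span contains the short vector (z_1, ..., z_p, -1, 0).
    """
    p = len(x)
    rows = []
    for i in range(p):
        row = [0] * (p + 2)
        row[i] = 1
        row[p + 1] = M * x[i]
        rows.append(row)
    last = [0] * (p + 2)
    last[p] = 1
    last[p + 1] = M * Qy
    rows.append(last)

    B = IntegerMatrix.from_matrix(rows)
    LLL.reduction(B)  # reduces the basis in place

    # A short vector of the form (z, -1, 0) encodes the relation <x, z> = Q*y.
    for i in range(B.nrows):
        v = [B[i, j] for j in range(B.ncols)]
        if v[p + 1] == 0 and abs(v[p]) == 1:
            sign = -v[p]  # normalize so the last-row coefficient is -1
            return [sign * c for c in v[:p]]
    return None

# Demo: p = 10 unknowns, common denominator Q = 10, one measurement (n = 1).
random.seed(0)
p, Q = 10, 10
z_true = [random.randint(-50, 50) for _ in range(p)]  # z_true = Q * beta^*
x = [random.randint(1, 2**200) for _ in range(p)]     # a single row of X
Qy = sum(xi * zi for xi, zi in zip(x, z_true))        # exact scaled measurement

z = recover_z(x, Qy)
print("recovered:", z == z_true)  # beta^* = [zi / Q for zi in z]

The large weight M penalizes any lattice vector whose final coordinate is nonzero, so the short vectors of this lattice are exactly the integer relations among x_1, ..., x_p and Q·y; this is the same mechanism that underlies the connection to the randomized subset-sum and shortest vector problems mentioned in the abstract.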
