Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection

10/24/2019
by David Gamarnik et al.

We focus on the high-dimensional linear regression problem, where the algorithmic goal is to efficiently infer an unknown feature vector β^*∈R^p from its linear measurements, using a small number n of samples. Unlike most of the literature, we make no sparsity assumption on β^*, but instead adopt a different regularization: in the noiseless setting, we assume β^* consists of entries that are either rational numbers with a common denominator Q∈Z^+ (referred to as Q-rationality), or irrational numbers supported on a rationally independent set of bounded cardinality known to the learner; collectively, this is called the mixed-support assumption. Using a novel combination of the PSLQ integer relation detection and LLL lattice basis reduction algorithms, we propose a polynomial-time algorithm that provably recovers a β^*∈R^p satisfying the mixed-support assumption from its linear measurements Y=Xβ^*∈R^n, for a large class of distributions of the random entries of X, even with a single measurement (n=1). In the noisy setting, we propose a polynomial-time, lattice-based algorithm that recovers a β^*∈R^p satisfying Q-rationality from its noisy measurements Y=Xβ^*+W∈R^n, again even with a single sample (n=1). We further establish that, for large Q and normal noise, this algorithm tolerates an information-theoretically optimal level of noise. We then apply these ideas to develop a polynomial-time, single-sample algorithm for the phase retrieval problem. Our methods address the single-sample (n=1) regime, where sparsity-based methods such as LASSO and Basis Pursuit are known to fail. Furthermore, our results reveal an algorithmic connection between the high-dimensional linear regression problem and the integer relation detection, randomized subset-sum, and shortest vector problems.
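To make the lattice-reduction idea concrete, here is a minimal, illustrative sketch (not the authors' exact algorithm) of how a single noiseless measurement y = ⟨x, β^*⟩ with a Q-rational β^* can be recovered by casting it as an integer relation detection problem and handing it to a textbook LLL reduction. The precision scaling N, the choices of p and Q, and the recovery rule below are all assumptions made for this demo; the paper's construction and its use of PSLQ differ in the details.

```python
# Illustrative sketch only (assumption-laden, NOT the paper's exact algorithm):
# recover a Q-rational beta from a single noiseless measurement y = <x, beta>
# by phrasing recovery as integer relation detection and running a textbook
# LLL reduction. p, Q, N (precision scaling) and the seed are demo choices.
from fractions import Fraction
import random


def dot(u, v):
    return sum(a * b for a, b in zip(u, v))


def gram_schmidt(B):
    """Exact Gram-Schmidt orthogonalization over the rationals."""
    Bstar, mu = [], [[Fraction(0)] * len(B) for _ in B]
    for i, b in enumerate(B):
        v = list(b)
        for j in range(i):
            mu[i][j] = dot(b, Bstar[j]) / dot(Bstar[j], Bstar[j])
            v = [v[t] - mu[i][j] * Bstar[j][t] for t in range(len(v))]
        Bstar.append(v)
    return Bstar, mu


def lll(B, delta=Fraction(3, 4)):
    """Plain (slow but exact) LLL reduction of the lattice spanned by the rows of B."""
    B = [[Fraction(x) for x in row] for row in B]
    Bstar, mu = gram_schmidt(B)
    k = 1
    while k < len(B):
        for j in range(k - 1, -1, -1):        # size-reduce b_k against earlier rows
            q = round(mu[k][j])
            if q:
                B[k] = [B[k][t] - q * B[j][t] for t in range(len(B[k]))]
                Bstar, mu = gram_schmidt(B)
        if dot(Bstar[k], Bstar[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bstar[k - 1], Bstar[k - 1]):
            k += 1                            # Lovasz condition holds
        else:
            B[k - 1], B[k] = B[k], B[k - 1]   # swap and step back
            Bstar, mu = gram_schmidt(B)
            k = max(k - 1, 1)
    return B


# ---- single-measurement demo: n = 1, noiseless, Q-rational beta ----
random.seed(0)
p, Q, N = 3, 10, 10**12                       # N controls the rounding precision
beta = [Fraction(random.randint(-Q, Q), Q) for _ in range(p)]
x = [random.random() for _ in range(p)]       # the single row of X
y = sum(float(b) * xi for b, xi in zip(beta, x))   # the single measurement Y

# Integer-relation lattice for the reals (x_1, ..., x_p, y): a short row
# (a_1, ..., a_p, a_{p+1}, *) encodes a_1 x_1 + ... + a_p x_p + a_{p+1} y ~ 0,
# and the true relation is (-Q*beta_1, ..., -Q*beta_p, Q) up to a common factor.
r = x + [y]
B = [[Fraction(1 if i == j else 0) for j in range(len(r))] + [Fraction(round(N * r[i]))]
     for i in range(len(r))]

v = lll(B)[0]                                 # generically +/- the true relation
beta_hat = [-a / v[p] for a in v[:p]]         # ratios cancel the sign and any gcd
print("true beta     :", beta)
print("recovered beta:", beta_hat)
```

The exact Fraction arithmetic keeps this toy LLL trustworthy at the cost of speed; a practical implementation would use a floating-point or library LLL (e.g., fpylll), and the noisy setting would require a modified lattice along the lines of the paper's noise-tolerant construction.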


research
03/18/2018

High Dimensional Linear Regression using Lattice Basis Reduction

We consider a high dimensional linear regression problem where the goal ...
research
11/14/2017

Sparse High-Dimensional Linear Regression: Algorithmic Barriers and a Local Search Algorithm

We consider a sparse high dimensional regression model where the goal is...
research
12/05/2013

Swapping Variables for High-Dimensional Sparse Regression with Correlated Measurements

We consider the high-dimensional sparse linear regression problem of acc...
research
05/26/2023

Feature Adaptation for Sparse Linear Regression

Sparse linear regression is a central problem in high-dimensional statis...
research
05/19/2017

Linear regression without correspondence

This article considers algorithmic and statistical aspects of linear reg...
research
09/20/2023

GLM Regression with Oblivious Corruptions

We demonstrate the first algorithms for the problem of regression for ge...
research
10/12/2018

An Algebraic-Geometric Approach to Shuffled Linear Regression

Shuffled linear regression is the problem of performing a linear regress...
