Noisy linear inverse problems under convex constraints: Exact risk asymptotics in high dimensions

01/20/2022
by Qiyang Han, et al.

In the standard Gaussian linear measurement model Y=Xμ_0+ξ∈ℝ^m with a fixed noise level σ>0, we consider the problem of estimating the unknown signal μ_0 under a convex constraint μ_0 ∈ K, where K is a closed convex set in ℝ^n. We show that the risk of the natural convex constrained least squares estimator (LSE) μ̂(σ) can be characterized exactly in high dimensional limits, by that of the convex constrained LSE μ̂_K^𝗌𝖾𝗊 in the corresponding Gaussian sequence model at a different noise level. The characterization holds (uniformly) for risks in the maximal regime that ranges from constant order all the way down to essentially the parametric rate, as long as a certain necessary non-degeneracy condition is satisfied for μ̂(σ). The precise risk characterization reveals a fundamental difference between noiseless (or low noise limit) and noisy linear inverse problems in terms of the sample complexity for signal recovery. A concrete example is given by the isotonic regression problem: while exact recovery of a general monotone signal requires m≫ n^1/3 samples in the noiseless setting, consistent signal recovery in the noisy setting requires as few as m≫log n samples. Such a discrepancy occurs when the low- and high-noise risk behaviors of μ̂_K^𝗌𝖾𝗊 differ significantly. In statistical language, this occurs when μ̂_K^𝗌𝖾𝗊 estimates 0 at a faster `adaptation rate' than the slower `worst-case rate' for general signals. Several other examples, including non-negative least squares and the generalized Lasso (in constrained form), are also worked out to demonstrate the concrete applicability of the theory in problems of different types.
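As a hedged illustration of the objects discussed above, the sketch below fits the convex constrained LSE μ̂(σ) in the isotonic regression example, where K is the monotone cone {μ : μ_1 ≤ … ≤ μ_n}. The dimensions, the Gaussian design convention, and the use of cvxpy as a solver are illustrative assumptions, not specifics taken from the paper.

```python
# Minimal sketch (not the paper's code): constrained least squares
#   mu_hat(sigma) = argmin_{mu in K} ||Y - X mu||_2^2
# for the isotonic example, K = {mu : mu_1 <= ... <= mu_n}.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, sigma = 200, 60, 1.0                 # assumed illustrative sizes
mu0 = np.sort(rng.normal(size=n))          # a generic monotone signal
X = rng.normal(size=(m, n))                # standard Gaussian design
Y = X @ mu0 + sigma * rng.normal(size=m)   # noisy linear measurements

mu = cp.Variable(n)
objective = cp.Minimize(cp.sum_squares(Y - X @ mu))
constraints = [cp.diff(mu) >= 0]           # membership in the monotone cone K
cp.Problem(objective, constraints).solve()

mu_hat = mu.value
risk = np.mean((mu_hat - mu0) ** 2)        # empirical analogue of ||mu_hat - mu0||^2 / n
print(f"empirical risk: {risk:.4f}")
```

The same template covers the other examples mentioned in the abstract (non-negative least squares, the constrained generalized Lasso) by swapping the constraint list for the corresponding convex set K.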


