Error Bound of Empirical ℓ_2 Risk Minimization for Noisy Standard and Generalized Phase Retrieval Problems
A noisy generalized phase retrieval (NGPR) problem refers to the problem of estimating x_0 ∈ ℂ^d from noisy quadratic samples {x_0^* A_k x_0 + η_k}_{k=1}^n, where each A_k is a Hermitian matrix and η_k is a noise scalar. When A_k = α_k α_k^* for some α_k ∈ ℂ^d, it reduces to a standard noisy phase retrieval (NPR) problem. The main aim of this paper is to study the estimation performance of empirical ℓ_2 risk minimization in both problems when A_k in NGPR, or α_k in NPR, is drawn from a sub-Gaussian distribution. Under different kinds of noise patterns, we establish error bounds that imply approximate reconstruction; these results are new in the literature. In NGPR, we show the bounds are O(‖η‖_2/√n) and O(‖η‖_∞ √(d/n)) for general noise, and O(√(d log n / n)) and O(√(d (log n)^2 / n)) for random noise with sub-Gaussian and sub-exponential tails respectively, where ‖η‖_2 and ‖η‖_∞ are the 2-norm and sup-norm of the noise vector η = (η_1, …, η_n). Under heavy-tailed noise, by truncating response outliers we propose a robust estimator whose error bound converges at a slower rate. On the other hand, we show that in NPR the bound is O(√(d log n / n)) for sub-Gaussian noise and O(√(d (log n)^2 / n)) for sub-exponential noise, which is essentially tighter than the existing bound O(‖η‖_2/√n). Although NGPR, which involves the measurement matrices A_k, is more computationally demanding than NPR, which involves only the measurement vectors α_k, our results reveal that NGPR exhibits stronger robustness than NPR under biased and deterministic noise. Experimental results are presented to confirm and demonstrate our theoretical findings.
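For concreteness, the empirical ℓ_2 risk minimizer discussed above admits the standard least-squares formulation sketched below. This display is an illustrative sketch rather than a verbatim statement from the paper; the notation y_k for the observed samples and x̂ for the estimator is introduced here for illustration.

```latex
% Empirical \ell_2 risk minimization for NGPR (illustrative sketch):
% given noisy quadratic samples y_k = x_0^* A_k x_0 + \eta_k with each
% A_k Hermitian (so x^* A_k x is real), the estimator minimizes the
% average squared residual over \mathbb{C}^d.
\hat{x} \;\in\; \operatorname*{arg\,min}_{x \in \mathbb{C}^d}\;
  \frac{1}{n} \sum_{k=1}^{n} \bigl( x^{*} A_k x - y_k \bigr)^{2}.
% NPR is recovered as the special case A_k = \alpha_k \alpha_k^{*},
% for which x^{*} A_k x = |\alpha_k^{*} x|^{2}.
```

Since the samples are invariant under x_0 ↦ e^{iθ} x_0, reconstruction in either problem is possible only up to a global phase, and error bounds of the kind quoted above are understood modulo this ambiguity.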