Sparse Nonlinear Regression: Parameter Estimation and Asymptotic Inference

11/14/2015
by Zhuoran Yang, et al.

We study parameter estimation and asymptotic inference for sparse nonlinear regression. More specifically, we assume the data are given by y = f(x^⊤β^*) + ϵ, where f is a known nonlinear link function. To recover β^*, we propose an ℓ_1-regularized least-squares estimator. Unlike classical linear regression, the corresponding optimization problem is nonconvex because of the nonlinearity of f. Despite the nonconvexity, we prove that under mild conditions, every stationary point of the objective enjoys an optimal statistical rate of convergence. In addition, we provide an efficient algorithm that provably converges to a stationary point. We also assess the uncertainty of the obtained estimator. Specifically, based on any stationary point of the objective, we construct valid hypothesis tests and confidence intervals for the low-dimensional components of the high-dimensional parameter β^*. Detailed numerical results are provided to support our theory.
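The estimator described above minimizes the penalized empirical squared loss (1/2n) Σ_i (y_i − f(x_i^⊤β))^2 + λ‖β‖_1, which can be driven to a stationary point by proximal gradient iterations. The following is a minimal sketch, not the authors' implementation: it assumes the link f and its derivative are known, and the specific link, step size, and regularization level are illustrative choices.

```python
import numpy as np

# Objective (sketch): (1/2n) * sum_i (y_i - f(x_i^T beta))^2 + lam * ||beta||_1.
# Illustrative proximal gradient solver; link, step size, and lam are assumptions.

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (coordinate-wise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_nonlinear_lasso(X, y, f, f_prime, lam, step=0.02, n_iter=2000):
    """Proximal gradient iterations toward a stationary point of the
    nonconvex l1-penalized least-squares objective."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        u = X @ beta
        grad = X.T @ ((f(u) - y) * f_prime(u)) / n  # gradient of the smooth part
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Synthetic example with a known monotone nonlinear link f(u) = u + 0.5*sin(u).
rng = np.random.default_rng(0)
n, d, s = 200, 500, 5
beta_star = np.zeros(d)
beta_star[:s] = 1.0
X = rng.standard_normal((n, d))
f = lambda u: u + 0.5 * np.sin(u)
f_prime = lambda u: 1.0 + 0.5 * np.cos(u)
y = f(X @ beta_star) + 0.1 * rng.standard_normal(n)

beta_hat = sparse_nonlinear_lasso(X, y, f, f_prime, lam=0.1)
print("nonzero estimates:", np.count_nonzero(beta_hat))
print("l2 estimation error:", np.linalg.norm(beta_hat - beta_star))
```

The soft-thresholding step is what produces exact zeros in the iterates, so the returned vector is itself sparse; any fixed point of this update is a stationary point of the penalized objective, which is the class of solutions the paper's statistical guarantees cover.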
