Smoothing quantile regressions
We propose to smooth the entire objective function, rather than only the check function, in a linear quantile regression context. Not only does the resulting smoothed quantile regression estimator yield a lower mean squared error and a more accurate Bahadur-Kiefer representation than the standard estimator, but it is also asymptotically differentiable. We exploit the latter to propose a quantile density estimator that does not suffer from the curse of dimensionality: the conditional density can be estimated without concern for the dimension of the covariate vector. It also allows for two-stage efficient quantile regression estimation. Our asymptotic theory holds uniformly with respect to the bandwidth and quantile level. Finally, we propose a rule of thumb for choosing the smoothing bandwidth that should approximate the optimal bandwidth well. Simulations confirm that our smoothed quantile regression estimator indeed performs very well in finite samples.
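To make the idea concrete, here is a minimal sketch of smoothed quantile regression. The abstract's proposal smooths the entire objective function; for brevity this sketch illustrates only the simpler, widely used device of convolving the check function with a Gaussian kernel, which already yields a smooth, convex, differentiable objective. All function names, the bandwidth choice, and the simulated data are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def smoothed_check(u, tau, h):
    # Gaussian-convolved check function rho_tau * K_h:
    # smooth and convex, converging to rho_tau(u) = u*(tau - 1{u<0}) as h -> 0.
    return u * (tau - norm.cdf(-u / h)) + h * norm.pdf(u / h)

def smoothed_qr(X, y, tau, h):
    """Minimize the smoothed quantile objective over beta.

    X is the design matrix (first column of ones for the intercept),
    tau the quantile level, h the smoothing bandwidth.
    """
    def objective(beta):
        return smoothed_check(y - X @ beta, tau, h).mean()

    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting value
    return minimize(objective, beta0, method="BFGS").x

# Illustrative usage on simulated data: median regression recovers
# the conditional median line 1 + 2x when the error has median zero.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_hat = smoothed_qr(X, y, tau=0.5, h=n ** (-1 / 5))  # ad hoc bandwidth, not the paper's rule of thumb
```

Because the smoothed objective is differentiable, gradient-based optimizers such as BFGS apply directly, whereas the unsmoothed check function requires linear-programming or subgradient methods.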