Oracle inequalities for sign constrained generalized linear models
High-dimensional data are now routinely collected and analyzed thanks to advances in data collection technology. Although many methods for sparse recovery have been developed over the past two decades, most of them require the selection of tuning parameters, so the results they produce depend heavily on how the tuning is done. In this paper we study the theoretical properties of sign-constrained generalized linear models with a convex loss function, one of the sparse regression methods that requires no tuning parameter. Recent work on this topic has shown that, in the case of linear regression, sign constraints alone can be as efficient as the oracle method, provided the design matrix satisfies a suitable assumption in addition to a traditional compatibility condition. We generalize this type of result to a much broader class of models that encompasses logistic and quantile regression. We also report numerical experiments that confirm the theoretical findings of the paper.
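To illustrate the kind of estimator studied here, the following is a minimal sketch (not the authors' implementation) of a sign-constrained generalized linear model for the logistic loss: the convex negative log-likelihood is minimized subject to non-negativity of the coefficients, with no penalty term and hence no tuning parameter. The optimizer choice (SciPy's L-BFGS-B with box constraints) and the toy data are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize


def neg_log_likelihood(beta, X, y):
    """Convex logistic loss: sum_i [log(1 + exp(x_i'beta)) - y_i * x_i'beta]."""
    eta = X @ beta
    return np.sum(np.logaddexp(0.0, eta) - y * eta)


def fit_sign_constrained_logistic(X, y):
    """Minimize the logistic loss subject to beta >= 0 (no tuning parameter)."""
    p = X.shape[1]
    beta0 = np.zeros(p)
    bounds = [(0.0, None)] * p  # sign constraints replace an explicit penalty
    res = minimize(neg_log_likelihood, beta0, args=(X, y),
                   method="L-BFGS-B", bounds=bounds)
    return res.x


# Toy example (hypothetical data): a sparse non-negative signal.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [1.5, 1.0, 0.5]  # sparse, non-negative coefficients
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))
beta_hat = fit_sign_constrained_logistic(X, y)
print(np.round(beta_hat[:5], 2))
```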