Additive Models with Trend Filtering

02/16/2017
by Veeranjaneyulu Sadhanala, et al.

We consider additive models built with trend filtering, i.e., additive models whose components are each regularized by the (discrete) total variation of their kth (discrete) derivative, for a chosen integer k ≥ 0. This results in kth degree piecewise polynomial components (e.g., k=0 gives piecewise constant components, k=1 gives piecewise linear, k=2 gives piecewise quadratic, and so on). In univariate nonparametric regression, the localized nature of the total variation regularizer used by trend filtering has been shown to produce estimates with superior local adaptivity to those from smoothing splines (and linear smoothers, more generally) (Tibshirani [2014]). Further, the structured nature of this regularizer has been shown to lead to highly efficient computational routines for trend filtering (Kim et al. [2009], Ramdas and Tibshirani [2016]). In this paper, we argue that both of these properties carry over to the additive models setting. We derive fast error rates for additive trend filtering estimates, and prove that these rates are minimax optimal when the underlying function is itself additive and has component functions whose derivatives are of bounded variation. We show that such rates are unattainable by additive smoothing splines (and by additive models built from linear smoothers, in general). We argue that backfitting provides an efficient algorithm for additive trend filtering, as it is built around the fast univariate trend filtering solvers; moreover, we describe a modified backfitting procedure whose iterations can be run in parallel. Finally, we conduct experiments to examine the empirical properties of additive trend filtering, and outline some possible extensions.
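To make the penalty concrete: on unit-spaced inputs, the discrete total variation of the kth derivative of a component vector θ is the ℓ1 norm of its (k+1)st discrete differences, ‖D^(k+1) θ‖_1. The sketch below (our own illustration, not code from the paper; function names are ours) builds the difference operator and evaluates the penalty, confirming that a degree-k polynomial incurs zero penalty and that a piecewise linear signal is charged only for its slope changes.

```python
import numpy as np

def difference_operator(n, order):
    """Discrete difference operator of the given order for length-n
    signals on unit-spaced inputs, built by repeated first differences."""
    D = np.eye(n)
    for _ in range(order):
        D = D[1:, :] - D[:-1, :]  # one more round of first differences
    return D

def tf_penalty(theta, k):
    """Trend filtering penalty ||D^(k+1) theta||_1: the discrete total
    variation of the kth discrete derivative of theta."""
    D = difference_operator(len(theta), k + 1)
    return np.abs(D @ theta).sum()

# A quadratic sequence is a degree-2 piecewise polynomial with no knots,
# so the k=2 penalty vanishes.
quadratic = np.arange(8.0) ** 2
print(tf_penalty(quadratic, k=2))  # ~0

# A piecewise linear signal whose slope jumps from 1 to 2 at one knot
# pays exactly the size of that slope change under the k=1 penalty.
pw_linear = np.array([0.0, 1, 2, 3, 4, 6, 8, 10])
print(tf_penalty(pw_linear, k=1))  # 1.0
```

In the additive model, each component gets its own such penalty, and backfitting cycles over the components, fitting each one with a univariate trend filtering solver on the partial residuals.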
