Prediction bounds for (higher order) total variation regularized least squares

04/24/2019
by Sara van de Geer, et al.

We establish oracle inequalities for the least squares estimator f̂ with a penalty on the total variation of f̂ or on its higher-order differences. Our main tool is an interpolating vector that yields lower bounds for compatibility constants. This allows one to show that, for any N ∈ ℕ, the penalty on the N-th order differences leads to an estimator f̂ that can adapt to the number of jumps in the (N-1)-th order differences.

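For N = 1 this penalty is the total variation of f̂ (the fused lasso on a path); for general N it is the N-th order difference penalty familiar from trend filtering. As a concrete illustration only, the following is a minimal sketch of the penalized least squares problem, assuming numpy and cvxpy are available; the simulated signal, the order N, and the penalty weight lam are hypothetical choices for demonstration and are not taken from the paper.

    # Illustrative sketch (not the authors' code): N-th order total variation
    # regularized least squares on a path, solved with cvxpy.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n, N, lam = 200, 2, 5.0                    # sample size, difference order, penalty weight (hypothetical)
    x = np.linspace(0.0, 1.0, n)
    y = np.abs(x - 0.5) + 0.05 * rng.standard_normal(n)   # piecewise linear signal (one kink) plus noise

    # N-th order difference operator as an (n - N) x n matrix.
    D = np.diff(np.eye(n), n=N, axis=0)

    f = cp.Variable(n)
    loss = cp.sum_squares(y - f)               # least squares fit to the observations
    penalty = cp.norm1(D @ f)                  # l1 norm of the N-th order differences of f
    cp.Problem(cp.Minimize(loss + lam * penalty)).solve()
    f_hat = f.value                            # the penalized least squares estimate

With N = 2 the penalty is the l1 norm of the second-order differences, so the solution is piecewise linear and, as the abstract indicates, can adapt to the number of kinks (jumps in the first-order differences) of the underlying signal.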
