Structural Risk Minimization for C^1,1(R^d) Regression

03/29/2018
by Adam Gustafson, et al.

One means of fitting functions to high-dimensional data is to impose smoothness constraints. Recently, the following smooth function approximation problem was posed by Herbert-Voss et al. (herbert2014computing): given a finite set E ⊂ R^d and a function f: E → R, interpolate the given information with a function f̂ ∈ Ċ^{1,1}(R^d) (the class of first-order differentiable functions with Lipschitz gradients) such that f̂(a) = f(a) for all a ∈ E, and the value of Lip(∇f̂) is minimal. An algorithm is provided that constructs such an approximating function f̂ and estimates the optimal Lipschitz constant Lip(∇f̂) in the noiseless setting. We address statistical aspects of reconstructing the approximating function f̂ from the closely related class C^{1,1}(R^d) given samples of noisy data. We observe independent and identically distributed samples y(a) = f(a) + ξ(a) for a ∈ E, where ξ(a) is a noise term and the set E ⊂ R^d is fixed and known. We obtain uniform bounds relating the empirical risk to the true risk over the class F_M = {f̂ ∈ C^{1,1}(R^d) | Lip(∇f̂) ≤ M}, where the quantity M grows with the number of samples at a rate governed by the metric entropy of the class C^{1,1}(R^d). Finally, we provide an implementation using Vaidya's algorithm and support our results with numerical experiments on simulated data.
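To make the observation model concrete, the sketch below (all names and choices are hypothetical, not the paper's implementation) draws i.i.d. noisy samples y(a) = f(a) + ξ(a) on a fixed design E ⊂ R^2 for a smooth target f, fits a simple candidate estimator, and compares its empirical risk (measured against the noisy labels) with its true risk (measured against f) at the design points:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed, known design set E ⊂ R^2 (an arbitrary illustrative choice).
E = rng.uniform(-1.0, 1.0, size=(200, 2))

# A smooth target f ∈ C^{1,1}(R^2): its gradient is Lipschitz.
def f(x):
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

# Noisy observation model: y(a) = f(a) + ξ(a) with ξ(a) i.i.d. Gaussian.
sigma = 0.1
y = f(E) + sigma * rng.normal(size=len(E))

# A crude candidate estimator f̂ — here a ridge-regularized quadratic fit,
# standing in for the paper's Vaidya's-algorithm-based solver over F_M.
Phi = np.column_stack([np.ones(len(E)), E, E ** 2, E[:, 0] * E[:, 1]])
coef = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(Phi.shape[1]), Phi.T @ y)
f_hat = Phi @ coef

# Empirical risk uses the noisy labels y; true risk uses the unobserved f.
empirical_risk = np.mean((f_hat - y) ** 2)
true_risk = np.mean((f_hat - f(E)) ** 2)
print(empirical_risk, true_risk)
```

The gap between the two quantities is exactly what the paper's uniform bounds over F_M control: here the empirical risk is inflated by roughly the noise variance σ², while the true risk reflects only the estimator's approximation and estimation error.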


