A Variational View on Statistical Multiscale Estimation

06/10/2021
by Markus Haltmeier et al.

We present a unifying view on various statistical estimation techniques, including penalization, variational, and thresholding methods. These estimators are analyzed in the context of statistical linear inverse problems, with nonparametric regression, change point regression, and high-dimensional linear models as examples. Our approach reveals many seemingly unrelated estimation schemes as special instances of a general class of variational multiscale estimators, named MIND (MultIscale Nemirovskii–Dantzig). These estimators result from minimizing certain regularization functionals under convex constraints that can be interpreted as multiple statistical tests for local hypotheses. For computational purposes, we recast MIND in terms of simpler, unconstrained optimization problems via Lagrangian penalization as well as Fenchel duality. The performance of several MINDs is demonstrated on numerical examples.
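
To fix ideas, the following is a minimal sketch of the type of estimator described above; the notation is ours rather than taken from the paper, and the symbols R, K, phi_I, gamma_n, sigma, and lambda are generic placeholders. For observations Y = Kf + noise from a linear model, with a regularization functional R and a multiscale system of test functions {phi_I : I in I_n}, a MIND-type estimator minimizes R under a convex multiscale constraint, and Lagrangian penalization yields an unconstrained counterpart:

\[
\hat f \in \operatorname*{argmin}_{f} \, R(f)
\quad \text{subject to} \quad
\max_{I \in \mathcal{I}_n} \frac{\lvert \langle \phi_I,\, Y - K f \rangle \rvert}{\sigma \, \lVert \phi_I \rVert} \le \gamma_n,
\]
\[
\hat f_{\lambda} \in \operatorname*{argmin}_{f} \, \Big\{ R(f) + \lambda \, \max_{I \in \mathcal{I}_n} \frac{\lvert \langle \phi_I,\, Y - K f \rangle \rvert}{\sigma \, \lVert \phi_I \rVert} \Big\}.
\]

Each constraint acts as a local statistical test on the residual Y - Kf at the scale and location indexed by I, with the threshold gamma_n calibrated to the noise level sigma; the second, penalized form illustrates the kind of unconstrained reformulation the abstract refers to.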

Related research

10/20/2020  Variational Multiscale Nonparametric Regression: Algorithms and Implementation
Many modern statistically efficient methods come with tremendous computa...

05/21/2019  Total variation multiscale estimators for linear inverse problems
Even though the statistical theory of linear inverse problems is a well-...

05/21/2019  Semi-Lagrangian Subgrid Reconstruction for Advection-Dominant Multiscale Problems
We introduce a new framework of numerical multiscale methods for advecti...

01/23/2011  Statistical Multiresolution Dantzig Estimation in Imaging: Fundamental Concepts and Algorithmic Framework
In this paper we are concerned with fully automatic and locally adaptive...

08/11/2014  Optimum Statistical Estimation with Strategic Data Sources
We propose an optimum mechanism for providing monetary incentives to the...

12/23/2020  Data segmentation algorithms: Univariate mean change and beyond
Data segmentation a.k.a. multiple change point analysis has received con...

08/28/2021  Avoiding unwanted results in locally linear embedding: A new understanding of regularization
We demonstrate that locally linear embedding (LLE) inherently admits som...