Bounds in L^1 Wasserstein distance on the normal approximation of general M-estimators

11/18/2021
by François Bachoc et al.

We derive quantitative bounds, in the L^1 Wasserstein distance, on the rate of convergence of the normal approximation of general M-estimators, with an almost sharp behavior (up to a logarithmic term) in the number of observations. We focus on situations where the estimator does not have an explicit expression as a function of the data. The general method can be applied even when the observations are not independent. Our main application is a rate of convergence for cross-validation estimation of covariance parameters of Gaussian processes.
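For readers skimming the abstract, the short LaTeX sketch below recalls the standard definition of the L^1 Wasserstein distance, the generic form of an M-estimator, and the rough shape of a normal-approximation bound that is almost sharp up to a logarithmic factor. It uses only textbook definitions; the constant C, the limiting covariance Sigma and the logarithmic exponent a are placeholders for illustration and are not taken from the paper's precise statement.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% L^1 Wasserstein (Kantorovich) distance between probability measures mu, nu,
% with its Kantorovich-Rubinstein dual form over 1-Lipschitz test functions:
\[
  W_1(\mu,\nu)
  = \inf_{\pi \in \Pi(\mu,\nu)} \int \| x - y \| \, \mathrm{d}\pi(x,y)
  = \sup_{\operatorname{Lip}(h) \le 1} \left| \int h \, \mathrm{d}\mu - \int h \, \mathrm{d}\nu \right| .
\]

% Generic M-estimator: a minimizer of an empirical criterion M_n built from n
% observations, typically with no closed-form expression as a function of the data:
\[
  \hat{\theta}_n \in \operatorname*{arg\,min}_{\theta \in \Theta} M_n(\theta).
\]

% Shape of the kind of bound the abstract describes (C, Sigma and the logarithmic
% exponent a are placeholders, not the paper's theorem):
\[
  W_1\bigl( \sqrt{n}\,(\hat{\theta}_n - \theta_0),\; \mathcal{N}(0,\Sigma) \bigr)
  \le C \, \frac{(\log n)^{a}}{\sqrt{n}} .
\]

\end{document}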

Related research:

05/11/2020: Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator
06/26/2022: The Sketched Wasserstein Distance for mixture distributions
04/01/2019: Optimal Fusion of Elliptic Extended Target Estimates based on the Wasserstein Distance
11/04/2021: Rate of Convergence of Polynomial Networks to Gaussian Processes
11/05/2020: Statistical analysis of Wasserstein GANs with applications to time series forecasting
04/20/2023: Minimum Φ-distance estimators for finite mixing measures
