Consistency of the Bayes Estimator of a Regression Curve

07/19/2022
by Agustin G. Nogales, et al.

Strong consistency of the Bayes estimator of a regression curve is proved for the L^1-squared loss function. The convergence to 0 of the Bayes risk of this estimator is also shown for both the L^1 and L^1-squared loss functions. The Bayes estimator of a regression curve is the regression curve with respect to the posterior predictive distribution.
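To illustrate the last definition, the following is a minimal sketch (not taken from the paper) of the Bayes estimator in a hypothetical conjugate Normal model y = theta * x + eps: the regression curve with respect to the posterior predictive distribution reduces here to x times the posterior mean of theta. All names, the prior, and the parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conjugate setup (assumption, not from the paper):
#   y = theta * x + eps,  eps ~ N(0, sigma2),  prior theta ~ N(0, tau2)
sigma2, tau2, theta_true = 1.0, 10.0, 2.0
n = 200
x = rng.uniform(-1.0, 1.0, size=n)
y = theta_true * x + rng.normal(0.0, np.sqrt(sigma2), size=n)

# Closed-form conjugate update for the posterior of theta
posterior_precision = 1.0 / tau2 + (x @ x) / sigma2
theta_post_mean = ((x @ y) / sigma2) / posterior_precision

def bayes_regression_curve(x_new):
    """Regression curve under the posterior predictive distribution:
    m_hat(x_new) = E[Y | x_new, data] = E[theta | data] * x_new."""
    return theta_post_mean * x_new

print(theta_post_mean)  # should be close to theta_true for moderate n
```

With more data, theta_post_mean concentrates around the true slope, which is the finite-dimensional analogue of the consistency statement in the abstract.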


Related research

10/20/2021
A Note on Consistency of the Bayes Estimator of the Density
Under mild conditions, it is shown the strong consistency of the Bayes e...

02/11/2018
On the Rates of Convergence from Surrogate Risk Minimizers to the Bayes Optimal Classifier
We study the rates of convergence from empirical surrogate risk minimize...

08/03/2020
On Bayesian Estimation of Densities and Sampling Distributions: the Posterior Predictive Distribution as the Bayes Estimator
Optimality results for two outstanding Bayesian estimation problems are ...

08/05/2018
Dynamical multiple regression in function spaces, under kernel regressors, with ARH(1) errors
A linear multiple regression model in function spaces is formulated, und...

10/18/2021
The f-divergence and Loss Functions in ROC Curve
Given two data distributions and a test score function, the Receiver Ope...

07/02/2018
A new decision theoretic sampling plan for type-I and type-I hybrid censored samples from the exponential distribution
The study proposes a new decision theoretic sampling plan (DSP) for Type...

01/31/2023
Multicalibration as Boosting for Regression
We study the connection between multicalibration and boosting for square...
