Quantile Regression Neural Networks: A Bayesian Approach
This article introduces a Bayesian neural network estimation method for quantile regression that assumes an asymmetric Laplace distribution (ALD) for the response variable. It is shown that the posterior distribution for feedforward neural network quantile regression is asymptotically consistent under a misspecified ALD model. The consistency proof embeds the problem in the density estimation domain and uses bounds on the bracketing entropy to derive posterior consistency over Hellinger neighborhoods. The result holds in the setting where the number of hidden nodes grows with the sample size. The Bayesian implementation exploits the normal-exponential mixture representation of the ALD density, and the algorithm uses Markov chain Monte Carlo (MCMC) simulation, namely Gibbs sampling coupled with a Metropolis-Hastings step. We address the complexity of this MCMC implementation with respect to chain convergence, choice of starting values, and step sizes, and we illustrate the proposed method with simulation studies and real data examples.
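As a minimal sketch of the mixture representation mentioned above (not the authors' code), the ALD can be written as a normal-exponential mixture in the standard Kozumi-Kobayashi parameterization: y = mu + theta*w + psi*sqrt(sigma*w)*u with w ~ Exp(sigma) and u ~ N(0, 1), where theta = (1 - 2*tau) / (tau*(1 - tau)) and psi^2 = 2 / (tau*(1 - tau)). This is what makes Gibbs updates tractable, since y is conditionally Gaussian given the latent w. The function below only demonstrates the sampling identity; the variable names are illustrative.

```python
import numpy as np

def ald_mixture_sample(n, mu=0.0, sigma=1.0, tau=0.5, seed=None):
    """Draw ALD(mu, sigma, tau) variates via the normal-exponential mixture:
    y = mu + theta*w + psi*sqrt(sigma*w)*u, with w ~ Exp(sigma), u ~ N(0, 1).
    The tau-th quantile of the resulting distribution is mu."""
    rng = np.random.default_rng(seed)
    theta = (1.0 - 2.0 * tau) / (tau * (1.0 - tau))
    psi = np.sqrt(2.0 / (tau * (1.0 - tau)))
    w = rng.exponential(scale=sigma, size=n)   # latent exponential mixing variable
    u = rng.standard_normal(n)                 # standard normal shock
    return mu + theta * w + psi * np.sqrt(sigma * w) * u

# Sanity check: the empirical tau-quantile of the draws should sit near mu.
y = ald_mixture_sample(400_000, mu=1.5, sigma=1.0, tau=0.25, seed=0)
print(abs(np.quantile(y, 0.25) - 1.5) < 0.05)
```

In a Gibbs sampler built on this representation, one would alternate between updating the latent w (generalized inverse Gaussian conditionals) and the network parameters given w, with Metropolis-Hastings steps where conditionals are not available in closed form.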