Asymptotic Theory of Expectile Neural Networks

10/31/2020
by Jinghang Lin, et al.

Neural networks have become an increasingly important tool in applications, yet they are not widely used in statistical genetics. In this paper, we propose a new neural network method called expectile neural networks (ENN). When the number of parameters is too large, standard maximum likelihood procedures may not work, so we use the method of sieves to constrain the parameter space, and we prove the consistency and asymptotic normality of the resulting estimator under a nonparametric regression framework.
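For intuition, the following is a minimal sketch in PyTorch, assuming the standard expectile (asymmetric least squares) loss; the architecture, hyperparameters, and toy data are illustrative assumptions, not the authors' specification. The sieve constraint corresponds, roughly, to letting the hidden-layer width and weight bounds grow slowly with the sample size.

import torch
import torch.nn as nn

def expectile_loss(pred, target, tau):
    # Asymmetric squared error: residuals above the fit get weight tau,
    # residuals below get weight 1 - tau; tau = 0.5 recovers least squares.
    u = target - pred
    weight = torch.abs(tau - (u < 0).float())
    return (weight * u.pow(2)).mean()

# One-hidden-layer sigmoid network; under a sieve, the number of hidden
# units (here fixed at 8 for illustration) would increase with n.
net = nn.Sequential(nn.Linear(1, 8), nn.Sigmoid(), nn.Linear(8, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

x = torch.linspace(-2.0, 2.0, 200).unsqueeze(1)
y = x.pow(2) + 0.3 * torch.randn_like(x)       # toy regression data

for _ in range(500):
    opt.zero_grad()
    loss = expectile_loss(net(x), y, tau=0.8)  # fit the 0.8-expectile
    loss.backward()
    opt.step()

Varying tau traces out the conditional expectiles of the response, which is what distinguishes an expectile network from an ordinary least-squares fit.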
