On optimal convergence rates of spectral orthogonal projection approximation for functions of algebraic and logarithmic regularities

06/02/2020
by Shuhuang Xiang, et al.

Based on the Hilb-type formula relating Jacobi polynomials and Bessel functions, optimal decay rates for Jacobi expansion coefficients are derived by applying van der Corput-type lemmas to functions with logarithmic singularities, which leads to optimal convergence rates for the Jacobi, Gegenbauer, and Chebyshev orthogonal projections. Interestingly, for boundary singularities one obtains faster convergence of the Jacobi or Gegenbauer projection as (α,β) and λ increase: the larger the parameter values, the higher the achievable convergence rate. In particular, the Legendre projection converges half an order faster than the Chebyshev projection. Moreover, if min{α,β}>0 and λ>1/2, the Jacobi and Gegenbauer orthogonal projections have higher convergence orders than the Legendre projection. For an interior singularity, by contrast, the convergence order is independent of (α,β) and λ.
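The convergence rates above are driven by the algebraic decay of the expansion coefficients. As a hypothetical numerical illustration (the test function, node count, and fitting range below are our own choices, not taken from the paper), the following Python sketch estimates the decay exponent of the Chebyshev coefficients of a function with an algebraic boundary singularity:

```python
import numpy as np

def cheb_coeffs(f, n):
    """First-kind Chebyshev coefficients c_0..c_{n-1} of f on [-1, 1],
    computed by quadrature at Chebyshev points of the first kind."""
    k = np.arange(n)
    theta = (k + 0.5) * np.pi / n
    fx = f(np.cos(theta))                  # samples at Chebyshev nodes
    j = np.arange(n)
    c = (2.0 / n) * np.cos(np.outer(j, theta)) @ fx
    c[0] /= 2.0                            # standard halving of c_0
    return c

# f has an algebraic singularity at the boundary point x = -1.
f = lambda x: (1.0 + x) ** 1.5
c = cheb_coeffs(f, 2048)

# Fit log|c_j| against log j over a mid-range of indices to estimate
# the exponent r in |c_j| ~ j^{-r}.
idx = np.arange(20, 300)
slope = np.polyfit(np.log(idx), np.log(np.abs(c[idx])), 1)[0]
print(f"estimated decay exponent: {-slope:.2f}")   # close to 4
```

For f(x) = (1+x)^{3/2} the coefficients satisfy |c_j| = O(j^{-4}) (exponent 2·(3/2)+1), so the fitted exponent lands near 4, consistent with an algebraic convergence rate of the truncated projection.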


