Numerical Differentiation using local Chebyshev-Approximation

02/03/2021
by Stefan H. Reiterer, et al.

In applied mathematics, and especially in optimization, functions are often provided only as so-called "black boxes" by software packages or very complex algorithms, which makes automatic differentiation complicated or even impossible. One therefore seeks a numerical approximation of the derivative. Unfortunately, numerical differentiation is a difficult task in itself, and it is well known to be numerically unstable. There are many works on this topic, including the use of (global) Chebyshev approximations. Chebyshev approximations have the great property that they converge very fast if the function is smooth. Nevertheless, such approaches have several drawbacks: in practice functions are usually not globally smooth, and a global approximation requires many function evaluations. There is hope, however, since functions in real-world applications are typically smooth except at finitely many points, such as corners or edges. This motivates a local Chebyshev approach, in which the function is approximated only locally, so that the Chebyshev approximation still converges quickly to the desired function. We study such an approach in this work and provide a numerical example.
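To make the idea concrete, here is a minimal Python sketch of such a local approach, not the paper's actual implementation: the black-box function is sampled at Chebyshev nodes in a small window [x0 - h, x0 + h], a Chebyshev interpolant is fitted with NumPy, and its derivative is evaluated at x0. The window size h and the polynomial degree are illustrative choices, not values taken from the paper.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def cheb_derivative(f, x0, h=1e-2, degree=8):
    """Approximate f'(x0) by fitting a local Chebyshev interpolant
    on [x0 - h, x0 + h] and differentiating it.

    Sketch only: h and degree are illustrative tuning parameters.
    """
    # Chebyshev nodes of the first kind on [-1, 1]
    k = np.arange(degree + 1)
    nodes = np.cos((2 * k + 1) * np.pi / (2 * (degree + 1)))
    # Map nodes into the local window around x0 and sample the black box
    x = x0 + h * nodes
    y = np.asarray([f(xi) for xi in x])
    # Interpolate in the scaled variable t = (x - x0) / h ...
    coeffs = C.chebfit(nodes, y, degree)
    # ... differentiate the interpolant and apply the chain rule:
    # d/dx = (1/h) * d/dt, evaluated at t = 0 (i.e. x = x0)
    dcoeffs = C.chebder(coeffs)
    return C.chebval(0.0, dcoeffs) / h

# Example: f(x) = exp(x); the exact derivative at x0 = 1 is e
approx = cheb_derivative(np.exp, 1.0)
print(approx, np.e)
```

Because the interpolant lives only on a small window, points of non-smoothness elsewhere do not spoil the fast Chebyshev convergence inside the window, which is the motivation stated above.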
