Approximation with Conditionally Positive Definite Kernels on Deficient Sets

06/24/2020
by Oleg Davydov, et al.

Interpolation and approximation of functionals with conditionally positive definite kernels is considered on sets of centers that are not determining for polynomials. It is shown that polynomial consistency of the functional is sufficient to define a kernel-based numerical approximation with the usual properties of optimal recovery. Application examples include the generation of sparse kernel-based numerical differentiation formulas for the Laplacian on a grid and the accurate approximation of a function on an ellipse.
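
The Laplacian example admits a compact illustration: on the classical five-point grid stencil the centers are not unisolvent for quadratic polynomials, yet the Laplacian functional is polynomially consistent there, so kernel-based weights can still be computed. The sketch below is a generic RBF-FD-style construction under stated assumptions (polyharmonic kernel phi(r) = r^3, quadratic polynomial augmentation, least-squares solve of the possibly rank-deficient saddle-point system), not the authors' specific algorithm; the function name laplacian_weights and all parameters are illustrative.

import numpy as np

def laplacian_weights(centers, z, deg=2):
    """Weights w with sum_j w_j f(x_j) ~ (Laplacian f)(z) for 2D points."""
    x = np.asarray(centers, float) - np.asarray(z, float)  # shift z to the origin
    n = len(x)
    # conditionally positive definite polyharmonic kernel phi(r) = r^3 (order 2)
    r = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    A = r ** 3
    # polynomial block: monomials x^i y^j of total degree <= deg at the centers
    exps = [(i, k - i) for k in range(deg + 1) for i in range(k + 1)]
    P = np.array([[xi ** i * yi ** j for (i, j) in exps] for (xi, yi) in x])
    m = P.shape[1]
    # right-hand side: Laplacian of the kernel and of the monomials at the origin
    rz = np.linalg.norm(x, axis=1)
    rhs_A = 9.0 * rz  # Laplacian of r^3 in two dimensions is 9 r
    rhs_P = np.array([2.0 if (i, j) in ((2, 0), (0, 2)) else 0.0 for (i, j) in exps])
    # saddle-point system enforcing exactness on the polynomial space; it may be
    # rank-deficient when the centers are not unisolvent for that space, but it
    # remains consistent under polynomial consistency of the functional, so a
    # least-squares solve still returns exact weights
    K = np.block([[A, P], [P.T, np.zeros((m, m))]])
    rhs = np.concatenate([rhs_A, rhs_P])
    return np.linalg.lstsq(K, rhs, rcond=None)[0][:n]

# usage: five-point grid stencil with spacing h; only five centers, so the set
# is deficient for quadratic polynomials (unisolvency would need six points)
h = 0.1
pts = h * np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]], float)
w = laplacian_weights(pts, np.zeros(2))
print(w @ (pts ** 2).sum(axis=1))  # Laplacian of x^2 + y^2 is 4; prints ~4.0

On this stencil the computed weights reduce to the classical five-point finite difference formula, which is the kind of sparse kernel-based differentiation formula the abstract refers to.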

