Acceleration of the kernel herding algorithm by improved gradient approximation
Kernel herding is a method for constructing quadrature formulas in a reproducing kernel Hilbert space. Although kernel herding has advantages such as numerically stable quadrature and efficient output of nodes and weights, the convergence of its worst-case integration error is slow compared with other quadrature methods. To address this problem, we propose two improved versions of the kernel herding algorithm. The key idea behind both algorithms is to approximate negative gradients by a positive linear combination of vertex directions. We analyzed the convergence and validity of both algorithms theoretically; in particular, we showed that the quality of the negative-gradient approximation directly influences the convergence speed. In addition, through numerical experiments, we confirmed accelerated convergence of the worst-case integration error with respect to both the number of nodes and the computation time.
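For background, here is a minimal sketch of the standard (equal-weight) kernel herding iteration that the proposed algorithms build on. The improved gradient-approximation variants are not specified in this abstract and are not reproduced here; all names (`rbf_kernel`, `kernel_herding`, the candidate grid, the bandwidth) are illustrative assumptions rather than the authors' implementation.

```python
# Sketch of standard kernel herding: greedily pick nodes so that the
# equal-weight quadrature tracks the kernel mean embedding mu_p.
# Hypothetical setup: Gaussian kernel, uniform target on [0, 1],
# mean embedding estimated empirically on a candidate grid.
import numpy as np

def rbf_kernel(x, y, bandwidth=0.2):
    """Gaussian (RBF) kernel matrix k(x_i, y_j) for 1-D point sets."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * bandwidth ** 2))

def kernel_herding(candidates, mean_embedding, n_nodes):
    """At step t, select argmax_x [ mu_p(x) - (1/(t+1)) * sum_i k(x, x_i) ],
    i.e. the candidate where the current quadrature most underestimates
    the kernel mean embedding."""
    nodes = []
    running_sum = np.zeros(len(candidates))  # sum_i k(candidates, x_i)
    for t in range(n_nodes):
        score = mean_embedding - running_sum / (t + 1)
        idx = int(np.argmax(score))
        nodes.append(candidates[idx])
        running_sum += rbf_kernel(candidates, candidates[idx:idx + 1])[:, 0]
    return np.array(nodes)

# Usage: herd 10 nodes for the uniform distribution on [0, 1].
grid = np.linspace(0.0, 1.0, 512)         # candidate points
mu = rbf_kernel(grid, grid).mean(axis=1)  # empirical mean embedding mu_p
print(kernel_herding(grid, mu, n_nodes=10))
```

This greedy scheme converges slowly in the worst-case integration error, which is the limitation the paper's gradient-approximation variants are designed to overcome.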