Numerical Integration Method for Training Neural Network

02/02/2019
by Sho Sonoda, et al.

We propose a new numerical integration method for training a shallow neural network via the ridgelet transform, with a fast convergence guarantee. Given a training dataset, the ridgelet transform provides the parameters of the neural network that attain the global optimum of the training problem. In other words, we can obtain the global minimizer of the training problem by numerically computing the ridgelet transform, instead of numerically solving the usual backpropagation-based optimization problem. We employ kernel quadrature as the basis of the numerical integration because it is known to converge faster, at rate O(1/p) in the number of hidden units p, than random methods such as Monte Carlo integration, which converge at rate O(1/√p). Kernel quadrature was originally developed for computing posterior means, where the measure is assumed to be a probability measure and the final product is a single number. Our problem, by contrast, is the computation of an integral transform, where the measure is in general a signed measure and the final product is a function. In addition, the performance of kernel quadrature is sensitive to the choice of its kernel. In this paper, we develop a generalized kernel quadrature method, applicable to signed measures, with a fast convergence guarantee in a function norm, and propose a natural choice of kernels.
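To make the kernel-quadrature ingredient concrete, here is a minimal sketch of *standard* kernel quadrature against a probability measure (the setting the abstract says the method generalizes), not the paper's signed-measure variant. It assumes a Gaussian kernel and a standard normal base measure, for which the kernel mean embedding has a closed form; the quadrature weights solve the linear system Kw = z, and the integral estimate is a weighted sum over the nodes. All names and parameter choices here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_quadrature_weights(nodes, gamma=1.0, jitter=1e-8):
    """Kernel quadrature weights for integrating against N(0, 1).

    Uses the Gaussian kernel k(x, y) = exp(-(x - y)^2 / (2 gamma^2)),
    whose mean embedding under N(0, 1) is available in closed form.
    """
    # Gram matrix of the kernel over the quadrature nodes
    diff = nodes[:, None] - nodes[None, :]
    K = np.exp(-diff**2 / (2 * gamma**2))
    # Kernel mean embedding z_j = E_{x ~ N(0,1)}[k(x, x_j)]
    z = gamma / np.sqrt(gamma**2 + 1) * np.exp(-nodes**2 / (2 * (gamma**2 + 1)))
    # Weights solve K w = z; a small jitter stabilises the solve
    return np.linalg.solve(K + jitter * np.eye(len(nodes)), z)

# p quadrature nodes sampled from the base measure
p = 50
nodes = rng.standard_normal(p)
w = kernel_quadrature_weights(nodes)

# Estimate E[x^2] under N(0, 1); the exact value is 1
estimate = float(w @ nodes**2)
```

In the paper's setting, the integrand would instead be a hidden-unit feature evaluated at sampled parameters, and the base measure would be the (signed) ridgelet-transform measure rather than a Gaussian, so this sketch only illustrates why the O(1/p)-type rate of kernel quadrature is attractive compared with plain Monte Carlo averaging over the same nodes.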


