
An Exact Reformulation of Feature-Vector-based Radial-Basis-Function Networks for Graph-based Observations
Radial-basis-function networks are traditionally defined for sets of vector-based observations. In this short paper, we reformulate such networks so that they can be applied to adjacency-matrix representations of weighted, directed graphs that represent the relationships between object pairs. We restate the sum-of-squares objective function so that it is purely dependent on entries from the adjacency matrix. From this objective function, we derive a gradient descent update for the network weights. We also derive a gradient update that simulates the repositioning of the radial basis prototypes and changes in the radial basis prototype parameters. An important property of our radial basis function networks is that they are guaranteed to yield the same responses as conventional radial-basis networks trained on a corresponding vector realization of the relationships encoded by the adjacency matrix. Such a vector realization only needs to provably exist for this property to hold, which occurs whenever the relationships correspond to distances from some arbitrary metric applied to a latent set of vectors. We therefore completely avoid needing to actually construct vectorial realizations via multidimensional scaling, which ensures that the underlying relationships are totally preserved.
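The equivalence claimed above can be illustrated with a minimal sketch: when the radial basis prototypes are chosen among the observations themselves, every hidden activation depends only on pairwise distances, so the network can be evaluated directly from a distance (adjacency) matrix without ever touching the coordinates. The Gaussian kernel, the variable names, and the choice of prototypes here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))                   # latent vector realization: 6 points in R^3
prototype_idx = [0, 2, 4]                         # prototypes picked from the data (assumption)
W = rng.standard_normal((len(prototype_idx), 2))  # output-layer weights
gamma = 0.5                                       # Gaussian kernel width (assumption)

def rbf_from_vectors(X, idx, W, gamma):
    """Conventional vector-based RBF network evaluation."""
    d = np.linalg.norm(X[:, None, :] - X[idx][None, :, :], axis=-1)
    return np.exp(-gamma * d**2) @ W

def rbf_from_distances(D, idx, W, gamma):
    """Adjacency-matrix-based evaluation: only entries of the pairwise
    distance matrix D are used, never the underlying coordinates."""
    return np.exp(-gamma * D[:, idx]**2) @ W

# Pairwise distance matrix playing the role of the weighted adjacency matrix.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

out_vec = rbf_from_vectors(X, prototype_idx, W, gamma)
out_adj = rbf_from_distances(D, prototype_idx, W, gamma)
assert np.allclose(out_vec, out_adj)  # identical responses, as the paper guarantees
```

As in the abstract, the vector realization `X` only has to exist: once `D` is a metric derived from some latent vectors, the distance-based evaluation reproduces the vector-based responses exactly, with no multidimensional-scaling reconstruction.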