
An Exact Reformulation of Feature-Vector-based Radial-Basis-Function Networks for Graph-based Observations
Radial-basis-function networks are traditionally defined for sets of vector-based observations. In this short paper, we reformulate such networks so that they can be applied to adjacency-matrix representations of weighted, directed graphs that capture the relationships between object pairs. We restate the sum-of-squares objective function so that it depends purely on entries of the adjacency matrix. From this objective function, we derive a gradient-descent update for the network weights. We also derive a gradient update that simulates the repositioning of the radial-basis prototypes and changes in the prototype parameters. An important property of our radial-basis-function networks is that they are guaranteed to yield the same responses as conventional radial-basis networks trained on a corresponding vector realization of the relationships encoded by the adjacency matrix. Such a vector realization only needs to provably exist for this property to hold, which occurs whenever the relationships correspond to distances from some arbitrary metric applied to a latent set of vectors. We therefore completely avoid constructing vector realizations via multidimensional scaling, while ensuring that the underlying relationships are fully preserved.
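To illustrate the core idea, here is a minimal sketch of an RBF network whose hidden layer is driven directly by entries of a pairwise-distance (adjacency) matrix, with gradient descent on a sum-of-squares objective for the output weights. All names, the Gaussian basis choice, the restriction of prototypes to observed nodes, and the learning-rate value are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

# Hypothetical sketch: the latent vectors below exist ONLY to build a metric
# distance matrix D; the network itself never touches them, consuming D alone.
rng = np.random.default_rng(0)
n = 6                                    # number of nodes / observations
X_latent = rng.normal(size=(n, 2))       # latent vectors (never used by the net)
D = np.linalg.norm(X_latent[:, None] - X_latent[None, :], axis=-1)

prototypes = [0, 3]                      # prototype indices (assumed choice)
beta = 1.0                               # shared Gaussian width parameter
w = rng.normal(size=len(prototypes))     # output weights
y = rng.normal(size=n)                   # regression targets

def forward(D, w):
    # Gaussian activations depend only on adjacency-matrix entries D[i, p].
    Phi = np.exp(-beta * D[:, prototypes] ** 2)  # shape (n, num_prototypes)
    return Phi, Phi @ w

_, pred0 = forward(D, w)
sse_init = np.sum((pred0 - y) ** 2)

# Gradient-descent update for the output weights on the sum-of-squares
# objective, expressed purely through entries of D.
lr = 0.05
for _ in range(200):
    Phi, pred = forward(D, w)
    w -= lr * Phi.T @ (pred - y)

Phi, pred = forward(D, w)
sse = np.sum((pred - y) ** 2)
```

Because the activations read distances straight from `D`, this sketch behaves identically to a conventional RBF network trained on any vector realization of those distances, which is the property the abstract emphasizes.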