Separable Gaussian Neural Networks: Structure, Analysis, and Function Approximations

08/13/2023
by Siyuan Xing, et al.

The Gaussian radial basis function neural network (GRBFNN) has been a popular choice for interpolation and classification. However, it is computationally intensive when the dimension of the input vector is high. To address this issue, we propose a new feedforward network, the Separable Gaussian Neural Network (SGNN), which takes advantage of the separable property of Gaussian functions: the input data is split into columns that are fed sequentially into parallel layers formed by univariate Gaussian functions. This structure reduces the number of neurons from O(N^d) in GRBFNN to O(dN), so the computational cost of SGNN grows linearly rather than exponentially with the input dimension. In addition, SGNN preserves the dominant subspace of the Hessian matrix of GRBFNN during gradient-descent training, leading to a level of accuracy similar to that of GRBFNN. Experiments show that SGNN achieves a 100-fold speedup over GRBFNN on tri-variate function approximations while attaining comparable accuracy. SGNN also has better trainability and is more tuning-friendly than DNNs with ReLU and Sigmoid activations. For approximating functions with complex geometry, SGNN can be three orders of magnitude more accurate than a ReLU-DNN with twice as many layers and twice as many neurons per layer.
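To make the layer structure concrete, below is a minimal PyTorch sketch of one plausible reading of the construction described in the abstract: each hidden layer owns N univariate Gaussian neurons that see only one input coordinate, and consecutive layers are connected by a dense mixing matrix whose output is gated elementwise by the next coordinate's Gaussians. The class and parameter names (SGNN, centers, widths, mix, readout) are illustrative assumptions, not the authors' code, and the exact wiring may differ from the paper.

```python
import torch
import torch.nn as nn

class SGNN(nn.Module):
    """Sketch of a Separable Gaussian Neural Network (assumed architecture).

    Layer l applies N univariate Gaussians to input coordinate x_l only,
    so the network holds d*N Gaussian neurons instead of the N**d neurons
    of a full GRBFNN grid.
    """

    def __init__(self, dim: int, neurons: int):
        super().__init__()
        self.dim = dim
        # Trainable centers and widths of the univariate Gaussians,
        # one row of `neurons` parameters per input coordinate.
        self.centers = nn.Parameter(
            torch.linspace(-1.0, 1.0, neurons).repeat(dim, 1))
        self.widths = nn.Parameter(torch.full((dim, neurons), 0.5))
        # Dense mixing weights between consecutive layers (hypothetical).
        self.mix = nn.ParameterList(
            nn.Parameter(torch.randn(neurons, neurons) / neurons ** 0.5)
            for _ in range(dim - 1))
        self.readout = nn.Linear(neurons, 1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim); coordinate l feeds only layer l's Gaussians.
        def gauss(l: int) -> torch.Tensor:
            z = (x[:, l:l + 1] - self.centers[l]) / self.widths[l]
            return torch.exp(-0.5 * z * z)  # (batch, neurons)

        h = gauss(0)
        for l in range(1, self.dim):
            # Mix the previous layer, then gate elementwise with the
            # Gaussians of coordinate l; this product structure is what
            # exploits the separability exp(-a-b) = exp(-a) * exp(-b).
            h = (h @ self.mix[l - 1]) * gauss(l)
        return self.readout(h)

model = SGNN(dim=3, neurons=20)  # 3*20 = 60 Gaussians vs 20**3 = 8000 on a grid
y = model(torch.rand(128, 3))    # (128, 1)
```

Under these assumptions, the neuron count of the sketch is d*N while the function class still contains sums of separable products of univariate Gaussians, which is the source of the claimed speedup over a full GRBFNN grid.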


