The QR decomposition for radial neural networks

07/06/2021
by Iordan Ganev, et al.

We provide a theoretical framework for neural networks in terms of the representation theory of quivers, revealing symmetries of the parameter space. Exploiting these symmetries yields a model compression algorithm for radial neural networks based on an analogue of the QR decomposition. A projected version of backpropagation on the original model matches ordinary backpropagation on the compressed model.
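
Since the abstract only sketches the compression idea, the following is a minimal illustration in NumPy; the names, dimensions, and choice of activation are illustrative assumptions, not the authors' exact algorithm. The point it demonstrates: a radial rescaling activation commutes with any matrix whose columns are orthonormal, so the Q factor of a hidden layer's QR decomposition can be pushed through the activation and absorbed into the next layer, shrinking the hidden width without changing the network's output.

import numpy as np

def radial(x, f=np.tanh):
    # Radial rescaling activation: x -> (f(|x|)/|x|) * x, with 0 fixed at the origin.
    r = np.linalg.norm(x)
    return (f(r) / r) * x if r > 0 else x

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 16, 3             # deliberately over-wide hidden layer
W1 = rng.standard_normal((n_hidden, n_in))
W2 = rng.standard_normal((n_out, n_hidden))

# Reduced QR: W1 = Q @ R, where Q (n_hidden x n_in) has orthonormal columns.
Q, R = np.linalg.qr(W1)

def original(x):
    return W2 @ radial(W1 @ x)

def compressed(x):
    # Because |Q y| = |y|, we have radial(Q @ y) == Q @ radial(y),
    # so Q merges into W2 and the hidden width drops from n_hidden to n_in.
    return (W2 @ Q) @ radial(R @ x)

x = rng.standard_normal(n_in)
print(np.allclose(original(x), compressed(x)))   # True

The projected-gradient statement in the abstract is the training-time counterpart of this identity; the sketch above only checks the forward pass.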

research  07/26/2022
Quiver neural networks
We develop a uniform theoretical approach towards the analysis of variou...

research  10/31/2020
Asymptotic Theory of Expectile Neural Networks
Neural networks are becoming an increasingly important tool in applicati...

research  02/07/2019
Radial and Directional Posteriors for Bayesian Neural Networks
We propose a new variational family for Bayesian neural networks. We dec...

research  04/05/2023
On the universal approximation property of radial basis function neural networks
In this paper we consider a new class of RBF (Radial Basis Function) neu...

research  04/13/2023
Physics-informed radial basis network (PIRBN): A local approximation neural network for solving nonlinear PDEs
Our recent intensive study has found that physics-informed neural networ...

research  08/01/2023
Statistical methods for exoplanet detection with radial velocities
Exoplanets can be detected with various observational techniques. Among ...

research  02/12/2020
Rembrandts and Robots: Using Neural Networks to Explore Authorship in Painting
We use convolutional neural networks to analyze authorship questions sur...
