Deep Convolutional Neural Networks with Unitary Weights

02/23/2021
by Hao-Yuan Chang, et al.

While normalizations aim to fix the exploding and vanishing gradient problem in deep neural networks, they have drawbacks in speed or accuracy because of their dependency on the data set statistics. This work is a comprehensive study of a novel method based on unitary synaptic weights derived from Lie groups to construct intrinsically stable neural systems. Here we show that unitary convolutional neural networks deliver up to 32% faster inference speeds while maintaining competitive prediction accuracy. Unlike prior art, which is restricted to square synaptic weights, we expand unitary networks to weights of any size and dimension.
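
The core idea behind unitary weights can be illustrated with the exponential map from a Lie algebra to its Lie group. The sketch below (NumPy/SciPy, not the authors' implementation) builds an orthogonal matrix from a skew-symmetric generator and checks that it preserves vector norms, which is the property that keeps gradients from exploding or vanishing. The semi-orthogonal slice at the end is one common way to obtain a non-square weight and is an assumption for illustration, not necessarily the construction used in the paper.

    # Minimal sketch: orthogonal (real unitary) weights via the exponential
    # map from the Lie algebra so(n) to the Lie group SO(n).
    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(0)
    n = 8

    # Arbitrary generator in the Lie algebra: A is skew-symmetric, A.T == -A.
    M = rng.standard_normal((n, n))
    A = M - M.T

    # Exponential map yields an orthogonal matrix: Q.T @ Q == I.
    Q = expm(A)
    assert np.allclose(Q.T @ Q, np.eye(n), atol=1e-10)

    # Multiplying by Q preserves the l2 norm of activations and gradients.
    x = rng.standard_normal(n)
    print(np.linalg.norm(x), np.linalg.norm(Q @ x))  # equal up to rounding

    # Hypothetical non-square case: taking m rows of Q gives a semi-orthogonal
    # weight W with W @ W.T == I_m (an assumption here, not the paper's method).
    m = 5
    W = Q[:m, :]
    assert np.allclose(W @ W.T, np.eye(m), atol=1e-10)
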


Related research

06/15/2018 · Bayesian Convolutional Neural Networks
We propose a Bayesian convolutional neural network built upon Bayes by B...

10/27/2022 · On the biological plausibility of orthogonal initialisation for solving gradient instability in deep neural networks
Initialising the synaptic weights of artificial neural networks (ANNs) w...

02/26/2020 · Predicting Neural Network Accuracy from Weights
We study the prediction of the accuracy of a neural network given only i...

09/25/2018 · Non-Iterative Knowledge Fusion in Deep Convolutional Neural Networks
Incorporation of new knowledge into neural networks with simultaneous ...

10/26/2017 · On the role of synaptic stochasticity in training low-precision neural networks
Stochasticity and limited precision of synaptic weights in neural networ...

02/19/2021 · A Projection Algorithm for the Unitary Weights
Unitary neural networks are promising alternatives for solving the explo...

10/07/2013 · Mean Field Bayes Backpropagation: scalable training of multilayer neural networks with binary weights
Significant success has been reported recently using deep neural network...
