Trainable back-propagated functional transfer matrices

10/28/2017
by Cheng-Hao Cai, et al.

Connections between nodes of fully connected neural networks are usually represented by weight matrices. In this article, functional transfer matrices are introduced as alternatives to weight matrices: instead of real-valued weights, a functional transfer matrix uses real functions with trainable parameters to represent connections between nodes. Multiple functional transfer matrices are then stacked together with bias vectors and activations to form deep functional transfer neural networks. These networks can be trained within the back-propagation framework, using revised delta rules and a revised error transmission rule for functional connections. Experiments demonstrate that the revised rules can train a wide range of functional connections: 20 different functions are applied to neural networks with up to 10 hidden layers, and most of them achieve high test accuracy on the MNIST database. It is also demonstrated that a functional transfer matrix with a memory function can roughly memorise a non-cyclical sequence of 400 digits.
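To make the idea concrete, below is a minimal PyTorch sketch of one functional transfer layer, where each connection applies a trainable function f_ij(x_j) instead of multiplying by a scalar weight. Everything here is an illustrative assumption: the sine family f_ij(x) = a_ij * sin(b_ij * x), the parameter names, and the layer sizes are not taken from the paper (which studies 20 function families and derives the revised delta rules by hand); autograd stands in for those hand-derived rules.

```python
import torch
import torch.nn as nn

class FunctionalTransferLayer(nn.Module):
    """One functional transfer matrix: the connection from input node j
    to output node i applies a parameterised function f_ij rather than a
    scalar weight. The family f_ij(x) = a_ij * sin(b_ij * x) used here is
    only illustrative; the paper evaluates 20 different functions."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # One (a, b) pair per connection, i.e. two trainable parameters
        # where an ordinary dense layer has a single weight.
        self.a = nn.Parameter(0.1 * torch.randn(out_features, in_features))
        self.b = nn.Parameter(0.1 * torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in) -> (batch, 1, in) so the per-connection
        # functions broadcast against the (out, in) parameter matrices.
        fx = self.a * torch.sin(self.b * x.unsqueeze(1))
        # Sum each row of function outputs, as a dense layer sums w_ij * x_j.
        return fx.sum(dim=-1) + self.bias

# Stack functional transfer matrices with activations, analogous to the
# paper's deep functional transfer neural networks (sizes are arbitrary).
net = nn.Sequential(
    FunctionalTransferLayer(784, 128), nn.Tanh(),
    FunctionalTransferLayer(128, 10),
)

# Autograd differentiates through each f_ij, which is what the revised
# delta rules compute explicitly: df_ij/da_ij and df_ij/db_ij for the
# parameter updates, and df_ij/dx_j for error transmission between layers.
x = torch.randn(32, 784)
loss = nn.functional.cross_entropy(net(x), torch.randint(0, 10, (32,)))
loss.backward()
```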

Related research

03/01/2017  Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank
    Recently low displacement rank (LDR) matrices, or so-called structured m...

10/23/2022  Functional Indirection Neural Estimator for Better Out-of-distribution Generalization
    The capacity to achieve out-of-distribution (OOD) generalization is a ha...

03/14/2019  Tucker Tensor Layer in Fully Connected Neural Networks
    We introduce the Tucker Tensor Layer (TTL), an alternative to the dense ...

09/22/2015  Tensorizing Neural Networks
    Deep neural networks currently demonstrate state-of-the-art performance ...

03/06/2020  Finding online neural update rules by learning to remember
    We investigate learning of the online local update rules for neural acti...

06/15/2020  Finding trainable sparse networks through Neural Tangent Transfer
    Deep neural networks have dramatically transformed machine learning, but...

04/29/2021  Learning in Feedforward Neural Networks Accelerated by Transfer Entropy
    Current neural network architectures are many times harder to train bec...
