Neural networks with trainable matrix activation functions

09/21/2021
by Yuwen Li, et al.

The training process of neural networks usually optimizes the weights and bias parameters of linear transformations, while the nonlinear activation functions are pre-specified and fixed. This work develops a systematic approach to constructing matrix activation functions whose entries are generalized from ReLU. The activation is based on matrix-vector multiplication using only scalar multiplications and comparisons. The proposed activation functions depend on parameters that are trained along with the weights and bias vectors. Neural networks based on this approach are simple and efficient, and they are shown to be robust in numerical experiments.
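As a rough illustration of the idea described in the abstract, the sketch below implements a trainable diagonal matrix activation in PyTorch: each output entry is d_i(x_i) * x_i, where the slope d_i is selected by comparing x_i against fixed breakpoints, and the slopes themselves are trained jointly with the weights and biases. The class name, breakpoint placement, and ReLU-like initialization are illustrative assumptions, not the paper's exact construction.

import torch
import torch.nn as nn

class TrainableMatrixActivation(nn.Module):
    """sigma(x)_i = d_i(x_i) * x_i, where each d_i is piecewise constant in
    x_i and its values are trained along with the weights and biases."""

    def __init__(self, num_features, breakpoints=(-1.0, 0.0, 1.0)):
        super().__init__()
        self.register_buffer("breakpoints", torch.tensor(breakpoints))
        num_intervals = len(breakpoints) + 1
        # One trainable slope per feature and interval; initialized so the
        # activation starts as ReLU (slope 0 left of zero, slope 1 right of it).
        init = torch.zeros(num_features, num_intervals)
        init[:, num_intervals // 2:] = 1.0
        self.slopes = nn.Parameter(init)

    def forward(self, x):
        # Comparisons only: locate the interval containing each entry of x.
        idx = torch.bucketize(x, self.breakpoints)             # (batch, features)
        s = self.slopes.unsqueeze(0).expand(x.size(0), -1, -1)
        d = torch.gather(s, 2, idx.unsqueeze(-1)).squeeze(-1)  # diagonal of D(x)
        return d * x                                           # D(x) @ x, D diagonal

# Usage: a drop-in replacement for nn.ReLU; the slopes receive gradients
# and are optimized together with the linear layers' parameters.
net = nn.Sequential(nn.Linear(8, 16), TrainableMatrixActivation(16), nn.Linear(16, 1))
y = net(torch.randn(4, 8))

Because the interval selection uses comparisons alone, the forward pass in this sketch costs essentially one extra scalar multiplication per entry compared with plain ReLU, consistent with the abstract's claim that only scalar multiplications and comparisons are needed.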


Related research

06/17/2021
Orthogonal-Padé Activation Functions: Trainable Activation functions for smooth and faster convergence in deep networks
We have proposed orthogonal-Padé activation functions, which are trainab...

08/07/2022
Transmission Neural Networks: From Virus Spread Models to Neural Networks
This work connects models for virus spread on networks with their equiva...

12/02/2019
A Random Matrix Perspective on Mixtures of Nonlinearities for Deep Learning
One of the distinguishing characteristics of modern deep learning system...

05/03/2018
Lifted Neural Networks
We describe a novel family of models of multi-layer feedforward neural ...

04/25/2022
Trainable Compound Activation Functions for Machine Learning
Activation functions (AF) are necessary components of neural networks th...

07/11/2023
Using Linear Regression for Iteratively Training Neural Networks
We present a simple linear regression based approach for learning the we...

06/30/2019
Robust and Resource Efficient Identification of Two Hidden Layer Neural Networks
We address the structure identification and the uniform approximation of...
