Reducing Parameter Space for Neural Network Training

05/22/2018
by Tong Qin, et al.

For neural networks (NNs) with rectified linear unit (ReLU) or binary activation functions, we show that training can be accomplished in a reduced parameter space. Specifically, the weights in each neuron can be trained on the unit sphere, as opposed to the entire space, and the threshold can be trained in a bounded interval, as opposed to the real line. We show that NNs in the reduced parameter space are mathematically equivalent to standard NNs with parameters in the whole space. The reduced parameter space facilitates the optimization procedure for network training, as the search space becomes (much) smaller. We demonstrate the improved training performance using numerical examples.
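The equivalence claim for ReLU neurons can be made plausible with a short numerical check. The sketch below (an illustration of the underlying identity, not the paper's full construction) uses the positive homogeneity of ReLU, relu(a·z) = a·relu(z) for a > 0: a neuron with an arbitrary weight vector w and threshold b produces the same output as a scaled neuron whose weights w/||w|| lie on the unit sphere.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)
w = rng.normal(size=5)   # arbitrary weight vector
b = rng.normal()         # arbitrary threshold
x = rng.normal(size=5)   # input

# Rescale so the weights lie on the unit sphere; the threshold
# is divided by the same norm.
norm = np.linalg.norm(w)
w_hat, b_hat = w / norm, b / norm

# Positive homogeneity of ReLU makes the two neurons identical:
# relu(w.x + b) == ||w|| * relu(w_hat.x + b_hat)
standard = relu(w @ x + b)
reduced = norm * relu(w_hat @ x + b_hat)

print(np.isclose(standard, reduced))  # identical outputs
```

The identity holds for every input x, since scaling the pre-activation by a positive constant commutes with ReLU; this is why the weight direction (on the sphere) and a positive scale can be treated as separate, smaller search spaces.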

