A Theoretical Study of the Relationship Between a Whole ELM Network and Its Subnetworks

10/30/2016
by Enmei Tu, et al.

A biological neural network is composed of numerous subnetworks and modules with different functionalities. For an artificial neural network, the relationship between the whole network and its subnetworks is likewise important and useful for both theoretical and algorithmic research, e.g. it can be exploited to develop incremental or parallel network training algorithms. In this paper we explore the relationship between an ELM (Extreme Learning Machine) neural network and its subnetworks. To the best of our knowledge, we are the first to prove a theorem showing that an ELM neural network can be decomposed into subnetworks and that its optimal solution can be constructed recursively from the optimal solutions of these subnetworks. Based on this theorem we present two algorithms for training a large ELM neural network efficiently: a parallel network training algorithm and an incremental network training algorithm. Experimental results demonstrate the usefulness of the theorem and the validity of the proposed algorithms.
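To make the decomposition idea concrete, below is a minimal NumPy sketch, not the paper's exact construction. An ELM fixes random hidden-layer parameters and solves only the output weights as a least-squares problem, beta = pinv(H) @ T, where H is the hidden-layer output matrix and T the targets. Splitting the hidden neurons into two subnetworks splits H into column blocks [H1, H2], and a Greville-style block pseudo-inverse update assembles the whole-network solution from the first subnetwork's solution. All names, sizes, and the toy data here are illustrative assumptions.

```python
import numpy as np

def elm_hidden(X, W, b):
    # Hidden-layer output matrix H = sigmoid(X W + b) with fixed random W, b
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

rng = np.random.default_rng(0)
N, d, L1, L2 = 200, 5, 20, 20                 # samples, inputs, subnetwork sizes
X = rng.standard_normal((N, d))
T = np.sin(X.sum(axis=1, keepdims=True))      # toy regression targets

# Two subnetworks, each with its own random hidden parameters
W1, b1 = rng.standard_normal((d, L1)), rng.standard_normal(L1)
W2, b2 = rng.standard_normal((d, L2)), rng.standard_normal(L2)
H1, H2 = elm_hidden(X, W1, b1), elm_hidden(X, W2, b2)

# Optimal solution of subnetwork 1 (least-squares output weights)
H1_pinv = np.linalg.pinv(H1)
beta1 = H1_pinv @ T

# Recursive construction of the whole-network solution from beta1:
# R is the part of H2 orthogonal to span(H1); assuming R has full
# column rank, the block pseudo-inverse update below applies.
R = H2 - H1 @ (H1_pinv @ H2)
D = np.linalg.pinv(R)
beta2_new = D @ T                              # weights of the added subnetwork
beta1_corr = beta1 - H1_pinv @ H2 @ beta2_new  # correction to subnetwork 1
beta_whole = np.vstack([beta1_corr, beta2_new])

# Check against training the whole network [H1, H2] directly
beta_direct = np.linalg.pinv(np.hstack([H1, H2])) @ T
print("solutions match:", np.allclose(beta_whole, beta_direct, atol=1e-6))
```

Applying this update repeatedly is the essence of the incremental variant (add a subnetwork, update the solution without retraining from scratch), while computing the per-block quantities for different column blocks independently before combining them is what opens the door to parallel training.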


