The distance between the weights of the neural network is meaningful

01/31/2021
by Liqun Yang, et al.

When applying neural networks, we need to select a model suited to the complexity of the problem and the scale of the dataset. Analyzing a network's capacity requires quantifying the information it has learned. This paper proves that the distance between the network's weights at different training stages can be used to directly estimate the information the network accumulates during training. Experimental results verify the utility of this method, and an application to label corruption is presented at the end.
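As a rough illustration of this idea (not the paper's exact procedure), the sketch below tracks the distance between a network's current weights and its initial weights over training. The toy architecture, the random data, and the choice of a flat L2 norm over all parameters are assumptions made only for this example.

```python
# Illustrative sketch: monitor how far the weights have moved from their
# initial values during training, as a rough proxy for accumulated learning.
# Assumes PyTorch; the flat L2 distance and the toy model/data are assumptions.

import torch
import torch.nn as nn

def flat_params(model):
    """Concatenate all trainable parameters into a single 1-D tensor."""
    return torch.cat([p.detach().reshape(-1) for p in model.parameters()])

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

w0 = flat_params(model).clone()      # snapshot of the initial weights

x = torch.randn(256, 20)             # toy inputs, for illustration only
y = torch.randint(0, 2, (256,))      # toy labels

for epoch in range(10):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    # Distance between the current weights and the initial snapshot
    dist = torch.norm(flat_params(model) - w0).item()
    print(f"epoch {epoch:2d}  loss {loss.item():.4f}  ||w_t - w_0|| = {dist:.4f}")
```

In this sketch the printed distance typically grows as training progresses and levels off as the network converges, which is the kind of signal the paper relates to the information accumulated by the network.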


Related research

01/09/2014  Radial basis function process neural network training based on generalized Frechet distance and GA-SA hybrid strategy
11/05/2020  Teaching with Commentaries
03/04/2021  Clusterability in Neural Networks
07/03/2020  Learning to Prune in Training via Dynamic Channel Propagation
12/25/2020  Neural Network Training With Homomorphic Encryption
02/23/2021  Histo-fetch – On-the-fly processing of gigapixel whole slide images simplifies and speeds neural network training
09/12/2023  Epistemic Modeling Uncertainty of Rapid Neural Network Ensembles for Adaptive Learning
