Learning in Feedforward Neural Networks Accelerated by Transfer Entropy

04/29/2021
by   Adrian Moldovan, et al.

Current neural network architectures are increasingly hard to train because of the growing size and complexity of the datasets they use. Our objective is to design more efficient training algorithms by exploiting causal relationships inferred from neural networks. Transfer entropy (TE) was initially introduced as an information transfer measure used to quantify the statistical coherence between events (time series). It was later related to causality, even though the two are not the same. Only a few papers report applications of causality or TE in neural networks. Our contribution is an information-theoretical method for analyzing information transfer between the nodes of feedforward neural networks. The information transfer is measured by the TE of feedback neural connections. Intuitively, TE measures the relevance of a connection in the network, and the feedback amplifies this connection. We introduce a backpropagation-type training algorithm that uses TE feedback connections to improve its performance.
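To make the core quantity concrete, the sketch below estimates TE for two binary time series (history length 1) by plug-in counting, and illustrates how a TE value could modulate a weight's gradient during backpropagation. This is a minimal illustration of the general idea, not the authors' algorithm; the estimator, the history length, and the scaling rule (with its hyperparameter `alpha`) are all assumptions for demonstration purposes.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) for binary series, history length 1.

    TE = sum over (y_t+1, y_t, x_t) of
         p(y_t+1, y_t, x_t) * log2( p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) )
    """
    x, y = np.asarray(x), np.asarray(y)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_t+1, y_t, x_t)
    pairs = Counter(zip(y[1:], y[:-1]))             # (y_t+1, y_t)
    cond = Counter(zip(y[:-1], x[:-1]))             # (y_t, x_t)
    marg = Counter(y[:-1].tolist())                 # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_yx = c / cond[(y0, x0)]
        p_y1_given_y = pairs[(y1, y0)] / marg[y0]
        te += p_joint * np.log2(p_y1_given_yx / p_y1_given_y)
    return te

def te_scaled_gradient(grad, te, alpha=0.5):
    """Hypothetical feedback step: amplify a connection's gradient in
    proportion to its measured TE (alpha is an assumed hyperparameter)."""
    return grad * (1.0 + alpha * te)

if __name__ == "__main__":
    # Demo: y is a one-step-delayed copy of a random binary signal x,
    # so information flows from x to y but not the other way around.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 2000)
    y = np.empty_like(x)
    y[0], y[1:] = 0, x[:-1]
    print("TE(x -> y) =", transfer_entropy(x, y))   # close to 1 bit
    print("TE(y -> x) =", transfer_entropy(y, x))   # close to 0
```

Because the delayed-copy relation makes `x` fully predictive of the next value of `y`, TE(x→y) approaches 1 bit while TE(y→x) stays near zero (up to finite-sample bias), which is the directional asymmetry that makes TE usable as a per-connection relevance signal.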


Related research

06/13/2017
Transfer entropy-based feedback improves performance in artificial neural networks
The structure of the majority of modern deep neural networks is characte...

10/26/2020
Local Granger Causality
Granger causality is a statistical notion of causal influence based on p...

02/01/2020
Variable-lag Granger Causality and Transfer Entropy for Time Series Analysis
Granger causality is a fundamental technique for causal inference in tim...

03/22/2022
Causal inference in time series in terms of Rényi transfer entropy
Uncovering causal interdependencies from observational data is one of th...

01/22/2019
Can Transfer Entropy Infer Causality in Neuronal Circuits for Cognitive Processing?
Finding the causes to observed effects and establishing causal relations...

10/28/2017
Trainable back-propagated functional transfer matrices
Connections between nodes of fully connected neural networks are usually...

12/07/2021
Testing for Causal Influence using a Partial Coherence Statistic
In this paper we explore partial coherence as a tool for evaluating caus...
