Information-theoretic analyses of neural data to minimize the effect of researchers' assumptions in predictive coding studies

03/21/2022
by Patricia Wollstadt, et al.

Studies investigating neural information processing often implicitly ask two questions at once: which of several alternative processing strategies a neural system uses, and how that strategy is implemented in neural dynamics. A prime example is the study of predictive coding. Such studies often ask whether confirmed predictions about inputs, or prediction errors between internal predictions and inputs, are passed on in a hierarchical neural system, while at the same time looking for the neural correlates of coding for errors and predictions. If we do not know exactly what a neural system predicts at any given moment, this results in a circular analysis, as has rightly been criticized. To circumvent such circularity, we propose to express information processing strategies (such as predictive coding) in terms of local information-theoretic quantities, so that they can be estimated directly from neural data. We demonstrate our approach by investigating two opposing accounts of predictive-coding-like processing strategies, quantifying the building blocks of predictive coding, namely the predictability of inputs and the transfer of information, by local active information storage and local transfer entropy, respectively. We define testable hypotheses on the relationship between the two quantities to identify which of the assumed strategies is used. We demonstrate our approach on spiking data from the retinogeniculate synapse of the cat. Applying our local information dynamics framework, we show that the synapse codes for predictable rather than surprising input. To support this finding, we apply measures from partial information decomposition, which allow us to differentiate whether the transferred information is primarily bottom-up sensory input or information transferred conditionally on the current state of the synapse. Consistent with the local information-theoretic results, we find that the synapse preferentially transfers bottom-up information.
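To make the two building blocks concrete, the sketch below shows minimal plug-in (frequency-count) estimates of local active information storage and local transfer entropy for discrete spike trains. This is an illustrative toy estimator, not the estimation pipeline used in the study; the function names, the binary synthetic data, and the history lengths k and l are all assumptions made for the example, and both series are assumed to have equal length and aligned time bins.

```python
import numpy as np
from collections import Counter

def local_ais(x, k=2):
    """Local active information storage a(n) = log2 p(x_n | x_past) / p(x_n),
    plug-in estimate on a discrete (e.g. binary) series x with history length k."""
    x = list(x)
    pasts = [tuple(x[i - k:i]) for i in range(k, len(x))]
    nexts = x[k:]
    n = len(nexts)
    c_joint, c_past, c_next = Counter(zip(pasts, nexts)), Counter(pasts), Counter(nexts)
    return np.array([
        np.log2((c_joint[(p, v)] / c_past[p]) / (c_next[v] / n))
        for p, v in zip(pasts, nexts)
    ])

def local_te(y, x, k=2, l=2):
    """Local transfer entropy t(n) = log2 p(x_n | x_past, y_past) / p(x_n | x_past)
    from source y to target x, plug-in estimate with history lengths k (target), l (source)."""
    x, y = list(x), list(y)
    h = max(k, l)
    xp = [tuple(x[i - k:i]) for i in range(h, len(x))]
    yp = [tuple(y[i - l:i]) for i in range(h, len(x))]
    nexts = x[h:]
    c_xyn, c_xy = Counter(zip(xp, yp, nexts)), Counter(zip(xp, yp))
    c_xn, c_x = Counter(zip(xp, nexts)), Counter(xp)
    return np.array([
        np.log2((c_xyn[(a, b, v)] / c_xy[(a, b)]) / (c_xn[(a, v)] / c_x[a]))
        for a, b, v in zip(xp, yp, nexts)
    ])

# Toy usage: two synthetic binary spike trains where x partially copies y.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 1000)
x = np.roll(y, 1) ^ (rng.random(1000) < 0.1).astype(int)  # x follows y with 10% flip noise
print(local_ais(x).mean(), local_te(y, x).mean())
```

Averaging the local values recovers the usual (average) AIS and transfer entropy; the point of the local framework is that the per-time-step values can be related to individual predictable or surprising spikes.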

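The abstract does not state which partial information decomposition measure is applied. As one common reference point, here is a hedged sketch of the original Williams and Beer I_min decomposition for two discrete sources and one target, computed from a known joint probability table; the function name and the XOR usage example are illustrative assumptions, not the paper's method.

```python
import numpy as np

def _mi(pxy):
    """Mutual information (bits) from a 2-D joint pmf."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def pid_williams_beer(p):
    """Two-source PID via Williams & Beer's I_min redundancy.
    p[s1, s2, t] is a joint pmf over sources S1, S2 and target T."""
    p = p / p.sum()
    pt = p.sum(axis=(0, 1))                  # p(t)
    pairs = [p.sum(axis=1), p.sum(axis=0)]   # p(s1, t), p(s2, t)

    def spec(pst):
        # Specific information I(T = t; S) for each target value t.
        ps = pst.sum(axis=1)
        out = np.zeros(pst.shape[1])
        for t in range(pst.shape[1]):
            for s in range(pst.shape[0]):
                if pst[s, t] > 0:
                    out[t] += (pst[s, t] / pt[t]) * np.log2(pst[s, t] / (ps[s] * pt[t]))
        return out

    red = float((pt * np.minimum(spec(pairs[0]), spec(pairs[1]))).sum())
    mi1, mi2 = _mi(pairs[0]), _mi(pairs[1])
    mi12 = _mi(p.reshape(-1, p.shape[2]))    # I({S1, S2}; T)
    return {"redundant": red,
            "unique_1": mi1 - red,
            "unique_2": mi2 - red,
            "synergistic": mi12 - mi1 - mi2 + red}

# Toy usage: for an XOR target, all 1 bit of information is synergistic.
p = np.zeros((2, 2, 2))
for s1 in (0, 1):
    for s2 in (0, 1):
        p[s1, s2, s1 ^ s2] = 0.25
print(pid_williams_beer(p))
```

In the paper's setting, the two sources would correspond to the bottom-up input and the synapse's own past state, and the unique terms indicate which of the two dominates the transferred information.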

Related research

04/09/2018
Characterising information-theoretic storage and transfer in continuous time processes
The characterisation of information processing is an important task in c...

06/22/2021
A Practical Unified Notation for Information-Theoretic Quantities in ML
Information theory is of importance to machine learning, but the notatio...

03/21/2020
On Information Plane Analyses of Neural Network Classifiers – A Review
We review the current literature concerned with information plane analys...

04/22/2020
An information-theoretic approach to the analysis of location and co-location patterns
We propose a statistical framework to quantify location and co-location ...

12/19/2014
Information-Theoretic Methods for Identifying Relationships among Climate Variables
Information-theoretic quantities, such as entropy, are used to quantify ...

02/19/2022
Evaluation of Neuromorphic Spike Encoding of Sound Using Information Theory
The problem of spike encoding of sound consists in transforming a sound ...

03/31/2022
Analyzing Wrap-Up Effects through an Information-Theoretic Lens
Numerous analyses of reading time (RT) data have been implemented – all ...
