Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

03/04/2019
by Wentao Huang, et al.

Although Shannon mutual information has been widely used, it is often difficult to compute in many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While all of our asymptotic formulas work for discrete variables, one of them performs consistently and with high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may facilitate the application of information theory to many practical and theoretical problems.
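For context, the central quantities named in the abstract can be stated in standard form; the notation below is illustrative and not taken verbatim from the paper. The mutual information between a stimulus X and a response Y decomposes as an expected Kullback-Leibler divergence, and the Rényi divergence of order \alpha generalizes the KL divergence:

I(X;Y) = \sum_{x} p(x)\, D_{\mathrm{KL}}\!\left( p(y \mid x) \,\|\, p(y) \right),
\qquad
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{y} p(y) \log \frac{p(y)}{q(y)},

D_{\alpha}(p \,\|\, q) = \frac{1}{\alpha - 1} \log \sum_{y} p(y)^{\alpha}\, q(y)^{1-\alpha},
\qquad \alpha > 0,\ \alpha \neq 1,

with D_{\alpha} \to D_{\mathrm{KL}} as \alpha \to 1. Because these expressions involve only sums over the variables' values, they require no derivatives with respect to the encoded variable, which is precisely the obstacle that restricts the Fisher-information approach to continuous variables.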
