
Generalized Bregman Divergence and Gradient of Mutual Information for Vector Poisson Channels

01/28/2013
by Liming Wang, et al.
Duke University
UCL

We investigate connections between information-theoretic and estimation-theoretic quantities in vector Poisson channel models. In particular, we generalize the gradient of mutual information with respect to key system parameters from the scalar to the vector Poisson channel model. We also propose, as another contribution, a generalization of the classical Bregman divergence that offers a means to encapsulate under a unifying framework the gradient of mutual information results for scalar and vector Poisson and Gaussian channel models. The so-called generalized Bregman divergence is also shown to exhibit various properties akin to the properties of the classical version. The vector Poisson channel model is drawing considerable attention in view of its application in various domains: as an example, the availability of the gradient of mutual information can be used in conjunction with gradient descent methods to effect compressive-sensing projection designs in emerging X-ray and document classification applications.
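To make the classical Bregman divergence concrete, here is a minimal sketch in Python (the function names are illustrative, not from the paper): the divergence of a convex generator φ is D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩, and with the negative-entropy generator φ(x) = Σᵢ (xᵢ log xᵢ − xᵢ) it recovers the generalized Kullback–Leibler divergence that naturally accompanies Poisson channel models.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """Classical Bregman divergence:
    D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Negative-entropy generator phi(x) = sum(x log x - x), with gradient log(x).
phi = lambda x: np.sum(x * np.log(x) - x)
grad_phi = lambda x: np.log(x)

x = np.array([1.0, 2.0])
y = np.array([2.0, 1.0])

d = bregman_divergence(phi, grad_phi, x, y)

# For this generator, the divergence coincides with the generalized
# KL divergence sum(x log(x/y) - x + y), which is nonnegative.
kl = np.sum(x * np.log(x / y) - x + y)
```

By convexity of the generator, the divergence is nonnegative and vanishes only at x = y; swapping the roles of x and y generally changes its value, since a Bregman divergence is not symmetric.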

