Neural Computation of Capacity Region of Memoryless Multiple Access Channels

05/10/2021
by Farhad Mirkarimi, et al.

This paper provides a numerical framework for computing the achievable rate region of a memoryless multiple access channel (MAC) with a continuous alphabet from data. In particular, we use recent results on variational lower bounds on mutual information and KL-divergence to compute the boundaries of the rate region of the MAC using a set of functions parameterized by neural networks. Our method relies on a variational lower bound on KL-divergence and an upper bound on KL-divergence based on f-divergence inequalities. Unlike previous work, which computes an estimate of mutual information that is neither a lower nor an upper bound, our method estimates a lower bound on mutual information. Our numerical results show that the proposed method provides tighter estimates than the MINE-based estimator at large SNRs while being computationally more efficient. Finally, we apply the proposed method to the optical intensity MAC and obtain a new achievable rate region boundary that is tighter than those reported in prior works.
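To make the variational-bound machinery concrete, below is a minimal sketch (not the authors' implementation) of a neural lower bound on mutual information in the NWJ / f-divergence form, I(X;Y) >= E_{p(x,y)}[T(x,y)] - e^{-1} E_{p(x)p(y)}[e^{T(x,y)}], with the critic T parameterized by a small feed-forward network. The paper's actual bound on KL-divergence, its network architecture, and the two-user MAC setup differ in detail; the network sizes, learning rate, and Gaussian toy channel here are illustrative assumptions.

```python
# Minimal sketch: NWJ variational lower bound on mutual information with a
# neural critic. All hyperparameters and the toy Gaussian channel are
# assumptions for illustration, not the paper's setup.
import torch
import torch.nn as nn

class Critic(nn.Module):
    """Scalar critic T(x, y) acting on concatenated samples."""
    def __init__(self, dim_x, dim_y, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_x + dim_y, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

def nwj_lower_bound(critic, x, y):
    """NWJ bound: E_p[T] - E_q[exp(T - 1)]; shuffling y breaks the pairing
    to approximate samples from the product of the marginals."""
    t_joint = critic(x, y)
    y_shuffled = y[torch.randperm(y.shape[0])]
    t_marg = critic(x, y_shuffled)
    return t_joint.mean() - torch.exp(t_marg - 1.0).mean()

if __name__ == "__main__":
    # Toy example: Y = sqrt(SNR) * X + N with standard Gaussian X and N,
    # so the true mutual information is 0.5 * log(1 + SNR).
    torch.manual_seed(0)
    n, snr = 4096, 4.0
    x = torch.randn(n, 1)
    y = (snr ** 0.5) * x + torch.randn(n, 1)

    critic = Critic(1, 1)
    opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
    for _ in range(2000):
        opt.zero_grad()
        loss = -nwj_lower_bound(critic, x, y)  # maximize the bound
        loss.backward()
        opt.step()

    print("estimated lower bound:", nwj_lower_bound(critic, x, y).item())
    print("true mutual information:", 0.5 * torch.log1p(torch.tensor(snr)).item())
```

The NWJ form is used in this sketch because its objective remains a valid lower bound for any fixed critic, in line with the abstract's emphasis on estimating a lower bound rather than a biased point estimate; the paper additionally pairs such a lower bound with an f-divergence-based upper bound on KL-divergence, which is not reproduced here.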


