Neural Estimation and Optimization of Directed Information over Continuous Spaces

03/28/2022
by Dor Tsur, et al.

This work develops a new method for estimating and optimizing the directed information rate between two jointly stationary and ergodic stochastic processes. Building on recent advances in machine learning, we propose a recurrent neural network (RNN)-based estimator that is optimized via gradient ascent over the RNN parameters. The estimator requires no prior knowledge of the underlying joint and marginal distributions, and it is readily optimized over continuous input processes realized by a deep generative model. We prove consistency of the proposed estimation and optimization methods and combine them to obtain end-to-end performance guarantees. Applications to capacity estimation of continuous channels with memory are explored, and empirical results demonstrating the scalability and accuracy of our method are provided. When the channel is memoryless, we investigate the mapping learned by the optimized input generator.
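To make the estimation idea concrete, here is a minimal sketch of the Donsker-Varadhan-style variational estimation that underlies such neural estimators. It is not the authors' implementation: for a memoryless channel the directed information rate reduces to the mutual information I(X;Y), so this example uses a binary symmetric channel and a tiny table-based "critic" T(x, y) trained by gradient ascent, in place of the paper's RNN over continuous processes. All names and constants are illustrative.

```python
import math
import random

def dv_mi_estimate(flip_prob=0.1, n=5000, steps=300, lr=0.5, seed=0):
    """Estimate I(X;Y) in nats via the Donsker-Varadhan lower bound
    L(T) = E_joint[T] - log E_marg[exp(T)], maximizing over a 2x2 table T."""
    rng = random.Random(seed)
    # Joint samples: X ~ Bern(1/2), Y = X flipped with probability flip_prob.
    xs = [rng.randint(0, 1) for _ in range(n)]
    ys = [x ^ (1 if rng.random() < flip_prob else 0) for x in xs]
    # Product-of-marginals samples: pair X with an independently shuffled Y.
    ys_shuf = ys[:]
    rng.shuffle(ys_shuf)

    # Empirical pair counts under the joint and the shuffled (marginal) pairing.
    joint = [[0, 0], [0, 0]]
    marg = [[0, 0], [0, 0]]
    for x, y in zip(xs, ys):
        joint[x][y] += 1
    for x, y in zip(xs, ys_shuf):
        marg[x][y] += 1

    # Gradient ascent on the concave DV objective over the critic table T.
    T = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(steps):
        Z = sum(marg[a][b] * math.exp(T[a][b]) for a in (0, 1) for b in (0, 1)) / n
        for a in (0, 1):
            for b in (0, 1):
                grad = joint[a][b] / n - (marg[a][b] / n) * math.exp(T[a][b]) / Z
                T[a][b] += lr * grad

    # Final DV lower bound (in nats).
    Z = sum(marg[a][b] * math.exp(T[a][b]) for a in (0, 1) for b in (0, 1)) / n
    mean_joint = sum(joint[a][b] * T[a][b] for a in (0, 1) for b in (0, 1)) / n
    return mean_joint - math.log(Z)
```

For a BSC with crossover 0.1, the true mutual information is log(2) - H_b(0.1), roughly 0.368 nats, and the estimate converges near that value. The paper's method replaces the table critic with an RNN so the bound applies to continuous alphabets and processes with memory, and adds a generative input model optimized jointly with the estimator.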


