Linear Noise Approximation of Intensity-Driven Signal Transduction Channels

08/28/2019
by Gregory R. Hessler, et al.

Biochemical signal transduction, a form of molecular communication, can be modeled using graphical Markov channels with input-modulated transition rates. Such channel models are strongly non-Gaussian. In this paper we use a linear noise approximation to construct a novel class of Gaussian additive white noise channels that capture essential features of fully- and partially-observed intensity-driven signal transduction. When channel state transitions that are sensitive to the input signal are directly observable, high-frequency information is transduced more efficiently than low-frequency information, and the mutual information rate per bandwidth (spectral efficiency) is significantly greater than when sensitive transitions and observable transitions are disjoint. When both observable and hidden transitions are input-sensitive, we observe a superadditive increase in spectral efficiency.
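As a rough illustration of the Gaussian channel quantities the abstract discusses (not the authors' actual model), the sketch below numerically evaluates the mutual information rate I = ∫ log2(1 + SNR(f)) df of an additive Gaussian noise channel with a frequency-dependent signal-to-noise ratio, and the resulting spectral efficiency (rate per bandwidth). The SNR profile here is a hypothetical toy choice, rising with frequency to mimic the claim that high-frequency information is transduced more efficiently.

```python
import numpy as np

def mutual_information_rate(snr, freqs):
    """Trapezoidal integration of log2(1 + SNR(f)) over the frequency grid."""
    integrand = np.log2(1.0 + snr)
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(freqs))

B = 10.0                          # bandwidth (arbitrary units; an assumption)
freqs = np.linspace(0.0, B, 1000)

# Toy SNR profile that grows linearly with frequency (hypothetical).
snr = 0.5 + 2.0 * freqs / B

rate = mutual_information_rate(snr, freqs)   # bits per unit time
spectral_efficiency = rate / B               # bits per unit time per unit bandwidth
print(f"spectral efficiency ≈ {spectral_efficiency:.3f} bits/s/Hz")
```

For this toy profile the integral can be checked in closed form, ∫₀¹ log2(1.5 + 2x) dx ≈ 1.28, so the numerical spectral efficiency lands near that value.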


