Convexity and Operational Interpretation of the Quantum Information Bottleneck Function

by Nilanjana Datta et al.

In classical information theory, the information bottleneck method (IBM) can be regarded as a method of lossy data compression which focusses on preserving meaningful (or relevant) information. As such, it has recently gained a lot of attention, primarily for its applications in machine learning and neural networks. A quantum analogue of the IBM was recently defined, and an attempt at providing an operational interpretation of the so-called quantum IB function, as an optimal rate of an information-theoretic task, was made by Salek et al. However, the interpretation given in that paper has two drawbacks: firstly, its proof is based on a conjecture that the quantum IB function is convex, and secondly, the expression for the rate function involves certain entropic quantities which occur explicitly in the very definition of the underlying information-theoretic task, thus making the latter somewhat contrived. We overcome both of these drawbacks by first proving the convexity of the quantum IB function, and then giving an alternative operational interpretation of it as the optimal rate of a bona fide information-theoretic task, namely that of quantum source coding with quantum side information at the decoder; we relate the quantum IB function to the rate region of this task. We similarly show that the related privacy funnel function is convex (in both the classical and quantum cases). However, we comment that it is unlikely that the quantum privacy funnel function can characterize the optimal asymptotic rate of an information-theoretic task, since even its classical version lacks a certain additivity property which turns out to be essential.
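For readers unfamiliar with the object under study: the classical information bottleneck function (as introduced by Tishby et al.) can be written as follows. The quantum IB function studied in this paper is an analogue of this quantity in which the Shannon mutual informations are replaced by quantum mutual informations; the precise quantum definition is given in the paper itself, so the block below is only an orienting sketch of the classical case.

```latex
% Classical information bottleneck function: given a joint
% distribution P_{XY} and a compression rate r >= 0,
\[
  F_{\mathrm{IB}}(r) \;=\; \max_{P_{T|X}\,:\; I(X;T)\,\le\, r} \; I(Y;T),
\]
% where the maximization is over stochastic maps P_{T|X}
% (so that T -- X -- Y form a Markov chain): T is a compressed
% description of X, constrained to carry at most r bits about X,
% chosen to retain as much information about the relevant
% variable Y as possible.
```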


