A Mirror Descent Perspective on Classical and Quantum Blahut-Arimoto Algorithms

06/07/2023
by Kerry He, et al.

The Blahut-Arimoto algorithm is a well-known method to compute classical channel capacities and rate-distortion functions. Recent works have extended this algorithm to compute various quantum analogs of these quantities. In this paper, we show how these Blahut-Arimoto algorithms are special instances of mirror descent, a well-studied generalization of gradient descent for constrained convex optimization. Using new convex analysis tools, we show how relative smoothness and strong convexity analysis recovers the known sublinear and linear convergence rates for Blahut-Arimoto algorithms. This mirror descent viewpoint allows us to derive related algorithms with similar convergence guarantees for problems in information theory to which Blahut-Arimoto-type algorithms are not directly applicable. We apply this framework to compute energy-constrained classical and quantum channel capacities, classical and quantum rate-distortion functions, and approximations of the relative entropy of entanglement, all with provable convergence guarantees.
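To make the classical setting concrete, below is a minimal sketch of the standard Blahut-Arimoto iteration for computing the capacity of a discrete memoryless channel. This illustrates only the textbook classical algorithm, not the mirror descent framework or the quantum extensions developed in the paper; the function name and parameters are illustrative.

```python
import numpy as np

def blahut_arimoto(P, iters=500, tol=1e-12):
    """Capacity (in nats) of a discrete channel with transition matrix
    P[x, y] = p(y|x), using the classical Blahut-Arimoto iteration."""
    n_in, _ = P.shape
    r = np.full(n_in, 1.0 / n_in)            # input distribution r(x), start uniform
    for _ in range(iters):
        joint = r[:, None] * P               # r(x) p(y|x)
        q = joint / joint.sum(axis=0, keepdims=True)   # posterior q(x|y)
        # Update: r(x) ∝ exp( Σ_y p(y|x) log q(x|y) )
        log_q = np.where(q > 0, np.log(np.where(q > 0, q, 1.0)), 0.0)
        r_new = np.exp((P * log_q).sum(axis=1))
        r_new /= r_new.sum()
        if np.max(np.abs(r_new - r)) < tol:
            r = r_new
            break
        r = r_new
    # Mutual information I(r; P) at the final input distribution
    p_out = r @ P                            # output distribution p(y)
    ratio = np.where(P > 0, P / p_out[None, :], 1.0)
    capacity = float(np.sum(r[:, None] * P * np.log(ratio)))
    return capacity, r
```

For a binary symmetric channel with crossover probability 0.1, this recovers the known capacity log 2 − H(0.1) nats at the uniform input distribution. The alternating update above is exactly the step that the paper reinterprets as mirror descent with a suitable Bregman divergence.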


