Analysis of the Convergence Speed of the Arimoto-Blahut Algorithm by the Second Order Recurrence Formula

09/17/2020
by Kenji Nakagawa, et al.

In this paper, we investigate the convergence speed of the Arimoto-Blahut algorithm. For many channel matrices the convergence is exponential, but for some channel matrices it is slower than exponential. By analyzing the Taylor expansion of the defining function of the Arimoto-Blahut algorithm, we clarify the conditions under which the convergence is exponential or slower than exponential. The analysis of the slow convergence is new in this paper. Based on this analysis, we numerically compare the convergence speed of the Arimoto-Blahut algorithm with the values obtained in our theorems for several channel matrices. The purpose of this paper is a complete understanding of the convergence speed of the Arimoto-Blahut algorithm.
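For readers unfamiliar with the iteration whose convergence is analyzed, the following is a minimal sketch of the standard Arimoto-Blahut algorithm for computing the capacity of a discrete memoryless channel. The function name, tolerances, and the binary-symmetric-channel example are illustrative choices, not taken from the paper.

```python
import numpy as np

def arimoto_blahut(P, tol=1e-12, max_iter=10_000):
    """Capacity (in nats) of a channel matrix P[x, y] = P(y|x).

    Standard Arimoto-Blahut iteration: starting from the uniform input
    distribution, alternately compute the output distribution q_t and
    reweight the input distribution p_t by exp of the per-row KL divergence.
    """
    n = P.shape[0]
    p = np.full(n, 1.0 / n)               # initial input distribution p_0
    for _ in range(max_iter):
        q = p @ P                          # output distribution q_t(y)
        # D[x] = KL divergence D( P(.|x) || q ), with 0 log 0 := 0
        ratio = np.where(P > 0, P / q, 1.0)
        D = np.sum(np.where(P > 0, P * np.log(ratio), 0.0), axis=1)
        p_new = p * np.exp(D)              # multiplicative update
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # Mutual information I(p; P) at the final iterate
    q = p @ P
    ratio = np.where(P > 0, P / q, 1.0)
    C = np.sum(np.where(P > 0, p[:, None] * P * np.log(ratio), 0.0))
    return C, p

# Example: binary symmetric channel with crossover probability 0.1.
# Its capacity is log 2 - H(0.1) ~ 0.368 nats, attained at the uniform input.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_opt = arimoto_blahut(P)
```

The paper's analysis concerns how fast `p` in this loop approaches the capacity-achieving input distribution: exponentially for many channel matrices, and slower than exponentially for some.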


