Approximate Simultaneous Diagonalization of Matrices via Structured Low-Rank Approximation

10/13/2020
by Riku Akema, et al.

Approximate Simultaneous Diagonalization (ASD) is the problem of finding a common similarity transformation that approximately diagonalizes a given tuple of square matrices. Many data science problems have been reduced to ASD through ingenious modelling. ASD has traditionally been tackled with the so-called Jacobi-like methods. However, these methods offer no guarantee of suppressing the magnitude of the off-diagonal entries of the transformed tuple, even when the given tuple admits a common exact diagonalizer, i.e., is simultaneously diagonalizable. In this paper, to establish an alternative powerful strategy for ASD, we present a novel two-step strategy called the Approximate-Then-Diagonalize-Simultaneously (ATDS) algorithm. The ATDS algorithm decomposes ASD into (Step 1) finding a simultaneously diagonalizable tuple near the given one, and (Step 2) finding a common similarity transformation that exactly diagonalizes the tuple obtained in Step 1. The proposed approach to Step 1 is realized by solving a Structured Low-Rank Approximation (SLRA) problem with Cadzow's algorithm. In Step 2, by exploiting the idea behind the constructive proof of the conditions for exact simultaneous diagonalizability, we obtain a common exact diagonalizer of the tuple from Step 1 as a solution to the original ASD problem. Unlike the Jacobi-like methods, the ATDS algorithm is guaranteed to find a common exact diagonalizer whenever the given tuple happens to be simultaneously diagonalizable. Numerical experiments show that the ATDS algorithm achieves better performance than the Jacobi-like methods.
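
The sketch below is only a minimal, hypothetical illustration of the ASD setting, not the paper's ATDS procedure: it skips Step 1 (the SLRA solved with Cadzow's algorithm) and uses one standard constructive route for Step 2, namely that a generic random linear combination of an exactly simultaneously diagonalizable tuple shares its eigenvectors with every member of the tuple. The function names common_diagonalizer and off_diagonal_norm are illustrative, not from the paper.

import numpy as np

def common_diagonalizer(matrices, rng=None):
    # Sketch (assumption, not the paper's ATDS): for an exactly simultaneously
    # diagonalizable tuple, a generic linear combination shares its eigenvectors
    # with every member, so its eigenvector matrix is a common diagonalizer.
    rng = np.random.default_rng(rng)
    weights = rng.standard_normal(len(matrices))
    combo = sum(w * A for w, A in zip(weights, matrices))
    _, P = np.linalg.eig(combo)  # columns of P: candidate common eigenvectors
    return P

def off_diagonal_norm(matrices, P):
    # Total Frobenius norm of the off-diagonal parts of P^{-1} A P over the tuple;
    # this is the quantity ASD seeks to make small.
    P_inv = np.linalg.inv(P)
    total = 0.0
    for A in matrices:
        D = P_inv @ A @ P
        total += np.linalg.norm(D - np.diag(np.diag(D)))
    return total

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((4, 4))
    Q_inv = np.linalg.inv(Q)
    # Build an exactly simultaneously diagonalizable tuple Q diag(.) Q^{-1}.
    tuple_A = [Q @ np.diag(rng.standard_normal(4)) @ Q_inv for _ in range(3)]
    P = common_diagonalizer(tuple_A, rng=1)
    print(off_diagonal_norm(tuple_A, P))  # close to 0 for an exactly diagonalizable tuple

For a tuple that is only approximately simultaneously diagonalizable, this construction generally leaves a nonzero off-diagonal residual; the paper's Step 1 addresses exactly that by first replacing the tuple with a nearby simultaneously diagonalizable one.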

