Performance Analysis of the Gaussian Quasi-Maximum Likelihood Approach for Independent Vector Analysis

08/30/2020
by Amir Weiss, et al.

Maximum Likelihood (ML) estimation requires precise knowledge of the underlying statistical model. In Quasi ML (QML), a presumed model is used as a substitute for the (unknown) true model. In the context of Independent Vector Analysis (IVA), we consider the Gaussian QML Estimate (QMLE) of the set of demixing matrices and present an (approximate) analysis of its asymptotic separation performance. In Gaussian QML, the sources are presumed to be Gaussian, with covariance matrices specified by some "educated guess". The resulting quasi-likelihood equations for the demixing matrices take a special form, recently termed an extended "Sequentially Drilled" Joint Congruence (SeDJoCo) transformation, which is reminiscent of (though essentially different from) classical joint diagonalization. We show that, asymptotically, this QMLE, i.e., the solution of the resulting extended SeDJoCo transformation, attains perfect separation (under some mild conditions) regardless of the sources' true distributions and/or covariance matrices. In addition, based on the "small-errors" assumption, we present a first-order perturbation analysis of the extended SeDJoCo solution. Using the resulting closed-form expressions for the errors in the solution matrices, we provide closed-form expressions for the resulting Interference-to-Source Ratios (ISRs) for IVA. Moreover, we prove that asymptotically the ISRs depend only on the sources' covariances, and not on their specific distributions. As an immediate consequence of this result, we provide an asymptotically attainable lower bound on the resulting ISRs. We also present empirical results, corroborating our analytical derivations, from three simulation experiments concerning two possible model errors: inaccurate covariance matrices and mismodeling of the sources' distributions.
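As a rough numerical companion to the abstract (not taken from the paper), the sketch below illustrates the separation-quality measure analyzed above, the Interference-to-Source Ratio: it forms the global matrix G = BA from a true mixing matrix A and an estimated demixing matrix B, and reports the residual interference powers. The function name isr_matrix, the unit-power normalization, and the toy perturbation are assumptions made for illustration only; they are not the paper's exact definitions or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def isr_matrix(B, A):
    """Empirical ISR matrix for one mixture (assumed unit-power sources).

    G = B @ A is the global mixing-demixing matrix; entry (i, j), j != i,
    is the power of source j leaking into the i-th separated output,
    normalized by the power of source i in that output.
    """
    G = B @ A
    P = np.abs(G) ** 2
    return P / np.diag(P)[:, None]

# Toy check: exact demixing gives (numerically) zero off-diagonal ISRs,
# while a small perturbation of the demixing matrix gives small ISRs.
K = 3
A = rng.standard_normal((K, K))
B_exact = np.linalg.inv(A)
B_perturbed = B_exact + 0.01 * rng.standard_normal((K, K))

for name, B in [("exact", B_exact), ("perturbed", B_perturbed)]:
    isr = isr_matrix(B, A)
    off_diag = isr[~np.eye(K, dtype=bool)]
    print(f"{name}: max off-diagonal ISR = {off_diag.max():.3e}")
```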
