
Randomized Independent Component Analysis

by   Matan Sela, et al.

Independent component analysis (ICA) is a method for recovering statistically independent signals from observations of unknown linear combinations of the sources. Some of the most accurate ICA decomposition methods search for the inverse transformation that minimizes an approximation of the Mutual Information, a measure of statistical dependence between random vectors. Two such approximations, the Kernel Generalized Variance and the Kernel Canonical Correlation, have been shown to achieve the best separation performance among ICA methods. However, merely evaluating these measures has a computational cost cubic in the sample size, so optimizing them is even more demanding in both space and time. Here, we propose two novel alternative measures based on randomized features of the samples: the Randomized Generalized Variance and the Randomized Canonical Correlation. The computational complexity of evaluating the proposed alternatives is linear in the sample size, and they provide a controllable approximation of their kernel-based, non-random counterparts. We also show that optimizing the proposed measures yields a comparable separation error while running an order of magnitude faster than optimizing the kernel-based measures.
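The core idea of the randomized measures can be illustrated with random Fourier features (Rahimi and Rechtin's classic construction): each signal is mapped to a low-dimensional random feature space approximating an RBF kernel, and a regularized canonical correlation between the feature maps serves as a dependence score computable in time linear in the sample size. The sketch below is an illustrative approximation, not the authors' exact algorithm; the function names, the feature count, and the regularization constant are assumptions for the example.

```python
import numpy as np

def random_fourier_features(x, n_features=100, gamma=1.0, seed=0):
    # Map samples x of shape (n, d) to random Fourier features whose
    # inner products approximate the RBF kernel exp(-gamma * ||a - b||^2).
    rng = np.random.default_rng(seed)
    n, d = x.shape
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(x @ W + b)

def randomized_canonical_correlation(x, y, n_features=100, reg=1e-3):
    # Largest regularized canonical correlation between the random feature
    # maps of two scalar signals: near zero when the signals are
    # independent, larger when they are statistically dependent.
    zx = random_fourier_features(x.reshape(-1, 1), n_features, seed=0)
    zy = random_fourier_features(y.reshape(-1, 1), n_features, seed=1)
    zx = zx - zx.mean(axis=0)
    zy = zy - zy.mean(axis=0)
    n = len(x)
    cxx = zx.T @ zx / n + reg * np.eye(n_features)
    cyy = zy.T @ zy / n + reg * np.eye(n_features)
    cxy = zx.T @ zy / n
    # Eigenvalues of cxx^{-1} cxy cyy^{-1} cxy^T are the squared
    # (regularized) canonical correlations.
    m = np.linalg.solve(cxx, cxy) @ np.linalg.solve(cyy, cxy.T)
    return float(np.sqrt(max(np.linalg.eigvals(m).real.max(), 0.0)))
```

The cost is dominated by forming the D x D covariance blocks, O(n D^2) for n samples and D random features, versus O(n^3) for the exact kernel canonical correlation; increasing D trades compute for a tighter approximation, which is the "controllable" aspect noted in the abstract.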

