
Learning Independent Features with Adversarial Nets for Non-linear ICA

10/13/2017
by Philemon Brakel, et al.

Reliable measures of statistical dependence could be useful tools for learning independent features and performing tasks like source separation using Independent Component Analysis (ICA). Unfortunately, many such measures, like the mutual information, are hard to estimate and optimize directly. We propose to learn independent features with adversarial objectives which optimize such measures implicitly. These objectives compare samples from the joint distribution and the product of the marginals without the need to compute any probability densities. We also propose two methods for obtaining samples from the product of the marginals, using either a simple resampling trick or a separate parametric distribution. Our experiments show that this strategy can easily be applied to different types of model architectures and solve both linear and non-linear ICA problems.
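The resampling trick mentioned above can be sketched in a few lines: given a batch of feature vectors, independently permuting each feature dimension across the batch preserves every marginal while breaking the pairing between dimensions, which approximates sampling from the product of the marginals. The function name `resample_marginals` and the toy batch below are illustrative, not taken from the paper.

```python
import numpy as np

def resample_marginals(features, rng=None):
    """Approximate samples from the product of the marginals by
    independently shuffling each feature column across the batch.
    `features` is an (n_samples, n_features) array."""
    rng = np.random.default_rng() if rng is None else rng
    out = features.copy()
    for j in range(out.shape[1]):
        # Permuting one column at a time keeps each marginal
        # distribution intact but destroys cross-feature dependence.
        out[:, j] = rng.permutation(out[:, j])
    return out

# Toy batch: rows are samples, columns are features.
batch = np.array([[0.0, 10.0],
                  [1.0, 11.0],
                  [2.0, 12.0]])
shuffled = resample_marginals(batch)
```

In the adversarial setup described in the abstract, a discriminator would then be trained to tell `batch` (joint samples) from `shuffled` (approximate product-of-marginals samples), and the feature extractor trained to make them indistinguishable.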

Related Research

Non-linear ICA based on Cramer-Wold metric (03/01/2019)
Non-linear source separation is a challenging open problem with many app...

Learning Speaker Representations with Mutual Information (12/01/2018)
Learning good representations is of crucial importance in deep learning....

Notion of information and independent component analysis (06/19/2020)
Partial orderings and measures of information for continuous univariate ...

Mutual Dependence: A Novel Method for Computing Dependencies Between Random Variables (06/01/2015)
In data science, it is often required to estimate dependencies between d...

Randomized Independent Component Analysis (09/22/2016)
Independent component analysis (ICA) is a method for recovering statisti...

Interleaved Multitask Learning for Audio Source Separation with Independent Databases (08/14/2019)
Deep Neural Network-based source separation methods usually train indepe...

Code Repositories

anica

Adversarial Non-linear Independent Component Analysis
