
Learning Independent Features with Adversarial Nets for Non-linear ICA

by Philemon Brakel, et al.

Reliable measures of statistical dependence could be useful tools for learning independent features and performing tasks like source separation using Independent Component Analysis (ICA). Unfortunately, many such measures, like the mutual information, are hard to estimate and optimize directly. We propose to learn independent features with adversarial objectives which optimize such measures implicitly. These objectives compare samples from the joint distribution and the product of the marginals without the need to compute any probability densities. We also propose two methods for obtaining samples from the product of the marginals, using either a simple resampling trick or a separate parametric distribution. Our experiments show that this strategy can easily be applied to different types of model architectures and solves both linear and non-linear ICA problems.
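The resampling trick mentioned in the abstract can be illustrated with a short sketch: given a batch of feature vectors, independently permuting each feature dimension across the batch destroys dependencies between dimensions while preserving each marginal, yielding approximate samples from the product of the marginals. The helper below is a hypothetical NumPy illustration of this idea, not the authors' exact implementation.

```python
import numpy as np

def resample_marginals(features, rng=None):
    """Approximate samples from the product of the marginals.

    Each column (feature dimension) of the (batch, dim) array is
    shuffled with an independent random permutation across the batch,
    breaking cross-dimension dependencies while keeping every
    marginal distribution unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    out = features.copy()
    n = out.shape[0]
    for d in range(out.shape[1]):
        # Independent permutation per dimension: joint structure is lost,
        # marginals are preserved exactly within the batch.
        out[:, d] = out[rng.permutation(n), d]
    return out
```

An adversarial discriminator would then be trained to distinguish original batches (samples from the joint) from resampled batches (approximate samples from the product of the marginals), and the feature extractor trained to make them indistinguishable.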



Code Repositories

Adversarial Non-linear Independent Component Analysis