Learning gradient-based ICA by neurally estimating mutual information

04/22/2019
by Hlynur Davíð Hlynsson, et al.

Several methods for estimating the mutual information of random variables have been developed in recent years. They can prove valuable for novel approaches to learning statistically independent features. In this paper, we use one of these methods, a mutual information neural estimation (MINE) network, to present a proof-of-concept of how a neural network can perform linear ICA. We minimize the mutual information, as estimated by a MINE network, between the output units of a differentiable encoder network. This is done by simple alternate optimization of the two networks. The method is shown to yield a solution qualitatively comparable to that of FastICA on blind source separation of noisy sources.
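A minimal sketch of the alternating scheme described above, assuming a two-source linear ICA setup in PyTorch. This is not the authors' implementation; names such as StatisticsNetwork, mine_lower_bound, and train_step are illustrative placeholders, and the MI estimate uses the standard Donsker-Varadhan lower bound from the MINE paper.

```python
import torch
import torch.nn as nn

class StatisticsNetwork(nn.Module):
    """MINE statistics network T(y1, y2) -> scalar score per sample."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, y1, y2):
        return self.net(torch.cat([y1, y2], dim=1))

def mine_lower_bound(T, y1, y2):
    """Donsker-Varadhan lower bound on I(y1; y2)."""
    joint = T(y1, y2).mean()
    # Shuffle y2 across the batch to sample from the product of marginals.
    marginal = torch.exp(T(y1, y2[torch.randperm(y2.size(0))])).mean()
    return joint - torch.log(marginal)

# Linear encoder (the unmixing matrix) and the MINE statistics network.
encoder = nn.Linear(2, 2, bias=False)
T = StatisticsNetwork()
opt_T = torch.optim.Adam(T.parameters(), lr=1e-3)
opt_enc = torch.optim.Adam(encoder.parameters(), lr=1e-3)

def train_step(x):
    """One round of alternate optimization on a batch of mixed signals x."""
    y = encoder(x)
    y1, y2 = y[:, :1], y[:, 1:]

    # 1) Maximize the MI estimate w.r.t. the statistics network only.
    opt_T.zero_grad()
    (-mine_lower_bound(T, y1.detach(), y2.detach())).backward()
    opt_T.step()

    # 2) Minimize the estimated MI w.r.t. the encoder.
    opt_enc.zero_grad()
    mine_lower_bound(T, y1, y2).backward()
    opt_enc.step()
```

In this sketch the statistics network is pushed to tighten the mutual-information estimate while the encoder is pushed to reduce it, which is one plausible reading of the "simple alternate optimization" mentioned in the abstract.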
