Towards universal neural nets: Gibbs machines and ACE

08/26/2015
by Galin Georgiev, et al.

We study, from a physics viewpoint, a class of generative neural nets, Gibbs machines, designed for gradual learning. While they include variational auto-encoders as a special case, they offer a broader universal platform for incrementally adding newly learned features, including physical symmetries. Their direct connection to statistical physics and information geometry is established. A variational Pythagorean theorem justifies invoking the exponential/Gibbs class of probabilities for creating brand-new objects. Combining these nets with classifiers gives rise to a new brand of universal generative neural nets: stochastic auto-classifier-encoders (ACE). ACE have state-of-the-art performance in their class, both for classification and density estimation, on the MNIST data set.
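To make the ACE idea concrete, the sketch below shows one plausible reading of "combining these nets with classifiers": a variational auto-encoder whose latent code also feeds a classifier head, trained on a joint reconstruction + KL + classification loss. This is an illustrative assumption, not the authors' implementation; all module names, layer sizes, and the `beta` weighting are hypothetical.

```python
# Hedged sketch of an auto-classifier-encoder (ACE)-style model in PyTorch.
# Assumes MNIST-like inputs flattened to 784 dimensions; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ACESketch(nn.Module):
    def __init__(self, x_dim=784, z_dim=20, n_classes=10):
        super().__init__()
        self.enc = nn.Linear(x_dim, 400)
        self.mu = nn.Linear(400, z_dim)
        self.logvar = nn.Linear(400, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, 400), nn.ReLU(),
                                 nn.Linear(400, x_dim))
        self.cls = nn.Linear(z_dim, n_classes)  # classifier head on the latent code

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample z = mu + sigma * eps
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), self.cls(z), mu, logvar

def ace_loss(x, y, x_logits, class_logits, mu, logvar, beta=1.0):
    # Standard VAE evidence lower bound plus a supervised classification term;
    # beta trades off generation against classification (hypothetical knob).
    recon = F.binary_cross_entropy_with_logits(x_logits, x, reduction='sum')
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    clf = F.cross_entropy(class_logits, y, reduction='sum')
    return recon + kl + beta * clf
```

The Gaussian latent used here is itself a member of the exponential/Gibbs family of densities, p_theta(x) = exp(<theta, T(x)> - A(theta)), which is the class of probabilities the abstract's variational Pythagorean theorem singles out.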
