Towards universal neural nets: Gibbs machines and ACE

08/26/2015
by Galin Georgiev, et al.

We study, from a physics viewpoint, Gibbs machines: a class of generative neural nets designed for gradual learning. While they include variational auto-encoders as a special case, they offer a broader universal platform for incrementally adding newly learned features, including physical symmetries. Their direct connection to statistical physics and information geometry is established. A variational Pythagorean theorem justifies invoking the exponential/Gibbs class of probabilities for creating brand-new objects. Combining these nets with classifiers gives rise to a family of universal generative neural nets: stochastic auto-classifier-encoders (ACE). ACE achieve state-of-the-art performance in their class, for both classification and density estimation, on the MNIST data set.
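For context (standard background, not quoted from the paper): the exponential/Gibbs class referred to above is the family of distributions

    p_\beta(x) = \frac{e^{-\beta E(x)}}{Z(\beta)}, \qquad Z(\beta) = \sum_x e^{-\beta E(x)},

with energy function E and inverse temperature \beta, and the link to variational auto-encoders runs through the free-energy form of their training objective,

    \mathcal{F}[q] = \mathbb{E}_{q(z \mid x)}\left[-\log p(x \mid z)\right] + \mathrm{KL}\left(q(z \mid x) \,\|\, p(z)\right) \;\ge\; -\log p(x),

whose minimization over the approximate posterior q(z \mid x) is the same variational principle used in statistical physics to bound a free energy.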

