Parametrization of Neural Networks with Connected Abelian Lie Groups as Data Manifold

04/06/2020
by Luciano Melodia, et al.

Neural networks have been used in a wide range of scientific disciplines; nevertheless, their parameterization is largely unexplored. Dense networks can be viewed as coordinate transformations of the manifold from which the data is sampled. After passing through a layer, the representation of the original manifold may change. This is crucial for the preservation of its topological structure, which should therefore be parameterized correctly. We discuss a method to determine the smallest topology-preserving layer by treating the data domain as a connected abelian Lie group and observing that it decomposes into R^p × T^q. Persistent homology allows us to count its k-th homology groups, and Künneth's theorem then yields the k-th Betti numbers. Since we know the embedding dimensions of R^p and S^1, we parameterize the bottleneck layer with the smallest possible matrix group that can represent a manifold with those homology groups. ResNets guarantee smaller embeddings due to the dimension of their state-space representation.
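As a concrete illustration of the Künneth step described in the abstract: since R^p is contractible, the product R^p × T^q has the same homology as the q-torus T^q, whose k-th Betti number is the binomial coefficient C(q, k). A minimal sketch (the function name `betti_numbers` is our own, not from the paper):

```python
# Sketch: Betti numbers of R^p x T^q via the Künneth formula.
# R^p is contractible, so the product has the homology of the q-torus T^q,
# and the k-th Betti number of T^q is the binomial coefficient C(q, k).
from math import comb

def betti_numbers(p: int, q: int) -> list[int]:
    """Betti numbers b_0, ..., b_q of R^p x T^q.

    The factor R^p raises the embedding dimension but, being contractible,
    contributes nothing to homology, so p does not appear in the result.
    """
    return [comb(q, k) for k in range(q + 1)]

# Example: R^2 x T^3 has the Betti numbers of the 3-torus.
print(betti_numbers(2, 3))  # -> [1, 3, 3, 1]
```

These Betti numbers are exactly what persistent homology estimates from the data, and matching them constrains the smallest admissible bottleneck layer.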
