Linear Algebra and Duality of Neural Networks

09/12/2018
by Galin Georgiev, et al.

Bases, mappings, projections, and metrics natural for neural networks are constructed, and a graph-theoretical interpretation is offered. Non-Gaussianity emerges naturally, even in relatively simple datasets. Training statistics and hierarchies are discussed from a physics point of view, and the relationship between exact and numerical solutions is examined. A duality between observables and observations is established. Examples support all new concepts.
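The duality between observables and observations mentioned in the abstract can be illustrated with classical linear algebra (a minimal sketch, not the paper's construction): reading the rows of a data matrix X as observations and its columns as observables, the singular value decomposition supplies a pair of dual bases, dual metrics (the two Gram matrices, which share their nonzero spectrum), and matched projections.

    # Minimal sketch of observable/observation duality via the SVD.
    # This is a generic linear-algebra illustration, not the paper's method.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((6, 4))    # 6 observations of 4 observables

    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    # Dual Gram matrices: X X^T acts on observations, X^T X on observables;
    # both have the same nonzero eigenvalues s**2.
    obs_metric = X @ X.T               # 6x6 inner products of observations
    var_metric = X.T @ X               # 4x4 inner products of observables
    assert np.allclose(np.sort(np.linalg.eigvalsh(var_metric)), np.sort(s**2))

    # Projections: observations onto the leading observable direction v_1,
    # observables onto the leading observation direction u_1.
    proj_obs = X @ Vt[0]               # coordinates of observations on v_1
    proj_var = X.T @ U[:, 0]           # coordinates of observables on u_1

    # The two projections are tied together by the singular value:
    # X v_1 = s_1 u_1 and X^T u_1 = s_1 v_1.
    assert np.allclose(proj_obs, s[0] * U[:, 0])
    assert np.allclose(proj_var, s[0] * Vt[0])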

Related research

research · 12/11/2019
Bitopological Duality for Algebras of Fitting's logic and Natural Duality extension
In this paper, we investigate a bitopological duality for algebras of Fi...

research · 10/13/2021
Parallel Deep Neural Networks Have Zero Duality Gap
Training deep neural networks is a well-known highly non-convex problem....

research · 11/18/2020
An effective method for computing Grothendieck point residue mappings
Grothendieck point residue is considered in the context of computational...

research · 11/27/2020
A Galois Connection Approach to Wei-Type Duality Theorems
In 1991, Wei proved a duality theorem that established an interesting co...

research · 07/28/2017
Fractions, Projective Representation, Duality, Linear Algebra and Geometry
This contribution describes the relationship between fractions, projective r...

research · 02/18/2016
Toward Deeper Understanding of Neural Networks: The Power of Initialization and a Dual View on Expressivity
We develop a general duality between neural networks and compositional k...

research · 07/17/2019
Solving Systems of Linear Inequalities
Dantzig and Eaves claimed that fundamental duality theorems of linear pr...
