Neural network layers as parametric spans

08/01/2022
by Mattia G. Bergomi, et al.

Properties such as composability and automatic differentiation have made artificial neural networks a pervasive tool in applications. As networks tackle increasingly challenging problems, their architectures have grown progressively more complex and thus harder to define from a mathematical perspective. We present a general definition of linear layer arising from a categorical framework based on the notions of integration theory and parametric spans. This definition generalizes and encompasses classical layers (e.g., dense, convolutional) while guaranteeing the existence and computability of the layer's derivatives for backpropagation.
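
The paper states its definition in measure-theoretic terms; in the finite, discrete case it reduces to something easy to compute with: a span X <- E -> Y with legs p : E -> X and q : E -> Y, together with weights w indexed by E, induces the linear map (L_w f)(y) = Σ_{e : q(e) = y} w(e) · f(p(e)). The sketch below is a minimal NumPy illustration of this idea under that discrete reading, not the paper's code; the name span_layer and the particular constructions of E are ours. It checks that a dense layer (E = X × Y, one weight per pair) and a one-dimensional convolution (E indexed by output position and kernel offset, with weights shared across translations) both arise as instances of the same construction.

```python
import numpy as np

def span_layer(f, p, q, w, out_size):
    """Linear layer induced by a (hypothetical discrete) span X <-p- E -q-> Y.

    Computes g(y) = sum over e with q(e) = y of w(e) * f(p(e)).
    f : input signal on X (1-D array); p, q : integer index arrays over E;
    w : weights over E; out_size : size of Y.
    """
    g = np.zeros(out_size)
    # Pull back f along p, multiply by the weights w, push forward (sum) along q.
    np.add.at(g, q, w * f[p])
    return g

rng = np.random.default_rng(0)
nx, ny = 4, 3
f = rng.standard_normal(nx)

# Dense layer: E = X x Y, one independent weight per (x, y) pair.
xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
w_dense = rng.standard_normal(nx * ny)
g_dense = span_layer(f, xs.ravel(), ys.ravel(), w_dense, ny)
assert np.allclose(g_dense, w_dense.reshape(nx, ny).T @ f)

# 1-D convolution: E = {(y + k, y)}, where the weight w(e) depends only on
# the offset k, i.e. weights are shared across translations.
ksize = 2
ny_conv = nx - ksize + 1
ey, ek = np.meshgrid(np.arange(ny_conv), np.arange(ksize), indexing="ij")
kernel = rng.standard_normal(ksize)
g_conv = span_layer(f, (ey + ek).ravel(), ey.ravel(),
                    np.tile(kernel, ny_conv), ny_conv)
assert np.allclose(g_conv, np.convolve(f, kernel[::-1], mode="valid"))
```

Because L_w is linear in both the weights w and the input f, the derivatives needed for backpropagation exist by construction; in the discrete sketch above they are just the transposed gather/scatter operations.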

Related research

research · 04/27/2022
Machines of finite depth: towards a formalization of neural networks
We provide a unifying framework where artificial neural networks and the...

research · 10/05/2019
Minimum "Norm" Neural Networks are Splines
We develop a general framework based on splines to understand the interp...

research · 12/31/2012
Training a Functional Link Neural Network Using an Artificial Bee Colony for Solving a Classification Problems
Artificial Neural Networks have emerged as an important tool for classif...

research · 07/06/2020
Parametric machines: a fresh approach to architecture search
Using tools from category theory, we provide a framework where artificia...

research · 07/02/2018
CoCalc as a Learning Tool for Neural Network Simulation in the Special Course "Foundations of Mathematic Informatics"
The role of neural network modeling in the learning content of the speci...

research · 02/17/2023
Highly connected dynamic artificial neural networks
An object-oriented approach to implementing artificial neural networks i...

research · 11/28/2019
Two Formal Systems of the λδ Family Revised
We present the framework λδ-2B that significantly improves and generaliz...
