
Parametric machines: a fresh approach to architecture search
Using tools from category theory, we provide a framework where artificial neural networks, and their architectures, can be formally described. We first define the notion of machine in a general categorical context, and show how simple machines can be combined into more complex ones. We explore finite and infinite-depth machines, which generalize neural networks and neural ordinary differential equations. Borrowing ideas from functional analysis and kernel methods, we build complete, normed, infinite-dimensional spaces of machines, and discuss how to find optimal architectures and parameters, within those spaces, to solve a given computational problem. In our numerical experiments, these kernel-inspired networks can outperform classical neural networks when the training dataset is small.
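The idea that simple machines compose into more complex, finite-depth ones can be illustrated with a minimal sketch. This is not the paper's categorical formalism: the names `dense` and `compose` are illustrative, and a "machine" is modeled here as nothing more than a function from inputs to outputs.

```python
import numpy as np

def dense(W, b):
    """A simple machine: an affine map followed by a pointwise nonlinearity."""
    return lambda x: np.tanh(W @ x + b)

def compose(*machines):
    """Combine simple machines into a more complex, finite-depth one."""
    def composed(x):
        for m in machines:
            x = m(x)
        return x
    return composed

rng = np.random.default_rng(0)
layers = [dense(rng.normal(size=(3, 3)), rng.normal(size=3)) for _ in range(4)]
net = compose(*layers)

y = net(np.ones(3))
print(y.shape)  # (3,)
```

In this toy reading, an infinite-depth machine would replace the discrete `for` loop over layers with the continuous-time flow of a neural ordinary differential equation.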