Operational calculus on programming spaces

10/25/2016
by Žiga Sajovic et al.

In this paper we develop an operational calculus on programming spaces that generalizes existing approaches to automatic differentiation of computer programs and provides a rigorous framework for program analysis through calculus. We present an abstract computing machine that models automatically differentiable computer programs. Computer programs are viewed as maps on a finite-dimensional vector space, the virtual memory space, which we extend by the tensor algebra of its dual to accommodate derivatives. The extended virtual memory is itself an algebra of programs, a data structure one can calculate with, and its elements give the expansion of the original program as an infinite tensor series at the program's input values. We define the operator of differentiation on programming spaces and implement a generalized shift operator in terms of its powers. Our approach offers a powerful tool for program analysis and approximation, and provides deep learning with a formal calculus. Such a calculus connects general programs with deep learning through operators that map both formulations to the same space. This equivalence enables a generalization of existing methods for neural network analysis to any computer program, and vice versa. Several applications are presented, most notably a principled method of neural network initialization that leads to a process of program boosting.
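A first-order sketch may help make the construction concrete: the program's memory is paired with components in the dual space, an ordinary program then acts on this extended memory, and the first power of the shift operator recovers a linear approximation of the program at shifted inputs. The snippet below is an illustrative assumption rather than the paper's implementation; the class Ext, the function program, and the input names are hypothetical, and only the first-order term of the tensor series is carried.

```python
# Illustrative sketch (not the authors' implementation): memory values extended
# with first-order dual components, so a program computes its value and its
# gradient with respect to the inputs in one pass.
import math

class Ext:
    """A value in the (first-order) extended memory: the value itself plus
    its gradient components, keyed by input name."""
    def __init__(self, value, grad):
        self.value = value
        self.grad = dict(grad)

    def _combine(self, other, value, dself, dother):
        keys = set(self.grad) | set(other.grad)
        grad = {k: dself * self.grad.get(k, 0.0) + dother * other.grad.get(k, 0.0)
                for k in keys}
        return Ext(value, grad)

    def __add__(self, other):
        # d(u + v) = du + dv
        return self._combine(other, self.value + other.value, 1.0, 1.0)

    def __mul__(self, other):
        # d(u * v) = v du + u dv
        return self._combine(other, self.value * other.value, other.value, self.value)

def sin(x):
    # elementary function lifted to the extended memory
    return Ext(math.sin(x.value), {k: math.cos(x.value) * v for k, v in x.grad.items()})

def program(x, y):
    # an ordinary program, now acting on extended memory
    return sin(x * y) + x

# seed the inputs as coordinate functions on the memory space
x = Ext(1.2, {"x": 1.0})
y = Ext(0.7, {"y": 1.0})
out = program(x, y)

# first-order shift: approximate the program at slightly moved inputs
h = {"x": 0.01, "y": -0.02}
shifted = out.value + sum(out.grad[k] * h[k] for k in h)
print(out.value, out.grad, shifted)
```

Carrying higher tensor powers of the dual in place of the single gradient dictionary would extend this sketch toward the full tensor series and the generalized shift operator described in the abstract.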

Related research

12/08/2016 · Implementing Operational calculus on programming spaces for Differentiable computing
We provide an illustrative implementation of an analytic, infinitely-dif...

09/27/2019 · Backpropagation in the Simply Typed Lambda-calculus with Linear Negation
Backpropagation is a classic automatic differentiation algorithm computi...

12/08/2016 · Automatic Differentiation: a look through Tensor and Operational Calculus
In this paper we take a look at Automatic Differentiation through the ey...

04/28/2014 · Automatic Differentiation of Algorithms for Machine Learning
Automatic differentiation---the mechanical transformation of numeric com...

06/24/2021 · An implementation of flow calculus for complexity analysis (tool paper)
We present a tool to automatically perform the data-size analy...

10/07/2020 · A Simple and Efficient Tensor Calculus for Machine Learning
Computing derivatives of tensor expressions, also known as tensor calcul...

01/11/2018 · Review of theory and implementation of hyper-dual numbers for first and second order automatic differentiation
In this review we present hyper-dual numbers as a tool for the automatic...
