Reverse Derivative Ascent: A Categorical Approach to Learning Boolean Circuits

by Paul Wilson, et al.

We introduce Reverse Derivative Ascent: a categorical analogue of gradient-based methods for machine learning. Our algorithm is defined at the level of so-called reverse differential categories, and it can be used to learn the parameters of models which are expressed as morphisms of such categories. Our motivating example is boolean circuits: we show how our algorithm can be applied to such circuits by using the theory of reverse differential categories. Note that our methodology allows us to learn the parameters of boolean circuits directly, in contrast to existing binarised neural network approaches. Moreover, we demonstrate its empirical value by giving experimental results on benchmark machine learning datasets.
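To make the idea concrete, here is a minimal sketch of the reverse-derivative-ascent loop on a toy problem. The model, the data, and the exact update rule are illustrative assumptions, not the paper's construction: we take the Z2 polynomial f(p, x) = (p0 AND x0) XOR (p1 AND x1) XOR p2, whose reverse derivative with respect to the parameters (the transposed Jacobian over Z2, applied to an output change dy) is (x0·dy, x1·dy, dy), and flip parameter bits along that direction.

```python
import random

def predict(p, x):
    """Evaluate f(p, x) = (p0 AND x0) XOR (p1 AND x1) XOR p2 over Z2 (bits as 0/1)."""
    return (p[0] & x[0]) ^ (p[1] & x[1]) ^ p[2]

def reverse_derivative(x, dy):
    """R[f] restricted to the parameters: dp_i = (df/dp_i) * dy over Z2."""
    return (x[0] & dy, x[1] & dy, dy)

def rda(examples, steps=10_000, seed=0):
    """Illustrative reverse derivative ascent: flip parameter bits along R[f]."""
    rng = random.Random(seed)
    p = (0, 0, 0)
    for _ in range(steps):
        if all(predict(p, x) == y for x, y in examples):
            break  # every training example is classified correctly
        x, y = examples[rng.randrange(len(examples))]
        dy = predict(p, x) ^ y                        # Z2 error signal
        dp = reverse_derivative(x, dy)                # parameter cotangent
        p = tuple(pi ^ di for pi, di in zip(p, dp))   # additive update over Z2
    return p

# Target: XOR of the two input bits, realisable here with p = (1, 1, 0).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
p = rda(data)
print(p, [predict(p, x) for x, _ in data])
```

Because the parameters are bits, the "ascent" step is a bit flip rather than a continuous increment; this is the sense in which the method learns boolean circuit parameters directly rather than through a real-valued relaxation.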


Reverse derivative categories

The reverse derivative is a fundamental operation in machine learning an...

Categorical Foundations of Gradient-Based Learning

We propose a categorical foundation of gradient-based machine learning a...

Categories of Differentiable Polynomial Circuits for Machine Learning

Reverse derivative categories (RDCs) have recently been shown to be a su...

Mapping finite state machines to zk-SNARKS Using Category Theory

We provide a categorical procedure to turn graphs corresponding to state...

Monoidal Reverse Differential Categories

Cartesian reverse differential categories (CRDCs) are a recently defined...

Learning Logistic Circuits

This paper proposes a new classification model called logistic circuits....

Reverse Back Propagation to Make Full Use of Derivative

The development of the back-propagation algorithm represents a landmark ...