
Learning Functors using Gradient Descent

by Bruno Gavranović et al.

Neural networks are a general framework for differentiable optimization that includes many other machine learning approaches as special cases. In this paper we build a category-theoretic formalism around the neural network system CycleGAN, a general approach to unpaired image-to-image translation that has attracted considerable attention in recent years. Inspired by categorical database systems, we show that CycleGAN is a "schema", i.e. a specific category presented by generators and relations, whose parameter instantiations are precisely set-valued functors on this schema. We show that enforcing cycle-consistency amounts to enforcing composition invariants in this category. We generalize the learning procedure to arbitrary such categories and show that a special class of functors, rather than mere functions, can be learned using gradient descent. Using this framework, we design a novel neural network system capable of learning to insert and delete objects from images without paired data. We evaluate the system qualitatively on the CelebA dataset and obtain promising results.
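The abstract's central idea can be sketched concretely. The following is a minimal illustration, not the paper's implementation: the CycleGAN schema has two objects X and Y with generator morphisms g: X → Y and f: Y → X, a parameter instantiation assigns a concrete map to each morphism, and the cycle-consistency loss penalizes violation of the composition invariant f ∘ g = id_X. Here linear maps stand in for the neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Instantiation of the generating morphisms as linear maps (stand-in "networks").
G = rng.normal(size=(3, 3))  # assigns a map to g: X -> Y
F = np.linalg.inv(G)         # assigns a map to f: Y -> X, chosen so the cycle closes

x = rng.normal(size=3)       # a sample point in X

# Cycle-consistency enforces the composition invariant f . g = id_X:
# F(G(x)) should recover x. Training minimizes the squared violation.
cycle_loss = float(np.sum((F @ (G @ x) - x) ** 2))
```

In an actual CycleGAN, F and G are neural networks and the invariant holds only approximately, so the cycle loss is added to the adversarial objective and driven toward zero by gradient descent rather than satisfied exactly by construction.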

