Learning Functors using Gradient Descent

09/15/2020
by Bruno Gavranović, et al.

Neural networks are a general framework for differentiable optimization that includes many other machine learning approaches as special cases. In this paper we build a category-theoretic formalism around a neural network system called CycleGAN. CycleGAN is a general approach to unpaired image-to-image translation that has been attracting attention in recent years. Inspired by categorical database systems, we show that CycleGAN is a "schema", i.e., a specific category presented by generators and relations, whose parameter instantiations are set-valued functors on this schema. We show that enforcing cycle-consistencies amounts to enforcing composition invariants in this category. We generalize the learning procedure to arbitrary such categories and show that a special class of functors, rather than functions, can be learned using gradient descent. Using this framework we design a novel neural network system capable of learning to insert and delete objects from images without paired data. We qualitatively evaluate the system on the CelebA dataset and obtain promising results.
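To make the construction concrete, consider a schema with two objects A and B, generating morphisms f: A → B and g: B → A, and composition relations g ∘ f = id_A and f ∘ g = id_B. A parameter instantiation (a set-valued functor) sends A and B to spaces of images and sends f and g to neural networks; gradient descent then enforces the composition relations as cycle-consistency losses. The PyTorch sketch below is a minimal, hypothetical illustration under these assumptions: the toy 1-D data, network architectures, and plain L2 penalties are stand-ins, not the paper's actual code, and the adversarial terms of the full CycleGAN-style system are omitted.

    # Minimal sketch: learning a functor on a two-object schema with
    # relations g.f = id_A and f.g = id_B, enforced by gradient descent.
    import torch
    import torch.nn as nn

    dim = 64  # hypothetical dimensionality of the sets F(A) and F(B)

    # The functor F sends the generating morphisms f: A -> B and g: B -> A
    # to parameterized maps between F(A) and F(B).
    F_f = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
    F_g = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    opt = torch.optim.Adam(
        list(F_f.parameters()) + list(F_g.parameters()), lr=1e-3
    )

    # Unpaired samples from F(A) and F(B) (random stand-ins here).
    a = torch.randn(128, dim)
    b = torch.randn(128, dim)

    for step in range(1000):
        # Composition invariants of the schema, enforced approximately
        # as cycle-consistency (reconstruction) losses.
        loss_a = nn.functional.mse_loss(F_g(F_f(a)), a)  # g.f ~ id_A
        loss_b = nn.functional.mse_loss(F_f(F_g(b)), b)  # f.g ~ id_B
        loss = loss_a + loss_b
        opt.zero_grad()
        loss.backward()
        opt.step()

In the full system these reconstruction terms would be combined with adversarial losses so that, for example, F_f(a) matches the data distribution on F(B); that part is independent of the categorical bookkeeping and is left out of this sketch.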

Related research

07/16/2019  Compositional Deep Learning
Neural networks have become an increasingly popular tool for solving man...

10/03/2018  Optimization Algorithm Inspired Deep Neural Network Structure Design
Deep neural networks have been one of the dominant machine learning appr...

06/06/2018  Unsupervised Attention-guided Image to Image Translation
Current unsupervised image-to-image translation techniques struggle to f...

03/29/2022  Semi-Supervised Image-to-Image Translation using Latent Space Mapping
Recent image-to-image translation works have been transferred from super...

05/26/2022  A framework for overparameterized learning
An explanation for the success of deep neural networks is a central ques...

09/20/2021  Generalized Optimization: A First Step Towards Category Theoretic Learning Theory
The Cartesian reverse derivative is a categorical generalization of reve...

12/14/2017  Nonparametric Neural Networks
Automatically determining the optimal size of a neural network for a giv...
