Convolution by Evolution: Differentiable Pattern Producing Networks

06/08/2016
by   Chrisantha Fernando, et al.

In this work we introduce a differentiable version of the Compositional Pattern Producing Network (CPPN), called the DPPN. Unlike a standard CPPN, the topology of a DPPN is evolved while its weights are learned by gradient descent. A Lamarckian algorithm that combines evolution and learning produces DPPNs that reconstruct an image. Our main result is that DPPNs can be evolved and trained to compress the weights of a denoising autoencoder from 157,684 to roughly 200 parameters, while achieving reconstruction accuracy comparable to that of a fully connected network with more than two orders of magnitude more parameters. The regularization imposed by the DPPN allows it to rediscover approximate convolutional network architectures embedded within a fully connected architecture. Convolutional architectures are the current state of the art for many computer vision applications, so it is satisfying that DPPNs are capable of discovering this structure rather than having it built in by design. After training on MNIST, DPPNs also generalize better to the Omniglot dataset than directly encoded fully connected autoencoders. DPPNs are therefore a new framework for integrating learning and evolution.


