Deep Learning with a Classifier System: Initial Results

03/01/2021
by Richard J. Preen, et al.

This article presents the first results from using a learning classifier system capable of performing adaptive computation with deep neural networks. Individual classifiers within the population are composed of two neural networks. The first acts as a gating or guarding component, which enables the conditional computation of an associated deep neural network on a per-instance basis. Self-adaptive mutation is applied upon reproduction, and prediction networks are refined with stochastic gradient descent during lifetime learning. The use of fully-connected and convolutional layers is evaluated on handwritten digit recognition tasks, where evolution adapts (i) the gradient descent learning rate applied to each layer; (ii) the number of units within each layer, i.e., the number of fully-connected neurons and the number of convolutional kernel filters; (iii) the connectivity of each layer, i.e., whether each weight is active; and (iv) the weight magnitudes, enabling escape from local optima. The system automatically reduces the number of weights and units while maintaining performance after achieving a maximum prediction error.
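The core mechanism described above can be summarised with a small illustrative sketch. The following NumPy example is not the authors' implementation: it pairs a gating network, which decides per instance whether a classifier matches, with a prediction network refined by stochastic gradient descent during lifetime learning, and applies self-adaptive mutation upon reproduction. The single-logistic-unit gate, the one-hidden-layer prediction network, the 0.5 matching threshold, the log-normal self-adaptation of the mutation rate and the squared-error SGD step are assumptions made for illustration; the evolution of layer sizes and connectivity masks described in the abstract is omitted for brevity.

"""Illustrative sketch (not the authors' code) of a classifier whose
condition is a gating network and whose prediction is a small neural
network refined by SGD, with self-adaptive mutation on reproduction."""
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Classifier:
    def __init__(self, n_inputs, n_hidden, n_outputs, mu=0.1, eta=0.01):
        # Gating (condition) network: a single logistic unit deciding whether
        # this classifier matches, i.e. whether its prediction net is run.
        self.gate_w = rng.normal(0, 0.1, n_inputs)
        self.gate_b = 0.0
        # Prediction network: one hidden layer, trained by SGD during
        # lifetime learning.
        self.W1 = rng.normal(0, 0.1, (n_hidden, n_inputs))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_outputs, n_hidden))
        self.b2 = np.zeros(n_outputs)
        self.mu = mu    # self-adaptive mutation rate
        self.eta = eta  # per-classifier SGD learning rate (also mutated)

    def matches(self, x):
        # Conditional computation: the prediction net runs only when the
        # gating net fires for this instance (0.5 threshold is assumed).
        return sigmoid(self.gate_w @ x + self.gate_b) > 0.5

    def predict(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        return self.W2 @ h + self.b2

    def update(self, x, y):
        # One SGD step on the prediction network (squared error loss).
        h = np.tanh(self.W1 @ x + self.b1)
        err = (self.W2 @ h + self.b2) - y
        dh = (self.W2.T @ err) * (1.0 - h ** 2)
        self.W2 -= self.eta * np.outer(err, h)
        self.b2 -= self.eta * err
        self.W1 -= self.eta * np.outer(dh, x)
        self.b1 -= self.eta * dh

    def reproduce(self):
        # Copy with self-adaptive mutation: the mutation rate itself is
        # perturbed first, then the gate, weights and learning rate are
        # mutated using the child's new rate.
        child = Classifier.__new__(Classifier)
        child.__dict__ = {k: np.copy(v) if isinstance(v, np.ndarray) else v
                          for k, v in self.__dict__.items()}
        child.mu = float(np.clip(self.mu * np.exp(rng.normal(0, 0.1)), 1e-4, 1.0))
        child.eta = float(np.clip(self.eta * np.exp(rng.normal(0, child.mu)), 1e-5, 0.1))
        for w in (child.gate_w, child.W1, child.b1, child.W2, child.b2):
            mask = rng.random(w.shape) < child.mu
            w += mask * rng.normal(0, 0.1, w.shape)
        return child

# Usage: only matching classifiers are run and refined for an instance.
x, y = rng.normal(size=8), np.array([1.0])
pop = [Classifier(8, 4, 1) for _ in range(10)]
match_set = [cl for cl in pop if cl.matches(x)]
preds = [cl.predict(x) for cl in match_set]   # conditional computation
for cl in match_set:
    cl.update(x, y)                           # lifetime learning via SGD
offspring = [cl.reproduce() for cl in match_set[:2]]  # evolutionary search

In this sketch, an instance is only propagated through the prediction networks of classifiers whose gates fire, so inference cost scales with the size of the match set rather than with the whole population.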
