Neural networks adapting to datasets: learning network size and topology

06/22/2020
by Romuald A. Janik et al.

We introduce a flexible setup that allows a neural network to learn both its size and topology during the course of standard gradient-based training. The resulting network has the structure of a graph tailored to the particular learning task and dataset. The obtained networks can also be trained from scratch and achieve virtually identical performance. We explore the properties of the resulting network architectures for a number of datasets of varying difficulty and observe systematic regularities. The obtained graphs can therefore be understood as encoding nontrivial characteristics of the particular classification tasks.
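The abstract does not spell out the mechanism, so the following is only a minimal PyTorch sketch of one common way such behaviour can be realized: each potential connection carries a learnable gate, and an L1 penalty added to the task loss drives unneeded gates toward zero during ordinary gradient training, so the surviving connections define the learned size and topology. All names here (GatedLinear, AdaptiveNet, lambda_sparse) and the gate-plus-L1 scheme are illustrative assumptions, not the authors' actual construction.

```python
# Hypothetical sketch: learnable per-connection gates with an L1 sparsity
# penalty, so gradient training itself shrinks the network's size/topology.
import torch
import torch.nn as nn

class GatedLinear(nn.Module):
    """Linear layer whose individual weights can be switched off by gates."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.gate = nn.Parameter(torch.ones(out_features, in_features))

    def forward(self, x):
        # Effective weight = raw weight * gate; gates near zero cut the edge.
        return x @ (self.weight * self.gate).t()

class AdaptiveNet(nn.Module):
    def __init__(self, sizes=(784, 256, 256, 10)):
        super().__init__()
        self.layers = nn.ModuleList(
            [GatedLinear(a, b) for a, b in zip(sizes[:-1], sizes[1:])]
        )

    def forward(self, x):
        for layer in self.layers[:-1]:
            x = torch.relu(layer(x))
        return self.layers[-1](x)

    def sparsity_penalty(self):
        # L1 penalty on the gates encourages connections (and whole units) to vanish.
        return sum(layer.gate.abs().sum() for layer in self.layers)

model = AdaptiveNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lambda_sparse = 1e-4  # assumed hyperparameter controlling how aggressively topology shrinks

def train_step(x, y):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    loss = loss + lambda_sparse * model.sparsity_penalty()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with random data, just to show the training loop runs.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
print(train_step(x, y))
```

After training, gates whose magnitude falls below a threshold can be pruned, and the remaining connections form a task-specific graph; the abstract notes that such extracted architectures can be retrained from scratch with virtually identical performance.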

Related research

07/24/2019 · Knowledge transfer in deep block-modular neural networks
Although deep neural networks (DNNs) have demonstrated impressive result...

06/20/2016 · Neural networks with differentiable structure
While gradient descent has proven highly successful in learning connecti...

07/03/2019 · Neural Network Architecture Search with Differentiable Cartesian Genetic Programming for Regression
The ability to design complex neural network architectures which enable ...

06/14/2018 · Insights on representational similarity in neural networks with canonical correlation
Comparing different neural network representations and determining how r...

10/05/2022 · Dynamical systems' based neural networks
Neural networks have gained much interest because of their effectiveness...

05/11/2020 · Ring Reservoir Neural Networks for Graphs
Machine Learning for graphs is nowadays a research topic of consolidated...

07/11/2018 · Morse Code Datasets for Machine Learning
We present an algorithm to generate synthetic datasets of tunable diffic...
