A developmental approach for training deep belief networks

07/12/2022
by Matteo Zambra, et al.

Deep belief networks (DBNs) are stochastic neural networks that can extract rich internal representations of the environment from sensory data. DBNs had a catalytic effect in triggering the deep learning revolution, demonstrating for the first time the feasibility of unsupervised learning in networks with many layers of hidden neurons. These hierarchical architectures incorporate plausible biological and cognitive properties, making them particularly appealing as computational models of human perception and cognition. However, learning in DBNs is usually carried out in a greedy, layer-wise fashion, which makes it impossible to simulate the holistic maturation of cortical circuits and rules out modeling cognitive development. Here we present iDBN, an iterative learning algorithm for DBNs that jointly updates the connection weights across all layers of the model. We evaluate the proposed iterative algorithm on two different sets of visual stimuli, measuring the generative capabilities of the learned model and its potential to support supervised downstream tasks. We also track network development in terms of graph-theoretical properties and investigate the potential extension of iDBN to continual learning scenarios. DBNs trained with our iterative approach achieve a final performance comparable to that of their greedy counterparts, while also allowing us to accurately analyze the gradual development of internal representations in the deep network and the progressive improvement in task performance. Our work paves the way to the use of iDBN for modeling neurocognitive development.
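To make the contrast with greedy training concrete, here is a minimal sketch of what joint, iterative DBN training could look like, assuming Bernoulli RBM layers trained with one step of contrastive divergence (CD-1). The RBM class, the train_idbn function, and the learning rate and layer sizes are illustrative assumptions, not the authors' implementation. On every mini-batch, the input is propagated upward and every layer receives a weight update in the same sweep.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM updated with one step of contrastive divergence (CD-1)."""
    def __init__(self, n_vis, n_hid, lr=0.05):
        self.W = rng.normal(0.0, 0.01, size=(n_vis, n_hid))
        self.b = np.zeros(n_vis)   # visible biases
        self.c = np.zeros(n_hid)   # hidden biases
        self.lr = lr

    def up(self, v):
        return sigmoid(v @ self.W + self.c)

    def down(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1(self, v0):
        # Positive phase: hidden activations driven by the data.
        h0 = self.up(v0)
        # Negative phase: one Gibbs step (sample hiddens, reconstruct, re-infer).
        h_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.down(h_sample)
        h1 = self.up(v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (h0 - h1).mean(axis=0)
        return h0  # activations fed to the layer above

def train_idbn(layers, data, epochs, batch_size=64):
    """Iterative DBN training: for each mini-batch, propagate upward and
    update ALL layers in the same sweep, so the whole hierarchy matures
    jointly rather than one frozen layer at a time."""
    for _ in range(epochs):
        for i in range(0, len(data), batch_size):
            x = data[i:i + batch_size]
            for rbm in layers:  # joint update across the stack
                x = rbm.cd1(x)

# Toy usage on random binary "images" (784 pixels, as in MNIST-sized input).
dbn = [RBM(784, 500), RBM(500, 500), RBM(500, 200)]
toy = (rng.random((256, 784)) < 0.5).astype(float)
train_idbn(dbn, toy, epochs=5)
```

In a greedy variant, a loop over layers would sit outside the loop over epochs, fully training and freezing each RBM before moving to the next; in the iterative scheme sketched above, all layers develop in parallel, which is what makes it possible to track the gradual emergence of internal representations during learning.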


