StochasticNet: Forming Deep Neural Networks via Stochastic Connectivity

08/22/2015
by Mohammad Javad Shafiee, et al.

Deep neural networks are a branch of machine learning that has seen a meteoric rise in popularity due to its powerful ability to represent and model high-level abstractions in highly complex data. One area of deep neural networks that is ripe for exploration is neural connectivity formation. A pivotal study on the brain tissue of rats found that synaptic formation for specific functional connectivity in neocortical neural microcircuits can be surprisingly well modeled and predicted as a random formation. Motivated by this intriguing finding, we introduce the concept of StochasticNet, where deep neural networks are formed via stochastic connectivity between neurons. As a result, any type of deep neural network can be formed as a StochasticNet by allowing the neuron connectivity to be stochastic. Stochastic synaptic formation in a deep neural network architecture can allow for efficient utilization of neurons for performing specific tasks. To evaluate the feasibility of such a deep neural network architecture, we train StochasticNets on four different image datasets (CIFAR-10, MNIST, SVHN, and STL-10). Experimental results show that a StochasticNet, using less than half the number of neural connections of a conventional deep neural network, achieves comparable accuracy and reduces overfitting on the CIFAR-10, MNIST, and SVHN datasets. Interestingly, on STL-10, a StochasticNet with less than half the number of neural connections achieved a higher accuracy (a relative improvement in test error rate of 6%) than a conventional deep neural network. Finally, StochasticNets have faster operational speeds while achieving better or comparable accuracy.
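The core idea described above, realizing each potential synapse at random and then keeping that sparse connectivity fixed, can be sketched as a random binary mask over a dense weight matrix. The following is a minimal NumPy illustration, not the authors' implementation; the connection probability `p`, layer sizes, and initialization scale are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_layer(n_in, n_out, p=0.5):
    """Form a layer whose connectivity is decided stochastically:
    each potential synapse is realized independently with probability p."""
    mask = rng.random((n_in, n_out)) < p               # random connectivity
    weights = rng.standard_normal((n_in, n_out)) * 0.1  # illustrative init
    return weights, mask

def forward(x, weights, mask):
    # Only the stochastically formed connections contribute. Unlike dropout,
    # the mask is fixed after formation, so this is a sparse network whose
    # topology does not change between passes.
    return np.maximum(x @ (weights * mask), 0.0)        # ReLU activation

# A layer with roughly half the connections of its dense counterpart
W, M = stochastic_layer(784, 128, p=0.5)
x = rng.random((1, 784))
y = forward(x, W, M)
```

With `p=0.5`, the masked layer carries about half the multiply-accumulate work of a dense layer of the same shape, which is the intuition behind the reported speedups when the sparsity is actually exploited.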


