CrossNets : A New Approach to Complex Learning

05/21/2017
by Chirag Agarwal et al.

We propose a novel neural network structure called CrossNets, which considers architectures defined on directed acyclic graphs. This structure generalizes feed-forward models such as ResNets by allowing all forward cross connections between layers, both adjacent and non-adjacent. These cross connections increase information flow throughout the network, leading to better training and testing performance. We evaluate CrossNets on four benchmark datasets: MNIST, CIFAR-10, CIFAR-100, and SVHN. We conclude with a proof that CrossNets converge to a local minimum of the error when connection weights are learned through backpropagation with momentum.
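The core architectural idea, a layer receiving forward cross connections from every earlier layer rather than only its immediate predecessor, can be sketched as below. This is a minimal toy illustration, not the paper's implementation; the class name, layer sizes, activation choice, and weight initialization are all assumptions for demonstration.

```python
import numpy as np


def relu(x):
    return np.maximum(0.0, x)


class CrossNetSketch:
    """Toy dense network on a DAG: each layer j receives summed
    contributions from *all* earlier layers i < j (forward cross
    connections), not just from layer j - 1. Illustrative only."""

    def __init__(self, sizes, seed=0):
        rng = np.random.default_rng(seed)
        self.sizes = sizes
        # W[j][i]: weight matrix from layer i's output to layer j's
        # input, for every i < j -- one matrix per cross connection.
        self.W = {
            j: {i: rng.standard_normal((sizes[i], sizes[j])) * 0.1
                for i in range(j)}
            for j in range(1, len(sizes))
        }

    def forward(self, x):
        outputs = [x]  # layer 0 is the input itself
        for j in range(1, len(self.sizes)):
            # Sum the contribution of every earlier layer i < j.
            pre = sum(outputs[i] @ self.W[j][i] for i in range(j))
            outputs.append(relu(pre))
        return outputs[-1]


net = CrossNetSketch([4, 8, 8, 3])
y = net.forward(np.ones((2, 4)))
print(y.shape)  # batch of 2 inputs mapped to 3 outputs
```

With four layers (input plus three hidden/output layers), the DAG has 1 + 2 + 3 = 6 forward connections instead of the 3 a plain feed-forward chain would have; in the paper these extra paths are what increase information flow across the network.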


