Understanding Locally Competitive Networks

10/05/2014
by Rupesh Kumar Srivastava, et al.

Recently proposed neural network activation functions such as rectified linear, maxout, and local winner-take-all have allowed for faster and more effective training of deep neural architectures on large and complex datasets. The common trait among these functions is that they implement local competition between small groups of computational units within a layer, so that only part of the network is activated for any given input pattern. In this paper, we attempt to visualize and understand this self-modularization, and suggest a unified explanation for the beneficial properties of such networks. We also show how our insights can be directly useful for efficiently performing retrieval over large datasets using neural networks.
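To make the idea of local competition concrete, here is a minimal NumPy sketch of two of the activation functions the abstract mentions, maxout and local winner-take-all (LWTA). The function names, the `group_size` parameter, and the reshaping scheme are illustrative assumptions, not the paper's reference implementation; both functions partition a layer's pre-activations into small groups and let the units within each group compete.

```python
import numpy as np

def maxout(z, group_size=2):
    # Maxout: each group of `group_size` units outputs only its maximum,
    # shrinking the layer width by a factor of `group_size`.
    # z: pre-activations with shape (..., n_units), n_units divisible by group_size.
    groups = z.reshape(*z.shape[:-1], -1, group_size)
    return groups.max(axis=-1)

def lwta(z, group_size=2):
    # Local winner-take-all: within each group, the winning unit passes its
    # value through and the losers are zeroed, so only part of the layer is
    # active for any given input (ties, if any, all pass through here).
    groups = z.reshape(*z.shape[:-1], -1, group_size)
    winners = groups == groups.max(axis=-1, keepdims=True)
    return (groups * winners).reshape(z.shape)

z = np.array([[1.0, 3.0, -2.0, 0.5]])
print(maxout(z))  # [[3.  0.5]]
print(lwta(z))    # [[0.  3.  0.  0.5]]
```

Note how in the LWTA case the layer keeps its width but produces a sparse, input-dependent activation pattern: a different subset of units "wins" for different inputs, which is the self-modularization the paper sets out to visualize.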

Related research

- Most Activation Functions Can Win the Lottery Without Excessive Depth (05/04/2022)
- Adaptive Blending Units: Trainable Activation Functions for Deep Neural Networks (06/26/2018)
- A survey on recently proposed activation functions for Deep Learning (04/06/2022)
- Reducing the Computational Burden of Deep Learning with Recursive Local Representation Alignment (02/10/2020)
- The Local Elasticity of Neural Networks (10/15/2019)
- On the Complexity of Learning Neural Networks (07/14/2017)
- A Data-Centric Approach for Training Deep Neural Networks with Less Data (10/07/2021)
