On Correlation of Features Extracted by Deep Neural Networks

01/30/2019
by Babajide O. Ayinde, et al.

Redundancy in deep neural network (DNN) models has long been one of their most intriguing and important properties. DNNs have been shown to overparameterize, or extract many redundant features. In this work, we explore the impact of network size (both width and depth), activation function, and weight initialization on the susceptibility of deep neural network models to extracting redundant features. To estimate the number of redundant features in each layer, all features of a given layer are hierarchically clustered according to their relative cosine distances in feature space, using a set threshold. We show that network size and activation function are the two most important factors fostering the tendency of DNNs to extract redundant features. The concept is illustrated using deep multilayer perceptrons and convolutional neural networks on MNIST digit recognition and the CIFAR-10 dataset, respectively.
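The clustering step described in the abstract lends itself to a short illustration. The sketch below is not the authors' code: it clusters one layer's feature vectors by pairwise cosine distance with SciPy's agglomerative hierarchical clustering and counts how many features collapse into shared clusters under a chosen cutoff. The "average" linkage, the plain (rather than relative) cosine distance, and the threshold value 0.3 are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a per-layer redundancy estimate: features whose
# pairwise cosine distance falls below a threshold are merged into one
# cluster, and the redundant-feature count is the number of features
# minus the number of distinct clusters. Assumptions (not from the
# paper): average linkage, plain cosine distance, threshold = 0.3.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def estimate_redundant_features(weights, threshold=0.3):
    """Estimate the number of redundant features in one layer.

    weights   : (num_features, fan_in) array, one row per feature
                (e.g., a flattened conv filter or a dense-layer row).
    threshold : cosine-distance cutoff below which two features are
                treated as duplicates (illustrative value).
    """
    # Condensed matrix of pairwise cosine distances between features.
    dists = pdist(weights, metric="cosine")
    # Agglomerative (hierarchical) clustering over those distances.
    tree = linkage(dists, method="average")
    # Cut the dendrogram at the threshold: rows assigned the same
    # label are closer than `threshold` in cosine distance.
    labels = fcluster(tree, t=threshold, criterion="distance")
    num_clusters = len(np.unique(labels))
    return weights.shape[0] - num_clusters

# Toy check: 64 random 9-dimensional "filters" plus 10 near-duplicates
# of the first 10 rows; the estimate should report at least 10.
rng = np.random.default_rng(0)
base = rng.normal(size=(64, 9))
dupes = base[:10] + 0.01 * rng.normal(size=(10, 9))
print(estimate_redundant_features(np.vstack([base, dupes])))
```

In practice the same routine could be applied layer by layer, with `weights` taken from each layer's learned parameters; the threshold then controls how aggressively near-parallel features are counted as redundant.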

