An exploration of parameter redundancy in deep networks with circulant projections

02/11/2015
by   Yu Cheng, et al.

We explore the redundancy of parameters in deep neural networks by replacing the conventional linear projection in fully-connected layers with a circulant projection. The circulant structure substantially reduces the memory footprint and enables the use of the Fast Fourier Transform (FFT) to speed up the computation. For a fully-connected neural network layer with d input nodes and d output nodes, this method improves the time complexity from O(d^2) to O(d log d) and the space complexity from O(d^2) to O(d). The space savings are particularly important for modern deep convolutional neural network architectures, where fully-connected layers typically contain more than 90% of the network parameters. We further show that the gradient computation and optimization of the circulant projections can be performed very efficiently. Our experiments on three standard datasets show that the proposed approach achieves these significant gains in storage and efficiency with minimal increase in error rate compared to neural networks with unstructured projections.
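The core trick behind the claimed complexity gains can be illustrated with a short sketch (not the authors' code): for a circulant weight matrix, which is fully determined by its first column, the matrix-vector product is a circular convolution and can therefore be computed with the FFT in O(d log d) time while storing only a d-vector.

```python
import numpy as np

def circulant_project(c, x):
    """Apply the circulant matrix whose first column is c to the vector x.

    A circulant matrix C satisfies C[i, j] = c[(i - j) % d], so C @ x is a
    circular convolution of c and x. By the convolution theorem this costs
    O(d log d) time via the FFT, and only c (O(d) space) must be stored.
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Sanity check against the explicitly materialized d x d circulant matrix.
d = 8
rng = np.random.default_rng(0)
c = rng.standard_normal(d)
x = rng.standard_normal(d)
C = np.stack([np.roll(c, j) for j in range(d)], axis=1)  # column j = roll(c, j)
print(np.allclose(C @ x, circulant_project(c, x)))  # prints True
```

The gradient with respect to c is itself a circular correlation, so it can be computed with the same FFT machinery; the sketch above covers only the forward projection.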


Related research

- Parameter Efficient Deep Neural Networks with Bilinear Projections (11/03/2020): Recent research on deep neural networks (DNNs) has primarily focused on ...
- Deep Fried Convnets (12/22/2014): The fully connected layers of a deep convolutional neural network typica...
- Deep Neural Network Approximation using Tensor Sketching (10/21/2017): Deep neural networks are powerful learning models that achieve state-of-...
- Redundancy in Deep Linear Neural Networks (06/09/2022): Conventional wisdom states that deep linear neural networks benefit from...
- Path homologies of deep feedforward networks (10/16/2019): We provide a characterization of two types of directed homology for full...
- VPNet: Variable Projection Networks (06/28/2020): In this paper, we introduce VPNet, a novel model-driven neural network a...
- Spatially-Coupled Neural Network Architectures (07/03/2019): In this work, we leverage advances in sparse coding techniques to reduce...
