ACDC: A Structured Efficient Linear Layer

11/18/2015
by Marcin Moczulski, et al.

The linear layer is one of the most pervasive modules in deep learning representations. However, it requires O(N^2) parameters and O(N^2) operations. These costs can be prohibitive in mobile applications or prevent scaling in many domains. Here, we introduce a deep, differentiable, fully-connected neural network module composed of diagonal matrices of parameters, A and D, and the discrete cosine transform C. The core module, structured as ACDC^-1, has O(N) parameters and incurs O(N log N) operations. We present theoretical results showing how deep cascades of ACDC layers approximate linear layers. ACDC is, however, a stand-alone module and can be used in combination with any other types of module. In our experiments, we show that it can indeed be successfully interleaved with ReLU modules in convolutional neural networks for image recognition. Our experiments also study critical factors in the training of these structured modules, including initialization and depth. Finally, this paper also provides a connection between structured linear transforms used in deep learning and the field of Fourier optics, illustrating how ACDC could in principle be implemented with lenses and diffractive elements.
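The structure described above can be sketched in a few lines: two learnable diagonal vectors (A and D) plus a fast DCT and its inverse, giving O(N) parameters and O(N log N) work per application. The following is a minimal NumPy/SciPy sketch under stated assumptions: the function name `acdc_layer` is illustrative, and SciPy's orthonormal type-II DCT is used as one concrete choice for C, not necessarily the exact transform in the paper.

```python
import numpy as np
from scipy.fft import dct, idct


def acdc_layer(x, a, d):
    """Apply one ACDC^-1 structured layer: y = A C D C^-1 x.

    x : input vector of length N
    a, d : length-N parameter vectors (the diagonals of A and D)

    Only 2N parameters are stored, and the DCTs cost O(N log N),
    versus N^2 parameters and operations for a dense linear layer.
    """
    z = idct(x, norm="ortho")  # C^-1 x  (inverse DCT, applied first)
    z = d * z                  # D z     (elementwise diagonal scaling)
    z = dct(z, norm="ortho")   # C z     (forward DCT)
    return a * z               # A z     (elementwise diagonal scaling)
```

One useful sanity check: with a = d = 1 the layer reduces to the identity, since C C^-1 = I; training then only has to move the diagonals away from this benign starting point, which relates to the initialization questions studied in the paper.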


