Learning Compressed Transforms with Low Displacement Rank

10/04/2018
by   Anna T. Thomas, et al.

The low displacement rank (LDR) framework for structured matrices represents a matrix through two displacement operators and a low-rank residual. Existing use of LDR matrices in deep learning has applied fixed displacement operators encoding forms of shift invariance akin to convolutions. We introduce a rich class of LDR matrices with more general displacement operators, and explicitly learn over both the operators and the low-rank component. This class generalizes several previous constructions while preserving compression and efficient computation. We prove bounds on the VC dimension of multi-layer neural networks with structured weight matrices and show empirically that our compact parameterization can reduce the sample complexity of learning. When replacing weight layers in fully-connected, convolutional, and recurrent neural networks for image classification and language modeling tasks, our new classes exceed the accuracy of existing compression approaches, and on some tasks even outperform general unstructured layers while using more than 20X fewer parameters.
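To make the LDR framework concrete: a matrix W has low displacement rank with respect to operators (A, B) when the Sylvester displacement AW - WB has low rank, so W can be stored via the operators plus a rank-r residual factorization. A minimal sketch (not the paper's implementation) is the classical example the abstract alludes to: a Toeplitz matrix has displacement rank at most 2 under the fixed cyclic shift operators Z_1 and Z_{-1}; the paper's contribution is to learn more general operators rather than fixing them.

```python
import numpy as np

def shift_matrix(n, f):
    """Cyclic shift operator Z_f: ones on the subdiagonal, f in the top-right corner."""
    Z = np.diag(np.ones(n - 1), -1)
    Z[0, -1] = f
    return Z

n = 6
# A Toeplitz matrix is defined by 2n - 1 free parameters t_{-(n-1)}, ..., t_{n-1}.
t = np.random.randn(2 * n - 1)
T = np.array([[t[i - j + n - 1] for j in range(n)] for i in range(n)])

# Sylvester displacement L(T) = Z_1 T - T Z_{-1}.
# For any Toeplitz T this residual is supported on one row and one column,
# so its rank is at most 2 -- the "low-rank residual" the abstract refers to.
A, B = shift_matrix(n, 1.0), shift_matrix(n, -1.0)
residual = A @ T - T @ B
print(np.linalg.matrix_rank(residual))  # at most 2, vs. full rank n for generic W
```

Storing the rank-2 factorization of the residual (plus the operators) needs O(n) parameters instead of the n^2 of a dense layer, which is the source of the compression ratios reported above.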


Related research

03/01/2017
Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank
Recently low displacement rank (LDR) matrices, or so-called structured m...

10/06/2015
Structured Transforms for Small-Footprint Deep Learning
We consider the task of building compact deep learning pipelines suitabl...

12/29/2020
Kaleidoscope: An Efficient, Learnable Representation For All Structured Linear Maps
Modern neural network architectures use structured linear transformation...

03/07/2021
Spectral Tensor Train Parameterization of Deep Learning Layers
We study low-rank parameterizations of weight matrices with embedded spe...

12/31/2015
Exploiting Local Structures with the Kronecker Layer in Convolutional Networks
In this paper, we propose and study a technique to reduce the number of ...

12/31/2021
Training and Generating Neural Networks in Compressed Weight Space
The inputs and/or outputs of some neural nets are weight matrices of oth...

10/25/2019
LPRNet: Lightweight Deep Network by Low-rank Pointwise Residual Convolution
Deep learning has become popular in recent years primarily due to the po...
