Separable Layers Enable Structured Efficient Linear Substitutions

06/03/2019
by Gavin Gray, et al.

In response to recent work on efficient dense layers, this paper shows that simply replacing the linear component of pointwise convolutions with structured linear decompositions also yields substantial gains in the efficiency/accuracy tradeoff. A pointwise (1x1) convolution is a fully connected layer applied at every spatial location, which makes it a natural candidate for replacement by a structured transform. Networks using such layers learn the same tasks as those using standard convolutions, and provide Pareto-optimal efficiency/accuracy benefits both in computation (mult-adds) and in parameter count (and hence memory). Code is available at https://github.com/BayesWatch/deficient-efficient.
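To make the substitution concrete, the sketch below treats a pointwise convolution as a dense channel-mixing matrix applied at every spatial position and swaps it for one simple structured decomposition, a low-rank factorization. This is an illustrative example only; the paper evaluates several structured transforms, and the shapes and rank here are arbitrary choices for demonstration.

```python
import numpy as np

# A pointwise (1x1) convolution applies one dense matrix W (c_out x c_in)
# to the channel vector at every spatial location.  Substituting a
# structured decomposition -- here W ~= U @ V with a small inner rank --
# cuts both parameters and mult-adds.  (Hedged sketch: shapes and rank
# are illustrative, not taken from the paper.)
c_in, c_out, rank = 64, 128, 8
h = w = 4

rng = np.random.default_rng(0)
x = rng.standard_normal((c_in, h, w))   # one feature map
U = rng.standard_normal((c_out, rank))
V = rng.standard_normal((rank, c_in))

def pointwise(weight, feats):
    # Mix channels with `weight` at every spatial position (a 1x1 conv).
    return np.einsum('oc,chw->ohw', weight, feats)

# Dense pointwise conv vs. its low-rank substitute: by associativity,
# (U @ V) x == U (V x), so the outputs match exactly.
y_dense   = pointwise(U @ V, x)
y_lowrank = pointwise(U, pointwise(V, x))
assert np.allclose(y_dense, y_lowrank)

dense_params   = c_in * c_out            # 8192
lowrank_params = rank * (c_in + c_out)   # 1536
print(dense_params, lowrank_params)
```

In a real network the factors U and V are learned directly rather than obtained by factorizing a trained dense weight, so the low-rank layer is a constraint on the hypothesis space, not an approximation of an existing matrix.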


Related research

06/17/2020  Optimizing Grouped Convolutions on Edge Devices
When deploying a deep neural network on constrained hardware, it is poss...

03/25/2022  Deformable Butterfly: A Highly Structured and Sparse Linear Transform
We introduce a new kind of linear transform named Deformable Butterfly (...

06/09/2021  Exploiting Learned Symmetries in Group Equivariant Convolutions
Group Equivariant Convolutions (GConvs) enable convolutional neural netw...

04/14/2021  Orthogonalizing Convolutional Layers with the Cayley Transform
Recent work has highlighted several advantages of enforcing orthogonalit...

12/13/2017  Rethinking Spatiotemporal Feature Learning For Video Understanding
In this paper we study 3D convolutional networks for video understanding...

03/31/2021  Compressing 1D Time-Channel Separable Convolutions using Sparse Random Ternary Matrices
We demonstrate that 1x1-convolutions in 1D time-channel separable convol...

12/04/2019  L3 Fusion: Fast Transformed Convolutions on CPUs
Fast convolutions via transforms, either Winograd or FFT, had emerged as...
