Learning Fast Algorithms for Linear Transforms Using Butterfly Factorizations

03/14/2019 · by Tri Dao, et al.

Fast linear transforms are ubiquitous in machine learning, including the discrete Fourier transform, discrete cosine transform, and other structured transformations such as convolutions. All of these transforms can be represented by dense matrix-vector multiplication, yet each has a specialized and highly efficient (subquadratic) algorithm. We ask to what extent hand-crafting these algorithms and implementations is necessary, what structural priors they encode, and how much knowledge is required to automatically learn a fast algorithm for a provided structured transform. Motivated by a characterization of fast matrix-vector multiplication as products of sparse matrices, we introduce a parameterization of divide-and-conquer methods that is capable of representing a large class of transforms. This generic formulation can automatically learn an efficient algorithm for many important transforms; for example, it recovers the O(N log N) Cooley-Tukey FFT algorithm to machine precision, for dimensions N up to 1024. Furthermore, our method can be incorporated as a lightweight replacement of generic matrices in machine learning pipelines to learn efficient and compressible transformations. On a standard task of compressing a single hidden-layer network, our method exceeds the classification accuracy of unconstrained matrices on CIFAR-10 by 3.9 points---the first time a structured approach has done so---with 4X faster inference speed and 40X fewer parameters.
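The abstract's sparse-product characterization can be made concrete: the Cooley-Tukey FFT is exactly a product of log2(N) block-diagonal butterfly factors applied to a bit-reversed input, which is the O(N log N) structure the method recovers. The following NumPy sketch builds that factorization and checks it against np.fft.fft; it is an illustration of the target structure, not the paper's code, and all function and variable names in it are ours.

    import numpy as np

    def butterfly_factor(n, num_blocks):
        # One radix-2 decimation-in-time butterfly of size n, [[I, D], [I, -D]],
        # where D is the diagonal of twiddle factors exp(-2*pi*i*k/n).
        half = n // 2
        D = np.diag(np.exp(-2j * np.pi * np.arange(half) / n))
        I = np.eye(half)
        block = np.block([[I, D], [I, -D]])
        # num_blocks copies along the diagonal: a sparse factor with 2 nonzeros per row.
        return np.kron(np.eye(num_blocks), block)

    def bit_reversal(N):
        # Permutation matrix R with (R x)_i = x_{bitrev(i)}.
        bits = N.bit_length() - 1
        rev = [int(format(i, f"0{bits}b")[::-1], 2) for i in range(N)]
        R = np.zeros((N, N), dtype=complex)
        R[np.arange(N), rev] = 1.0
        return R

    def dft_as_butterfly_product(N):
        # F_N = B_N (I_2 ⊗ B_{N/2}) ... (I_{N/2} ⊗ B_2) R_N:
        # log2(N) sparse factors with 2N nonzeros each, so a matvec
        # through the factors costs O(N log N) instead of O(N^2).
        F = bit_reversal(N)
        n = 2
        while n <= N:
            F = butterfly_factor(n, N // n) @ F
            n *= 2
        return F

    N = 16
    x = np.random.randn(N) + 1j * np.random.randn(N)
    assert np.allclose(dft_as_butterfly_product(N) @ x, np.fft.fft(x))

In the paper's setting, the entries of each butterfly factor (and the permutation) are learned from data rather than fixed to these twiddle factors; recovering the factorization above to machine precision is what "recovers the O(N log N) Cooley-Tukey FFT algorithm" refers to.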


Related research

∙ 12/29/2020 · Kaleidoscope: An Efficient, Learnable Representation For All Structured Linear Maps
  Modern neural network architectures use structured linear transformation...

∙ 04/24/2020 · An extra-components method for evaluating fast matrix-vector multiplication with special functions
  In calculating integral or discrete transforms, fast algorithms for mult...

∙ 12/09/2018 · Learning Multiplication-free Linear Transformations
  In this paper, we propose several dictionary learning algorithms for spa...

∙ 10/06/2021 · Use of Deterministic Transforms to Design Weight Matrices of a Neural Network
  Self size-estimating feedforward network (SSFN) is a feedforward multila...

∙ 11/19/2018 · Approximate Eigenvalue Decompositions of Linear Transformations with a Few Householder Reflectors
  The ability to decompose a signal in an orthonormal basis (a set of orth...

∙ 07/17/2020 · Sparse Linear Networks with a Fixed Butterfly Structure: Theory and Practice
  Fast Fourier transform, Wavelets, and other well-known transforms in sig...

∙ 09/11/2019 · Faster Johnson-Lindenstrauss Transforms via Kronecker Products
  The Kronecker product is an important matrix operation with a wide range...
