Butterfly Transform: An Efficient FFT Based Neural Architecture Design

06/05/2019
by Keivan Alizadeh, et al.

In this paper, we introduce the Butterfly Transform (BFT), a lightweight channel fusion method that reduces the computational complexity of point-wise convolutions from O(n^2) for conventional solutions to O(n log n) with respect to the number of channels, while improving the accuracy of the networks within the same range of FLOPs. The proposed BFT generalizes the Discrete Fourier Transform in a way that its parameters are learned at training time. Our experimental evaluations show that replacing channel fusion modules with BFT results in significant accuracy gains at similar FLOPs across a wide range of network architectures. For example, replacing channel fusion convolutions with BFT offers a 3% absolute top-1 accuracy improvement on ShuffleNet V2-0.5 while maintaining the same number of FLOPs. Notably, ShuffleNet-V2+BFT outperforms the state-of-the-art architecture search methods MNasNet (Tan et al., 2018) and FBNet (Wu et al., 2018). We also show that the structure imposed by BFT has interesting properties that ensure the efficacy of the resulting network.
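To make the O(n log n) claim concrete, here is a minimal NumPy sketch of a radix-2 butterfly network over a channel vector. This is an illustrative reconstruction, not the paper's implementation: each of the log2(n) stages applies a learned 2x2 mixing matrix to n/2 channel pairs (the `weights` layout and function name are assumptions), so the total cost is 2n log2(n) multiply-adds instead of the n^2 of a dense point-wise convolution.

```python
import numpy as np

def butterfly_transform(x, weights):
    """Apply a learned radix-2 butterfly network to a channel vector.

    x:       (n,) array, n a power of two (the channel dimension).
    weights: list of log2(n) arrays of shape (n//2, 2, 2), one learned
             2x2 mixing matrix per butterfly pair per stage.

    Cost is n/2 butterflies * log2(n) stages * 4 multiplies each,
    i.e. O(n log n), versus O(n^2) for a dense channel fusion.
    """
    n = x.shape[0]
    stages = int(np.log2(n))
    assert len(weights) == stages
    y = x.astype(float).copy()
    for s in range(stages):
        stride = 1 << s          # pair distance doubles each stage
        out = np.empty_like(y)
        pair = 0
        for block in range(0, n, 2 * stride):
            for i in range(block, block + stride):
                a, b = y[i], y[i + stride]
                w = weights[s][pair]
                out[i] = w[0, 0] * a + w[0, 1] * b
                out[i + stride] = w[1, 0] * a + w[1, 1] * b
                pair += 1
        y = out
    return y
```

With every 2x2 block fixed to [[1, 1], [1, -1]] this network computes a Walsh-Hadamard transform, and with DFT twiddle factors it recovers the FFT; learning the 2x2 blocks instead is what generalizes the Fourier transform while keeping its O(n log n) connectivity.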
