Speeding-up Convolutional Neural Networks Using Fine-tuned CP-Decomposition

12/19/2014
by   Vadim Lebedev, et al.

We propose a simple two-step approach for speeding up convolution layers within large convolutional neural networks, based on tensor decomposition and discriminative fine-tuning. Given a layer, we use non-linear least squares to compute a low-rank CP-decomposition of the 4D convolution kernel tensor into a sum of a small number of rank-one tensors. In the second step, this decomposition is used to replace the original convolutional layer with a sequence of four convolutional layers with small kernels. After the replacement, the entire network is fine-tuned on the training data using standard backpropagation. We evaluate this approach on two CNNs and show that it is competitive with previous approaches, achieving higher CPU speedups with smaller accuracy drops for the smaller of the two networks. Thus, for the 36-class character classification CNN, our approach obtains an 8.5x CPU speedup of the whole network with only a minor accuracy drop (1%). For the standard ImageNet architecture (AlexNet), the approach speeds up the second convolution layer by a factor of 4x at the cost of a 1% increase in the overall top-5 classification error.
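The replacement described above can be sketched numerically. A rank-R CP decomposition factors the 4D kernel K[t,s,y,x] into four factor matrices, one per mode; convolving with the full kernel is then exactly equivalent to four cheap convolutions in sequence: a 1x1 channel-mixing conv, a d x 1 and a 1 x d per-component (grouped) conv, and a final 1x1 conv. The sketch below uses random factors purely for illustration (in the paper the factors are fitted to the trained kernel by non-linear least squares); all sizes and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
S, T, d, R, H, W = 3, 5, 3, 4, 8, 8  # in/out channels, kernel size, CP rank, image size

# Illustrative CP factors (in practice fitted to the trained kernel by NLS)
Kt = rng.standard_normal((T, R))  # output-channel factor
Ks = rng.standard_normal((S, R))  # input-channel factor
Ky = rng.standard_normal((d, R))  # vertical spatial factor
Kx = rng.standard_normal((d, R))  # horizontal spatial factor

# Reassemble the full 4D kernel: K[t,s,y,x] = sum_r Kt[t,r] Ks[s,r] Ky[y,r] Kx[x,r]
K = np.einsum('tr,sr,yr,xr->tsyx', Kt, Ks, Ky, Kx)

U = rng.standard_normal((S, H, W))  # input feature maps

def conv_full(U, K):
    """Valid cross-correlation with the full T x S x d x d kernel."""
    T, S, d, _ = K.shape
    Ho, Wo = U.shape[1] - d + 1, U.shape[2] - d + 1
    V = np.zeros((T, Ho, Wo))
    for j in range(Ho):
        for i in range(Wo):
            V[:, j, i] = np.einsum('tsyx,syx->t', K, U[:, j:j+d, i:i+d])
    return V

def conv_cp(U, Kt, Ks, Ky, Kx):
    """The same map as four small convolutions."""
    d, R = Ky.shape[0], Ks.shape[1]
    # 1) 1x1 conv: project S input channels onto R components
    U1 = np.einsum('sr,sji->rji', Ks, U)
    # 2) d x 1 conv, applied independently per component (a grouped conv)
    Ho = U1.shape[1] - d + 1
    U2 = np.zeros((R, Ho, U1.shape[2]))
    for y in range(d):
        U2 += Ky[y][:, None, None] * U1[:, y:y+Ho, :]
    # 3) 1 x d conv, again per component
    Wo = U2.shape[2] - d + 1
    U3 = np.zeros((R, Ho, Wo))
    for x in range(d):
        U3 += Kx[x][:, None, None] * U2[:, :, x:x+Wo]
    # 4) 1x1 conv: map R components to T output channels
    return np.einsum('tr,rji->tji', Kt, U3)

# The two computations agree exactly when K is built from the CP factors
assert np.allclose(conv_full(U, K), conv_cp(U, Kt, Ks, Ky, Kx))
```

The cost drops from O(T·S·d²) multiplications per output pixel to O(R·(S + 2d + T)); when an approximate low-rank decomposition of the real kernel is used instead of an exact one, fine-tuning recovers the lost accuracy.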

