Rank-1 Convolutional Neural Network

08/13/2018
by Hyein Kim, et al.

In this paper, we propose a convolutional neural network (CNN) with 3-D rank-1 filters, each formed as the outer product of 1-D filters. After training, the 3-D rank-1 filters can be decomposed into 1-D filters at test time for fast inference. We train 3-D rank-1 filters in the training stage, rather than consecutive 1-D filters, because this setting yields a better gradient flow, making training possible even in cases where a network with consecutive 1-D filters cannot be trained. In every epoch, the 3-D rank-1 filters are updated both by the gradient flow and by the outer product of the 1-D filters: the gradient step seeks a solution that minimizes the loss function, while the outer-product operation projects the filter parameters back onto a rank-1 subspace. Furthermore, we show that convolution with the rank-1 filters produces low-rank outputs, constraining the final output of the CNN to lie on a low-dimensional subspace as well.
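The core idea above can be sketched numerically: a 3-D rank-1 filter is the outer product of three 1-D filters, and convolving with it is equivalent to applying the 1-D filters sequentially (a channel contraction, then a vertical and a horizontal 1-D convolution). The sizes, variable names, and the plain `valid`-mode convolution below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed, not from the paper)
C, kH, kW = 3, 3, 3      # filter: channels x height x width
H, W = 8, 8              # input spatial size

# 1-D filters whose outer product forms the 3-D rank-1 filter
u = rng.standard_normal(C)    # channel direction
v = rng.standard_normal(kH)   # vertical direction
w = rng.standard_normal(kW)   # horizontal direction

# 3-D rank-1 filter: F[c, p, q] = u[c] * v[p] * w[q]
F = np.einsum('c,p,q->cpq', u, v, w)

X = rng.standard_normal((C, H, W))

# Direct 3-D correlation ('valid' mode, one output channel)
oH, oW = H - kH + 1, W - kW + 1
out_full = np.empty((oH, oW))
for i in range(oH):
    for j in range(oW):
        out_full[i, j] = np.sum(F * X[:, i:i + kH, j:j + kW])

# Equivalent decomposed path usable at test time:
# 1) contract the channel dimension with u,
# 2) correlate each column with v (np.convolve with a
#    reversed kernel performs cross-correlation),
# 3) correlate each row with w.
s = np.einsum('c,chw->hw', u, X)                                   # (H, W)
t = np.stack([np.convolve(s[:, j], v[::-1], mode='valid')
              for j in range(W)], axis=1)                          # (oH, W)
out_sep = np.stack([np.convolve(t[i, :], w[::-1], mode='valid')
                    for i in range(oH)], axis=0)                   # (oH, oW)

print(np.allclose(out_full, out_sep))  # prints: True
```

Flattening the filter to a `(C, kH*kW)` matrix makes its rank-1 structure explicit: `np.linalg.matrix_rank(F.reshape(C, -1))` is 1, which is what allows the exact decomposition into the three 1-D passes.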

