BT-Nets: Simplifying Deep Neural Networks via Block Term Decomposition

12/15/2017
by   Guangxi Li, et al.

Recently, deep neural networks (DNNs) have been regarded as state-of-the-art classification methods in a wide range of applications, especially image classification. Despite this success, their huge number of parameters hinders deployment in settings with limited computing resources. Researchers therefore exploit the redundancy in DNN weights and attempt to find representations with fewer parameters that preserve accuracy. Although several promising results have been shown along this line of research, most existing methods either fail to significantly compress a well-trained deep network or require a heavy fine-tuning process for the compressed network to regain its original performance. In this paper, we propose Block Term networks (BT-nets), in which the commonly used fully-connected layers (FC-layers) are replaced with block term layers (BT-layers). In a BT-layer, the inputs and outputs are reshaped into two low-dimensional high-order tensors, and block term decomposition is then applied as a tensor operator to connect them. We conduct extensive experiments on benchmark datasets to demonstrate that BT-layers achieve a very large compression ratio in the number of parameters while preserving the representation power of the original FC-layers as much as possible. Specifically, we obtain higher performance with fewer parameters compared with the tensor train method.
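To make the BT-layer idea concrete, here is a minimal NumPy sketch of a forward pass for a two-mode BT-layer. The weight tensor is stored as a sum of N Tucker terms (a block term decomposition) rather than as a dense FC weight matrix. All shapes, rank settings, and variable names (`cores`, `A1`, `A2`, `bt_layer`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: a 64-dim input reshaped to 8x8, a 16-dim output
# reshaped to 4x4, N = 2 blocks, Tucker ranks (R1, R2) = (2, 2).
I1, I2, J1, J2, N, R1, R2 = 8, 8, 4, 4, 2, 2, 2

# Block-term factors: each block is one Tucker term of the weight tensor.
cores = rng.standard_normal((N, R1, R2))
A1 = rng.standard_normal((N, I1, J1, R1))  # mode-1 factor matrices
A2 = rng.standard_normal((N, I2, J2, R2))  # mode-2 factor matrices

def bt_layer(x):
    """Forward pass of a 2-mode BT-layer: contract input x with W in BT format."""
    # Reconstruct W (I1, J1, I2, J2) as a sum over blocks of Tucker terms,
    # then contract the input modes (i, k) against it.
    W = np.einsum('nrs,nijr,nkls->ijkl', cores, A1, A2)
    return np.einsum('ik,ijkl->jl', x, W)

x = rng.standard_normal((I1, I2))  # a 64-dim input reshaped to 8x8
y = bt_layer(x)
print(y.shape)  # (4, 4)

# Parameter count versus a dense FC layer of the same input/output size:
fc_params = (I1 * I2) * (J1 * J2)            # 64 * 16 = 1024
bt_params = cores.size + A1.size + A2.size   # 8 + 128 + 128 = 264
print(fc_params, bt_params)
```

Even at these toy sizes the BT format stores roughly a quarter of the dense parameters; with more modes and larger dimensions the compression ratio grows much faster, which is the effect the paper exploits. (In practice one would contract `x` against the factors directly instead of materializing `W`.)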


Related research:

- Block-term Tensor Neural Networks (10/10/2020)
- Learning to Prune Deep Neural Networks via Layer-wise Optimal Brain Surgeon (05/22/2017)
- Data-Driven Low-Rank Neural Network Compression (07/13/2021)
- Wide Compression: Tensor Ring Nets (02/25/2018)
- Incremental Learning Through Deep Adaptation (05/11/2017)
- Deep Polynomial Neural Networks (06/20/2020)
- Entropy-based Guidance of Deep Neural Networks for Accelerated Convergence and Improved Performance (08/28/2023)
