Composite Binary Decomposition Networks

11/16/2018
by You Qiaoben, et al.

Binary neural networks offer great resource and computing efficiency, but they suffer from long training procedures and non-negligible accuracy drops compared to their full-precision counterparts. In this paper, we propose composite binary decomposition networks (CBDNet), which first compose the real-valued tensor of each layer from a limited number of binary tensors, and then decompose some conditioned binary tensors into two low-rank binary tensors, so that the number of parameters and operations is greatly reduced compared to the original network. Experiments demonstrate the effectiveness of the proposed method: CBDNet can approximate the image classification networks ResNet-18 using 5.25 bits, VGG-16 using 5.47 bits, and DenseNet-121 using 5.72 bits, the object detection network SSD300 using 4.38 bits, and the semantic segmentation network SegNet using 5.18 bits, all with only minor accuracy drops.
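
The composition step can be illustrated with a minimal sketch of greedy residual binarization in the style of multi-bit binary networks; this is an illustration under stated assumptions, not necessarily the authors' exact fitting procedure, and the helper name binary_compose and the 6-bit budget are purely illustrative. It approximates a real-valued weight tensor W as a weighted sum of binary tensors, W ≈ sum_j alpha_j * B_j with entries of each B_j in {-1, +1}.

```python
# Minimal sketch (assumption: greedy residual binarization, not necessarily
# the CBDNet fitting procedure): approximate a real-valued weight matrix W
# by a sum of scaled binary (+1/-1) tensors, W ≈ sum_j alpha_j * B_j.
import numpy as np

def binary_compose(W, num_bits=6):  # hypothetical helper name
    """Return per-bit scales and binary tensors approximating W."""
    residual = W.copy()
    scales, binaries = [], []
    for _ in range(num_bits):
        B = np.sign(residual)            # binary tensor in {-1, +1}
        B[B == 0] = 1                    # map exact zeros to +1
        alpha = np.abs(residual).mean()  # L2-optimal scale for B = sign(residual)
        scales.append(alpha)
        binaries.append(B)
        residual = residual - alpha * B  # subtract the explained part
    return scales, binaries

# Usage: approximate a random layer and check the reconstruction error.
W = np.random.randn(64, 64).astype(np.float32)
scales, binaries = binary_compose(W, num_bits=6)
W_hat = sum(a * B for a, B in zip(scales, binaries))
print("relative error:", np.linalg.norm(W - W_hat) / np.linalg.norm(W))
```

The second step described in the abstract, factoring a conditioned binary tensor into two low-rank binary tensors, would then be applied on top of such a composition to further cut parameters and operations.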

Related research:

- Back-and-Forth prediction for deep tensor compression (02/14/2020): Recent AI applications such as Collaborative Intelligence with neural ne...
- Near-Lossless Post-Training Quantization of Deep Neural Networks via a Piecewise Linear Approximation (01/31/2020): Quantization plays an important role for energy-efficient deployment of ...
- A Telescopic Binary Learning Machine for Training Neural Networks (09/01/2015): This paper proposes a new algorithm based on multi-scale stochastic loca...
- Zero-Truncated Poisson Tensor Factorization for Massive Binary Tensors (08/18/2015): We present a scalable Bayesian model for low-rank factorization of massi...
- Machine learning with tree tensor networks, CP rank constraints, and tensor dropout (05/30/2023): Tensor networks approximate order-N tensors with a reduced number of deg...
- Structured Binary Neural Networks for Image Recognition (09/22/2019): We propose methods to train convolutional neural networks (CNNs) with bo...
- ADA-Tucker: Compressing Deep Neural Networks via Adaptive Dimension Adjustment Tucker Decomposition (06/18/2019): Despite the recent success of deep learning models in numerous applicati...
