Demystifying and Generalizing BinaryConnect

by Tim Dockhorn, et al.

BinaryConnect (BC) and its many variations have become the de facto standard for neural network quantization. However, our understanding of the inner workings of BC is still quite limited. We attempt to close this gap in four different aspects: (a) we show that existing quantization algorithms, including post-training quantization, are surprisingly similar to each other; (b) we argue for proximal maps as a natural family of quantizers that is both easy to design and analyze; (c) we refine the observation that BC is a special case of dual averaging, which itself is a special case of the generalized conditional gradient algorithm; (d) consequently, we propose ProxConnect (PC) as a generalization of BC and we prove its convergence properties by exploiting the established connections. We conduct experiments on CIFAR-10 and ImageNet, and verify that PC achieves competitive performance.
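To make the setup concrete, the core idea of BinaryConnect can be sketched in a few lines: keep real-valued "latent" weights, quantize them (here with the sign function, which is a proximal map of the binary constraint set) for the forward pass, but apply the gradient update to the latent weights. The toy linear-regression loss and all variable names below are our own illustration, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a BinaryConnect-style update on a toy linear model.
# Illustrative only: the toy data, loss, and hyperparameters are assumptions.

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))            # toy inputs
y = X @ np.array([1., -1., 1., -1.])    # toy targets from a +/-1 weight vector

w = rng.normal(scale=0.1, size=4)       # real-valued latent weights
lr = 0.05

for _ in range(200):
    wb = np.sign(w)                     # quantizer: sign(), a proximal map
                                        # of the indicator of {-1, +1}^d
    pred = X @ wb                       # forward pass uses quantized weights
    grad = X.T @ (pred - y) / len(y)    # gradient taken at the quantized point
    w -= lr * grad                      # ...but applied to the latent weights

print(np.sign(w))                       # final quantized weights
```

The key asymmetry, which the paper's dual-averaging / generalized-conditional-gradient view explains, is that the quantizer appears only in the forward pass while the latent weights accumulate the (sub)gradients.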



