On Distributed Quantization for Classification

11/01/2019
by Osama A. Hanna, et al.

We consider the problem of distributed feature quantization, where the goal is to enable a pretrained classifier at a central node to carry out classification on features gathered from distributed nodes through communication-constrained channels. We propose distributed quantization schemes specifically tailored to the classification task: unlike quantization schemes that help the central node reconstruct the original signal as accurately as possible, our focus is not reconstruction accuracy but correct classification. Our work does not make any a priori distributional assumptions on the data; instead, it uses training data for the quantizer design. Our main contributions are: we prove NP-hardness of finding optimal quantizers in the general case; we design an optimal scheme for a special case; and we propose quantization algorithms that leverage discrete neural representations and training data, and that can be designed in polynomial time for any number of features, any number of classes, and arbitrary division of features across the distributed nodes. We find that tailoring the quantizers to the classification task can offer significant savings: compared to alternatives, we achieve more than a factor-of-two reduction in the number of bits communicated at the same classification accuracy.
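To make the distinction concrete, the following is a minimal sketch, not the paper's algorithm: each of two nodes sends 1 bit per feature, and we compare a reconstruction-oriented quantizer (threshold at the feature mean, Lloyd-style reproduction levels) against a classification-tailored one that grid-searches each node's threshold to maximize a fixed, "pretrained" classifier's accuracy on training data. The Gaussian toy data, the linear classifier, and the coordinate-wise grid search are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: two scalar features, each held at a different distributed node.
# The class-0 cloud is deliberately skewed so that the reconstruction-optimal
# split need not coincide with the split the classifier actually needs.
n = 500
X0 = rng.normal([-3.0, -3.0], [2.0, 2.0], size=(n, 2))
X1 = rng.normal([0.5, 0.5], [0.5, 0.5], size=(n, 2))
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

def classify(feats):
    # "Pretrained" central classifier: a fixed linear rule (assumption).
    return (feats.sum(axis=1) > -1.0).astype(int)

def apply_1bit(col, t):
    # 1-bit quantizer: reproduce each cell by its conditional mean.
    lo, hi = col[col < t].mean(), col[col >= t].mean()
    return np.where(col < t, lo, hi)

def quantize_all(X, thresholds):
    return np.column_stack([apply_1bit(X[:, j], thresholds[j])
                            for j in range(X.shape[1])])

def acc(thresholds):
    # Accuracy of the pretrained classifier on quantized training features.
    return (classify(quantize_all(X, thresholds)) == y).mean()

# Reconstruction-oriented design: threshold each feature at its mean,
# ignoring the downstream classification task.
t_recon = [X[:, j].mean() for j in range(2)]

# Classification-tailored design (simplified stand-in for the paper's
# training-data-driven approach): per node, grid-search the threshold that
# maximizes the classifier's training accuracy. Including t_recon in the
# grid guarantees the tailored design is never worse on training data.
t_task = list(t_recon)
for j in range(2):
    grid = np.append(np.linspace(X[:, j].min() + 1e-6,
                                 X[:, j].max() - 1e-6, 200), t_recon[j])
    t_task[j] = max(grid, key=lambda t: acc(t_task[:j] + [t] + t_task[j + 1:]))

print(f"reconstruction-oriented 1-bit accuracy: {acc(t_recon):.3f}")
print(f"classification-tailored 1-bit accuracy: {acc(t_task):.3f}")
```

At the same 1-bit-per-feature budget, the two designs transmit identical amounts but can differ in accuracy, which is the trade-off the abstract's factor-of-two claim quantifies.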


