
Distributed Principal Component Analysis with Limited Communication

by Foivos Alimisis, et al.

We study efficient distributed algorithms for the fundamental problem of principal component analysis and leading eigenvector computation on the sphere, when the data are randomly distributed among a set of computational nodes. We propose a new quantized variant of Riemannian gradient descent to solve this problem, and prove that the algorithm converges with high probability under a set of necessary spherical-convexity properties. We give bounds on the number of bits transmitted by the algorithm under common initialization schemes, and investigate the dependency on the problem dimension in each case.
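The algorithm described in the abstract can be sketched roughly as follows: each node holds a local covariance matrix, computes the Riemannian gradient of the Rayleigh quotient on the sphere, quantizes it before transmission, and a coordinator averages the quantized gradients and takes a retracted gradient step. The uniform quantizer, step size, and simple averaging used here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def quantize(v, levels=16):
    # Illustrative uniform quantizer: a stand-in for the paper's
    # communication-efficient encoding scheme.
    scale = np.max(np.abs(v)) + 1e-12
    return np.round(v / scale * levels) / levels * scale

def distributed_quantized_rgd(local_covs, steps=300, lr=0.05, seed=0):
    """Estimate the leading eigenvector of the average of local
    covariance matrices via quantized Riemannian gradient descent
    on the unit sphere (a sketch, not the paper's algorithm)."""
    rng = np.random.default_rng(seed)
    d = local_covs[0].shape[0]
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)            # random initialization on the sphere
    for _ in range(steps):
        grads = []
        for A in local_covs:
            g = A @ x
            g = g - (x @ g) * x       # project onto the tangent space at x
            grads.append(quantize(g)) # each node transmits a quantized gradient
        g = np.mean(grads, axis=0)    # coordinator averages the messages
        x = x + lr * g                # ascent step on the Rayleigh quotient
        x /= np.linalg.norm(x)        # retraction back to the sphere
    return x
```

On well-conditioned synthetic data with a dominant direction, the iterate aligns with the leading eigenvector despite the quantization, since the quantization error is relative to the (shrinking) gradient magnitude.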



