
Distributed Principal Component Analysis with Limited Communication

10/27/2021
by Foivos Alimisis, et al.

We study efficient distributed algorithms for the fundamental problem of principal component analysis and leading eigenvector computation on the sphere, when the data are randomly distributed among a set of computational nodes. We propose a new quantized variant of Riemannian gradient descent to solve this problem, and prove that the algorithm converges with high probability under a set of necessary spherical-convexity properties. We give bounds on the number of bits transmitted by the algorithm under common initialization schemes, and investigate the dependency on the problem dimension in each case.
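As a rough sketch of the setup described above (not the paper's actual algorithm, step sizes, or quantization scheme), a quantized Riemannian gradient method for the leading eigenvector on the sphere can be illustrated as follows: each node projects its local Euclidean gradient onto the tangent space of the sphere, quantizes it before "transmitting", and a coordinator sums the messages, takes a step, and retracts back to the sphere. The quantizer and node layout here are illustrative assumptions.

```python
import numpy as np

def quantize(v, levels=256):
    # Uniform scalar quantizer: an illustrative stand-in for the
    # paper's compression scheme, not the scheme it analyzes.
    scale = np.max(np.abs(v)) + 1e-12
    half = levels // 2
    return np.round(v / scale * half) / half * scale

def distributed_leading_eigvec(local_covs, steps=300, eta=0.2, seed=0):
    # Quantized Riemannian gradient ascent on the unit sphere for the
    # leading eigenvector of sum(local_covs).
    n = local_covs[0].shape[0]
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(steps):
        msgs = []
        for A_i in local_covs:
            g = A_i @ x                # Euclidean gradient of x^T A_i x / 2
            rg = g - (x @ g) * x       # project onto tangent space at x
            msgs.append(quantize(rg))  # limited-communication message
        x = x + eta * sum(msgs)        # coordinator aggregates and steps
        x /= np.linalg.norm(x)         # retract back to the sphere
    return x

# Demo: data split across 4 nodes, with the first coordinate inflated
# so the leading eigenvector is well separated from the rest.
rng = np.random.default_rng(1)
X = rng.standard_normal((400, 20))
X[:, 0] *= 3.0
local_covs = [Xi.T @ Xi / X.shape[0] for Xi in np.array_split(X, 4)]
x = distributed_leading_eigvec(local_covs)
v1 = np.linalg.eigh(sum(local_covs))[1][:, -1]
```

Because the quantizer's scale adapts to the message, the quantization error shrinks along with the gradient, so the iterate can still converge to high accuracy; the paper's contribution is precisely in quantifying how many bits such messages need under spherical-convexity assumptions.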

