Distributed Learning in the Non-Convex World: From Batch to Streaming Data, and Beyond

01/14/2020
by Tsung-Hui Chang, et al.

Distributed learning has become a critical enabler of the massively connected world envisioned by many. This article discusses four key elements of scalable distributed processing and real-time intelligence: problems, data, communication, and computation. Our aim is to provide a fresh and unique perspective on how these elements should work together in an effective and coherent manner. In particular, we provide a selective review of recent techniques for optimizing non-convex models (i.e., problem classes) and processing batch and streaming data (i.e., data types) over networks in a distributed manner (i.e., communication and computation paradigm). We describe the intuitions and connections behind a core set of popular distributed algorithms, emphasizing how to trade off computation and communication costs. Practical issues and future research directions are also discussed.
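
As a concrete illustration of the computation-communication trade-off the abstract refers to, below is a minimal sketch of decentralized gradient descent with local updates over a ring network, written in plain NumPy. The toy non-convex objective, the Metropolis-style mixing weights, and the `local_steps` knob are illustrative assumptions for this sketch, not the specific algorithms reviewed in the article.

```python
# Illustrative sketch only: decentralized gradient descent (DGD) with local
# steps over a ring network. The objective and parameters are made up here
# to show the pattern, not taken from the article.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 8, 5

# Each node i privately holds a smooth but non-convex local loss:
#   f_i(x) = ||A_i x - b_i||^2 + sin(sum(x))
A = rng.normal(size=(n_nodes, 10, dim))
b = rng.normal(size=(n_nodes, 10))

def grad(i, x):
    """Gradient of node i's local loss at x."""
    return 2 * A[i].T @ (A[i] @ x - b[i]) + np.cos(x.sum()) * np.ones(dim)

# Doubly stochastic mixing matrix for a ring graph (Metropolis-style
# weights): each node averages with its two neighbours.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

X = rng.normal(size=(n_nodes, dim))  # one local iterate per node
lr = 0.005
local_steps = 2  # more local computation per round = fewer communications

for t in range(300):
    # Computation: each node runs `local_steps` gradient steps on its own loss.
    for _ in range(local_steps):
        X = X - lr * np.stack([grad(i, X[i]) for i in range(n_nodes)])
    # Communication: one round of neighbour averaging (gossip).
    X = W @ X

consensus_error = np.linalg.norm(X - X.mean(axis=0))
print(f"consensus error after training: {consensus_error:.3e}")
```

Raising `local_steps` spends more local computation per communication round; tuning exactly this kind of lever is how many distributed algorithms balance the two costs.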

Related research

05/18/2020 · Scaling-up Distributed Processing of Data Streams for Machine Learning
Emerging applications of machine learning in numerous areas involve cont...

03/22/2016 · Information Processing by Nonlinear Phase Dynamics in Locally Connected Arrays
Research toward powerful information processing systems that circumvent ...

05/22/2020 · FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data
Federated Learning (FL) has become a popular paradigm for learning from ...

01/26/2023 · Towards 6G Hyper-Connectivity: Vision, Challenges, and Key Enabling Technologies
Technology forecasts anticipate a new era in which massive numbers of hu...

01/18/2021 · DFOGraph: An I/O- and Communication-Efficient System for Distributed Fully-out-of-Core Graph Processing
With the magnitude of graph-structured data continually increasing, grap...

07/31/2022 · Online Decentralized Frank-Wolfe: From theoretical bound to applications in smart-building
The design of decentralized learning algorithms is important in the fast...

03/03/2021 · Critical Parameters for Scalable Distributed Learning with Large Batches and Asynchronous Updates
It has been experimentally observed that the efficiency of distributed t...
