Communication-efficient Distributed Newton-like Optimization with Gradients and M-estimators

07/13/2022
by Ziyan Yin, et al.

In modern data science, large-scale data are commonly stored and processed in parallel across a large number of locations. For reasons including confidentiality, only limited information from each center may be transferred. To address this constraint, a growing body of communication-efficient methods has been developed. We propose two communication-efficient Newton-type algorithms that combine the M-estimator and the gradient collected from each data center. Both algorithms construct a global estimator of the Fisher information from these communication-efficient statistics. Enjoying a faster rate of convergence, this framework improves upon existing Newton-like methods. Moreover, we present two bias-adjusted one-step distributed estimators. When the square of the center-wise sample size is of greater order than the total number of centers, they are asymptotically as efficient as the global M-estimator. The advantages of our methods are illustrated by extensive theoretical and empirical evidence.
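To make the aggregation concrete, below is a minimal sketch, not the authors' implementation, of a one-step distributed Newton-type update for logistic regression: each center transmits only its local M-estimator and, in a second communication round, its local gradient and a plug-in Fisher information estimate evaluated at the averaged estimator. Averaging the local plug-in Fisher estimates, and all names and parameters here, are illustrative assumptions rather than the specific construction analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_mle(X, y, iters=25):
    """Local logistic-regression M-estimator via Newton iterations."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ beta)
        grad = X.T @ (p - y) / len(y)
        W = p * (1 - p)
        hess = (X * W[:, None]).T @ X / len(y)
        beta -= np.linalg.solve(hess, grad)
    return beta

def local_grad(X, y, beta):
    """Gradient of the local negative log-likelihood at beta."""
    return X.T @ (sigmoid(X @ beta) - y) / len(y)

def local_fisher(X, beta):
    """Local plug-in Fisher information estimate at beta."""
    p = sigmoid(X @ beta)
    W = p * (1 - p)
    return (X * W[:, None]).T @ X / len(X)

# Simulate K centers, each holding n i.i.d. samples (hypothetical setup).
K, n, d = 20, 500, 5
beta_true = rng.normal(size=d)
centers = []
for _ in range(K):
    X = rng.normal(size=(n, d))
    y = rng.binomial(1, sigmoid(X @ beta_true))
    centers.append((X, y))

# Round 1: each center sends its local M-estimator; the hub averages them.
beta_bar = np.mean([local_mle(X, y) for X, y in centers], axis=0)

# Round 2: each center sends its gradient and Fisher estimate at beta_bar.
grad_bar = np.mean([local_grad(X, y, beta_bar) for X, y in centers], axis=0)
info_bar = np.mean([local_fisher(X, beta_bar) for X, _ in centers], axis=0)

# One-step Newton-type update using only the aggregated statistics.
beta_one_step = beta_bar - np.linalg.solve(info_bar, grad_bar)
print("averaging error :", np.linalg.norm(beta_bar - beta_true))
print("one-step error  :", np.linalg.norm(beta_one_step - beta_true))
```

Running the sketch prints the estimation errors of the naively averaged estimator and of the one-step update; in the regime described above, where the square of the per-center sample size dominates the number of centers, the one-step correction is expected to close most of the gap to the global M-estimator.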
