Newton Method over Networks is Fast up to the Statistical Precision

02/12/2021
by   Amir Daneshmand, et al.

We propose a distributed cubic regularization of the Newton method for solving (constrained) empirical risk minimization problems over a network of agents, modeled as an undirected graph. The algorithm employs an inexact, preconditioned Newton step at each agent's side: the gradient of the centralized loss is iteratively estimated via a gradient-tracking consensus mechanism, and the Hessian is subsampled over the local data sets. Thus, no Hessian matrices are exchanged over the network. We derive global complexity bounds for convex and strongly convex losses. Our analysis reveals an interesting interplay between sample and iteration/communication complexity: statistically accurate solutions are achievable in roughly the same number of iterations as the centralized cubic Newton method, with a communication cost per iteration of the order of 𝒪(1/√(1-ρ)), where ρ characterizes the connectivity of the network. This demonstrates a significant communication saving with respect to existing, statistically oblivious, distributed Newton-based methods over networks.
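To make the update pattern concrete, below is a minimal Python/NumPy sketch, not the authors' exact algorithm, of one synchronous round that combines (i) consensus averaging of the iterates, (ii) a cubically regularized local Newton step preconditioned by a Hessian subsampled from the local data, and (iii) gradient tracking of the network-wide gradient. The mixing matrix `W` (assumed doubly stochastic and matched to the graph), the cubic constant `M_cub`, the subsample size `batch`, and the bisection subsolver in `cubic_step` are illustrative assumptions rather than the paper's specification.

```python
import numpy as np

def local_grad(w, X, y, lam):
    """Gradient of an agent's regularized logistic loss at w (labels y in {0,1})."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y) + lam * w

def subsampled_hessian(w, X, y, lam, batch, rng):
    """Hessian of the local loss estimated on a random subsample of the local data."""
    idx = rng.choice(len(y), size=batch, replace=False)
    Xs = X[idx]
    p = 1.0 / (1.0 + np.exp(-Xs @ w))
    D = p * (1.0 - p)
    return (Xs * D[:, None]).T @ Xs / batch + lam * np.eye(X.shape[1])

def cubic_step(g, H, M_cub, tol=1e-8):
    """Minimize g^T d + 0.5 d^T H d + (M_cub/6)||d||^3 via bisection on r = ||d||."""
    def d_of(r):
        return np.linalg.solve(H + 0.5 * M_cub * r * np.eye(len(g)), -g)
    lo, hi = 0.0, 1.0
    while np.linalg.norm(d_of(hi)) > hi:      # bracket the fixed point ||d(r)|| = r
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if np.linalg.norm(d_of(mid)) > mid else (lo, mid)
    return d_of(hi)

def one_round(W, ws, trackers, grads_prev, data, lam, M_cub, batch, rng):
    """One synchronous round over all agents; only vectors cross the network."""
    n = len(ws)
    new_ws = []
    for i in range(n):
        w_mix = sum(W[i, j] * ws[j] for j in range(n))        # consensus on iterates
        H_i = subsampled_hessian(w_mix, *data[i], lam, batch, rng)
        new_ws.append(w_mix + cubic_step(trackers[i], H_i, M_cub))
    new_trackers, new_grads = [], []
    for i in range(n):
        g_new = local_grad(new_ws[i], *data[i], lam)
        t_mix = sum(W[i, j] * trackers[j] for j in range(n))  # consensus on trackers
        new_trackers.append(t_mix + g_new - grads_prev[i])    # gradient-tracking update
        new_grads.append(g_new)
    return new_ws, new_trackers, new_grads
```

Here `data[i]` holds agent i's local design matrix and labels, and the trackers are initialized to the local gradients at the starting point. Each round exchanges only d-dimensional vectors (iterates and gradient trackers) with neighbors, consistent with the point above that no Hessian matrices travel over the network.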

