Distributed Averaging Methods for Randomized Second Order Optimization

02/16/2020
by Burak Bartan, et al.

We consider distributed optimization problems where forming the Hessian is computationally challenging and communication is a significant bottleneck. We develop unbiased parameter averaging methods for randomized second order optimization that employ sampling and sketching of the Hessian. Existing works do not take the bias of the estimators into consideration, which limits their application to massively parallel computation. We provide closed-form formulas for regularization parameters and step sizes that provably minimize the bias for sketched Newton directions. We also extend the framework of second order averaging methods to introduce an unbiased distributed optimization framework for heterogeneous computing systems with varying worker resources. Additionally, we demonstrate the implications of our theoretical findings via large scale experiments performed on a serverless computing platform.
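To make the averaging idea concrete, the sketch below shows one way such a scheme could look for a least-squares objective: each worker forms a Gaussian-sketched Hessian estimate, solves a regularized Newton system against the exact gradient, and the driver averages the resulting directions. This is a minimal illustration, not the paper's algorithm: the function names (`sketched_newton_direction`, `averaged_newton_step`) and the values of the regularization `lam` and step size `step` are assumptions; the paper's contribution is the closed-form, bias-minimizing choice of those quantities, which is not reproduced here.

```python
import numpy as np

def sketched_newton_direction(A, b, x, m, lam, rng):
    """One worker's estimate of the Newton direction for 0.5*||Ax - b||^2,
    using a Gaussian sketch of size m and a ridge-style regularizer lam
    (placeholder value; the paper derives a bias-minimizing choice)."""
    n, d = A.shape
    g = A.T @ (A @ x - b)                          # exact gradient
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketch matrix
    SA = S @ A
    H_hat = SA.T @ SA                              # sketched Hessian estimate
    return np.linalg.solve(H_hat + lam * np.eye(d), -g)

def averaged_newton_step(A, b, x, q, m, lam, step, seed=0):
    """Average q independent sketched Newton directions (one per worker)
    and take a damped step; 'step' stands in for the paper's step size."""
    rng = np.random.default_rng(seed)
    dirs = [sketched_newton_direction(A, b, x, m, lam, rng) for _ in range(q)]
    return x + step * np.mean(dirs, axis=0)

# Toy usage on a random least-squares instance.
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 50))
b = rng.standard_normal(2000)
x = np.zeros(50)
for _ in range(10):
    x = averaged_newton_step(A, b, x, q=8, m=200, lam=1e-1, step=1.0)
print(np.linalg.norm(A.T @ (A @ x - b)))           # gradient norm should shrink
```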


