Federated Coordinate Descent for Privacy-Preserving Multiparty Linear Regression

09/16/2022
by Xinlin Leng, et al.

Distributed privacy-preserving regression schemes have been developed and extended in various fields, in which multiple parties collaboratively and privately run optimization algorithms such as Gradient Descent to learn a set of optimal parameters. However, traditional Gradient-Descent-based methods cannot handle problems whose objective functions contain L1 regularization, such as Lasso regression. In this paper, we present Federated Coordinate Descent (FCD), a new distributed scheme that addresses this issue securely in multiparty scenarios. Specifically, through secure aggregation and added perturbations, our scheme guarantees that (1) no local information is leaked to other parties, and (2) global model parameters are not exposed to cloud servers. The added perturbations can eventually be eliminated by each party, yielding a global model with high performance. We show that the FCD scheme fills the gap of multiparty secure Coordinate Descent methods and is applicable to general linear regressions, including linear, ridge, and Lasso regression. Theoretical security analysis and experimental results demonstrate that FCD runs effectively and efficiently, achieving mean absolute error (MAE) as low as centralized methods on the three types of linear regression over real-world UCI datasets.
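As an informal illustration of the kind of computation involved, the sketch below runs coordinate descent for Lasso over row-partitioned data: each party computes the per-coordinate sufficient statistics locally and contributes them through pairwise additive masks that cancel in the aggregate, a simplified stand-in for the secure aggregation and perturbation elimination described above. The partitioning, the masking scheme, and all function names (soft_threshold, masked_sum, federated_lasso_cd) are assumptions for illustration only, not the paper's actual FCD protocol.

import numpy as np

def soft_threshold(rho, lam):
    """Soft-thresholding operator used in the Lasso coordinate update."""
    if rho < -lam:
        return rho + lam
    if rho > lam:
        return rho - lam
    return 0.0

def masked_sum(values, rng):
    """Simulate secure aggregation: pairwise masks cancel in the sum, so only
    the aggregate of the local values is revealed, not any individual value."""
    n = len(values)
    masks = np.zeros(n)
    for i in range(n):
        for j in range(i + 1, n):
            r = rng.normal()
            masks[i] += r      # party i adds +r
            masks[j] -= r      # party j adds -r
    return float(np.sum(np.asarray(values) + masks))

def federated_lasso_cd(X_parts, y_parts, lam=0.1, n_iters=50, seed=0):
    """Lasso via cyclic coordinate descent over row-partitioned data.
    X_parts, y_parts: lists of per-party arrays (hypothetical partitioning)."""
    rng = np.random.default_rng(seed)
    d = X_parts[0].shape[1]
    w = np.zeros(d)
    for _ in range(n_iters):
        for j in range(d):
            # Each party k computes its local sufficient statistics ...
            rho_locals, z_locals = [], []
            for Xk, yk in zip(X_parts, y_parts):
                resid = yk - Xk @ w + w[j] * Xk[:, j]
                rho_locals.append(Xk[:, j] @ resid)
                z_locals.append(Xk[:, j] @ Xk[:, j])
            # ... and only the masked sums are aggregated.
            rho = masked_sum(rho_locals, rng)
            z = masked_sum(z_locals, rng)
            w[j] = soft_threshold(rho, lam) / z
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(90, 5))
    w_true = np.array([2.0, 0.0, -1.5, 0.0, 0.5])
    y = X @ w_true + 0.05 * rng.normal(size=90)
    # Split rows across three simulated parties.
    X_parts, y_parts = np.split(X, 3), np.split(y, 3)
    print(federated_lasso_cd(X_parts, y_parts, lam=0.5))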

Related research:

04/05/2020 - PrivFL: Practical Privacy-preserving Federated Regressions on High-dimensional Data over Mobile Networks
Federated Learning (FL) enables a large number of users to jointly learn...

01/26/2019 - A Practical Scheme for Two-Party Private Linear Least Squares
Privacy-preserving machine learning is learning from sensitive datasets ...

04/27/2021 - Confined Gradient Descent: Privacy-preserving Optimization for Federated Learning
Federated learning enables multiple participants to collaboratively trai...

06/02/2020 - Acceleration of Descent-based Optimization Algorithms via Carathéodory's Theorem
We propose a new technique to accelerate algorithms based on Gradient De...

11/08/2019 - Privacy-Preserving Generalized Linear Models using Distributed Block Coordinate Descent
Combining data from varied sources has considerable potential for knowle...

07/09/2022 - Federated Learning with Quantum Secure Aggregation
This article illustrates a novel Quantum Secure Aggregation (QSA) scheme...

10/02/2017 - Lasso Regularization Paths for NARMAX Models via Coordinate Descent
We propose a new algorithm for estimating NARMAX models with L1 regulari...
