Privacy-Preserving Generalized Linear Models using Distributed Block Coordinate Descent

11/08/2019
by Erik-Jan van Kesteren, et al.

Combining data from varied sources has considerable potential for knowledge discovery: collaborating data parties can mine data in an expanded feature space, allowing them to explore a larger range of scientific questions. However, data sharing among different parties is highly restricted by legal conditions, ethical concerns, and/or data volume. Fueled by these concerns, the fields of cryptography and distributed learning have made great progress towards privacy-preserving and distributed data mining. However, practical implementations have been hampered by the limited scope or computational complexity of these methods. In this paper, we greatly extend the range of analyses available for vertically partitioned data, i.e., data collected by separate parties with different features on the same subjects. To this end, we present a novel approach for privacy-preserving generalized linear models, a fundamental and powerful framework underlying many prediction and classification procedures. We base our method on a distributed block coordinate descent algorithm to obtain parameter estimates, and we develop an extension to compute accurate standard errors without additional communication cost. We critically evaluate the information transfer for semi-honest collaborators and show that our protocol is secure against data reconstruction. Through both simulated and real-world examples we illustrate the functionality of our proposed algorithm. Without leaking information, our method performs as well on vertically partitioned data as existing methods on combined data, all within mere minutes of computation time. We conclude that our method is a viable approach for vertically partitioned data analysis with a wide range of real-world applications.
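The abstract describes estimating a generalized linear model by distributed block coordinate descent: each party updates only the coefficients of its own feature block while treating the other parties' contributions to the linear predictor as a fixed offset. The sketch below illustrates that idea for logistic regression on two toy feature blocks. The party names, toy data, and plain exchange of partial linear predictors are assumptions made purely for illustration; the published protocol additionally addresses secure communication and exact standard-error computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vertically partitioned data: two parties hold different feature
# blocks on the same 200 subjects; the binary outcome is known to both.
n = 200
X_a = rng.normal(size=(n, 3))   # feature block held by party A (hypothetical)
X_b = rng.normal(size=(n, 2))   # feature block held by party B (hypothetical)
beta_true = np.array([1.0, -0.5, 0.25, 0.8, -1.2])
eta_true = np.column_stack([X_a, X_b]) @ beta_true
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta_true)))

def block_update(X, y, offset, beta, n_newton=5):
    """Newton-Raphson update of one party's coefficient block, holding the
    other party's contribution to the linear predictor (offset) fixed."""
    for _ in range(n_newton):
        eta = offset + X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))      # logistic mean function
        w = mu * (1.0 - mu)                  # IRLS weights
        grad = X.T @ (y - mu)
        hess = X.T @ (X * w[:, None])
        beta = beta + np.linalg.solve(hess, grad)
    return beta

beta_a = np.zeros(X_a.shape[1])
beta_b = np.zeros(X_b.shape[1])

# Block coordinate descent: parties alternate updates, exchanging only an
# n-vector of partial linear predictors rather than raw feature columns.
for _ in range(20):
    beta_a = block_update(X_a, y, X_b @ beta_b, beta_a)
    beta_b = block_update(X_b, y, X_a @ beta_a, beta_b)

print("estimated:", np.round(np.concatenate([beta_a, beta_b]), 2))
print("true:     ", beta_true)
```

In this toy setting the alternating block updates recover coefficients close to the data-generating values, which is the behavior the paper reports for its privacy-preserving variant on vertically partitioned data.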


Related research

- FedV: Privacy-Preserving Federated Learning over Vertically Partitioned Data (03/05/2021). Federated learning (FL) has been proposed to allow collaborative trainin...
- Free Lunch for Privacy Preserving Distributed Graph Learning (05/18/2023). Learning on graphs is becoming prevalent in a wide range of applications...
- Preserving Differential Privacy Between Features in Distributed Estimation (03/01/2017). Privacy is crucial in many applications of machine learning. Legal, ethi...
- Towards Vertical Privacy-Preserving Symbolic Regression via Secure Multiparty Computation (07/22/2023). Symbolic Regression is a powerful data-driven technique that searches fo...
- Distributed Coordinate Descent for Generalized Linear Models with Regularization (11/07/2016). Generalized linear model with L_1 and L_2 regularization is a widely use...
- Federated Coordinate Descent for Privacy-Preserving Multiparty Linear Regression (09/16/2022). Distributed privacy-preserving regression schemes have been developed an...
- Privacy-Preserving and Lossless Distributed Estimation of High-Dimensional Generalized Additive Mixed Models (10/14/2022). Various privacy-preserving frameworks that respect the individual's priv...
