SecureGBM: Secure Multi-Party Gradient Boosting

11/27/2019
by Zhi Feng, et al.

Federated machine learning systems have been widely used to facilitate joint data analytics across distributed datasets owned by different parties that do not trust each other. In this paper, we propose SecureGBM, a novel Gradient Boosting Machines (GBM) framework built on a multi-party computation model based on semi-homomorphic encryption, in which the involved parties jointly obtain a shared gradient boosting model while protecting their own data from potential privacy leakage and inferential identification. More specifically, our work focuses on a "dual-party" secure learning scenario: each of the two parties owns a unique view (i.e., a set of attributes or features) of the same group of samples, while only one party owns the labels, and neither feature nor label data may be shared with the other party. To achieve this goal, we first extend LightGBM, a well-known implementation of tree-based GBM, by covering its key training and inference operations with SEAL homomorphic encryption schemes. However, the performance of this re-implementation is severely bottlenecked by the explosive inflation of communication payloads, since ciphertext size grows with the length of the plaintexts. We therefore propose stochastic approximation techniques that reduce the communication payloads while accelerating the overall training procedure in a statistical manner. Our experiments on real-world data show that SecureGBM secures the communication and computation of the LightGBM training and inference procedures for both parties while losing less than 3% AUC, using the same number of boosting iterations, on a wide range of benchmark datasets.
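To make the dual-party setting concrete, the sketch below illustrates the kind of encrypted split-finding exchange such a framework relies on: the label-holding party encrypts per-instance gradients and hessians, the feature-holding party aggregates the ciphertexts per histogram bucket of its private feature, and the label holder decrypts only the aggregates to score candidate splits. This is an illustrative approximation rather than the paper's method: it substitutes a Paillier scheme (via the python-paillier `phe` package) for the SEAL-based implementation described above, and the bucketing and second-order gain formula are standard GBDT choices, not SecureGBM's exact protocol.

# Illustrative sketch (not the paper's implementation): one dual-party
# split-finding round for vertically partitioned gradient boosting.
# Party B holds the labels and computes per-instance gradients; Party A
# holds a private feature and aggregates encrypted gradients per histogram
# bucket, so Party B can score candidate splits without seeing feature values.
import numpy as np
from phe import paillier  # Paillier stands in for the paper's SEAL-based scheme

rng = np.random.default_rng(0)
n = 200

# ---- Party B (label holder) ----
y = rng.integers(0, 2, size=n).astype(float)
pred = np.zeros(n)                       # current boosting prediction (logit)
p = 1.0 / (1.0 + np.exp(-pred))
grad = p - y                             # logistic-loss gradients
hess = p * (1.0 - p)                     # logistic-loss hessians

pub, priv = paillier.generate_paillier_keypair(n_length=1024)
enc_grad = [pub.encrypt(float(g)) for g in grad]   # ciphertexts sent to Party A
enc_hess = [pub.encrypt(float(h)) for h in hess]

# ---- Party A (feature holder, no labels) ----
feature = rng.normal(size=n)             # private feature column
n_buckets = 8
edges = np.quantile(feature, np.linspace(0, 1, n_buckets + 1)[1:-1])
bucket = np.digitize(feature, edges)

# Aggregate encrypted gradient/hessian sums per bucket (addition on ciphertexts).
bucket_g = [sum((enc_grad[i] for i in np.where(bucket == b)[0]), pub.encrypt(0.0))
            for b in range(n_buckets)]
bucket_h = [sum((enc_hess[i] for i in np.where(bucket == b)[0]), pub.encrypt(0.0))
            for b in range(n_buckets)]

# ---- Party B: decrypt only the aggregates and score candidate splits ----
G = np.array([priv.decrypt(c) for c in bucket_g])
H = np.array([priv.decrypt(c) for c in bucket_h])
lam = 1.0                                # L2 regularization on leaf weights
best = max(
    (
        (G[:k + 1].sum() ** 2) / (H[:k + 1].sum() + lam)
        + (G[k + 1:].sum() ** 2) / (H[k + 1:].sum() + lam)
        - (G.sum() ** 2) / (H.sum() + lam),
        k,
    )
    for k in range(n_buckets - 1)
)
print("best split gain %.4f at bucket boundary %d" % best)

The per-instance ciphertexts exchanged in rounds like this are exactly the communication payloads the abstract identifies as the bottleneck, which is what motivates the stochastic approximation used to shrink them while keeping the training statistically sound.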

