VertiBayes: Learning Bayesian network parameters from vertically partitioned data with missing values

10/31/2022
by Florian van Daalen, et al.

Federated learning makes it possible to train a machine learning model on decentralized data. Bayesian networks are probabilistic graphical models that have been widely used in artificial intelligence applications. Their popularity stems from the fact that they can be built by combining existing expert knowledge with data and that they are highly interpretable, which makes them useful for decision support, e.g., in healthcare. While some research has been published on the federated learning of Bayesian networks, publications on Bayesian networks in a vertically partitioned or heterogeneous data setting (where different variables are located in different datasets) are limited and suffer from important omissions, such as the handling of missing data. In this article, we propose a novel method called VertiBayes to train Bayesian networks (structure and parameters) on vertically partitioned data, which can handle missing values as well as an arbitrary number of parties. For structure learning, we adapt the widely used K2 algorithm with a privacy-preserving scalar product protocol. For parameter learning, we use a two-step approach: first, we learn an intermediate model using maximum likelihood, treating missing values as a special value; then we use the EM algorithm to train the final model on synthetic data generated from the intermediate model. The privacy guarantees of our approach are equivalent to those provided by the privacy-preserving scalar product protocol used. We experimentally show that our approach produces models comparable to those learnt using traditional algorithms, and we estimate the increase in complexity in terms of the number of samples, network size, and network complexity. Finally, we propose two alternative approaches to estimate the performance of the model using vertically partitioned data, and we show experimentally that they lead to reasonably accurate estimates.
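The two-step parameter-learning idea described in the abstract can be illustrated with a small, self-contained sketch. The snippet below is a toy illustration only, not the authors' implementation: it uses a hypothetical two-node network A → B with made-up data, treats the placeholder "?" as the special missing-value category in step 1, samples synthetic records from the intermediate model, and then runs a plain EM loop in step 2 that treats "?" as genuinely unobserved. The privacy-preserving machinery (the scalar product protocol and the vertical partitioning across parties) is omitted here.

```python
# Toy sketch of the two-step parameter learning on a hypothetical A -> B network.
# All variable names and data are made up for illustration.
# Step 1: maximum-likelihood estimation treating missing values ("?") as a
#         special category (the intermediate model).
# Step 2: sample synthetic records from the intermediate model and run EM,
#         now treating "?" as genuinely missing, to obtain the final CPT.

import random
from collections import Counter, defaultdict

random.seed(0)

A_VALUES, B_VALUES = ["a0", "a1"], ["b0", "b1"]
data = [("a0", "b0")] * 40 + [("a0", "?")] * 10 + \
       [("a1", "b1")] * 35 + [("a1", "b0")] * 5 + [("a1", "?")] * 10

# --- Step 1: intermediate model, "?" treated as an ordinary value ------------
counts_a = Counter(a for a, _ in data)
p_a = {a: c / len(data) for a, c in counts_a.items()}

counts_b = defaultdict(Counter)
for a, b in data:
    counts_b[a][b] += 1
p_b_given_a = {a: {b: c / sum(cnt.values()) for b, c in cnt.items()}
               for a, cnt in counts_b.items()}

# --- Generate synthetic data by forward sampling the intermediate model ------
def sample_from(dist):
    r, acc = random.random(), 0.0
    for value, prob in dist.items():
        acc += prob
        if r <= acc:
            return value
    return value  # guard against floating-point rounding

synthetic = []
for _ in range(5000):
    a = sample_from(p_a)
    synthetic.append((a, sample_from(p_b_given_a[a])))

# --- Step 2: EM on the synthetic data, "?" now means "unobserved" ------------
cpt = {a: {b: 1.0 / len(B_VALUES) for b in B_VALUES} for a in A_VALUES}
for _ in range(50):
    expected = {a: {b: 1e-9 for b in B_VALUES} for a in A_VALUES}
    for a, b in synthetic:
        if b == "?":                 # E-step: distribute the missing value
            for bv in B_VALUES:      # over B according to the current CPT
                expected[a][bv] += cpt[a][bv]
        else:
            expected[a][b] += 1.0
    for a in A_VALUES:               # M-step: renormalise the expected counts
        total = sum(expected[a].values())
        cpt[a] = {b: expected[a][b] / total for b in B_VALUES}

print("Final P(B | A):", cpt)
```

In this toy example the missingness is unrelated to B, so the EM loop converges to the conditional frequencies among the observed records; the sketch is only meant to show how the intermediate model and the synthetic data connect the two steps of the procedure.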
