Differentially Private Secure Multi-Party Computation for Federated Learning in Financial Applications

10/12/2020
by David Byrd, et al.

Federated Learning enables a population of clients, working with a trusted server, to collaboratively learn a shared machine learning model while keeping each client's data within its own local systems. This reduces the risk of exposing sensitive data, but it is still possible to reverse engineer information about a client's private data set from the communicated model parameters. Most federated learning systems therefore use differential privacy to introduce noise into the parameters. This adds uncertainty to any attempt to reveal private client data, but it also reduces the accuracy of the shared model, limiting the useful scale of privacy-preserving noise. A system can further reduce the coordinating server's ability to recover private client information, without additional accuracy loss, by also including secure multi-party computation. An approach combining both techniques is especially relevant to financial firms, as it allows new possibilities for collaborative learning without exposing sensitive client data. This could produce more accurate models for important tasks like optimal trade execution, credit origination, or fraud detection. The key contributions of this paper are threefold: we present a privacy-preserving federated learning protocol to a non-specialist audience, we demonstrate it using logistic regression on a real-world credit card fraud data set, and we evaluate it using an open-source simulation platform that we have adapted for the development of federated learning systems.
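To make the combination of the two techniques concrete, the sketch below shows one aggregation round in which each client trains a local logistic-regression update, adds Gaussian noise (the differential-privacy step), and then applies pairwise additive masks that cancel when the server sums the updates (a simple form of secure multi-party computation). All names, the noise scale, and the seed-agreement scheme are illustrative assumptions for this sketch, not the exact protocol described in the paper.

import numpy as np

# Illustrative sketch only: combines DP noise with pairwise additive masking.
# The masking seeds would in practice be agreed between client pairs
# (e.g. via a key exchange); here they are hard-coded for demonstration.

RNG = np.random.default_rng(0)
NUM_CLIENTS = 4
DIM = 8            # number of logistic-regression weights (assumed)
NOISE_STD = 0.1    # std of Gaussian noise each client adds (assumed DP scale)

def local_update(global_weights, X, y, lr=0.1):
    """One step of local logistic-regression gradient descent."""
    preds = 1.0 / (1.0 + np.exp(-(X @ global_weights)))
    grad = X.T @ (preds - y) / len(y)
    return global_weights - lr * grad

def mask_update(update, client_id, shared_seeds):
    """Add DP noise, then add pairwise masks that cancel in the server's sum."""
    noisy = update + RNG.normal(0.0, NOISE_STD, size=update.shape)
    masked = noisy.copy()
    for other_id, seed in shared_seeds.items():
        mask = np.random.default_rng(seed).normal(size=update.shape)
        # +mask for the lower client id, -mask for the higher: cancels in the sum.
        masked += mask if client_id < other_id else -mask
    return masked

# Seeds agreed between every pair of clients (illustrative values).
pair_seeds = {frozenset((i, j)): 1000 + 10 * i + j
              for i in range(NUM_CLIENTS) for j in range(i + 1, NUM_CLIENTS)}

global_w = np.zeros(DIM)
masked_updates = []
for cid in range(NUM_CLIENTS):
    X = RNG.normal(size=(32, DIM))           # stand-in for a client's local data
    y = (RNG.random(32) < 0.1).astype(float)  # stand-in fraud labels
    w_local = local_update(global_w, X, y)
    seeds = {other: pair_seeds[frozenset((cid, other))]
             for other in range(NUM_CLIENTS) if other != cid}
    masked_updates.append(mask_update(w_local, cid, seeds))

# The server averages the masked updates. The pairwise masks cancel, so it
# learns only the noisy average and never any individual client's update.
global_w = np.mean(masked_updates, axis=0)
print(global_w)

In this toy round, the server's view of any single client is protected twice: the masks hide each individual update, and the Gaussian noise limits what the aggregate itself can reveal about any one client's data.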


