S++: A Fast and Deployable Secure-Computation Framework for Privacy-Preserving Neural Network Training

01/28/2021
by Prashanthi Ramachandran, et al.

We introduce S++, a simple, robust, and deployable framework for training a neural network (NN) on private data from multiple sources via secret-shared secure function evaluation. In short, imagine a virtual third party to whom every data holder sends their inputs and which computes the neural network: in our case, this virtual third party is realized by a set of servers, each of which individually learns nothing, even in the presence of a malicious (but non-colluding) adversary. Previous work in this area has been limited to a single activation function, ReLU, rendering the approach impractical for many use cases. For the first time, we provide fast and verifiable protocols for all common activation functions, optimized to run in a secret-shared manner. The ability to quickly, verifiably, and robustly compute exponentiation, softmax, sigmoid, etc., allows previously written NNs to be used without modification, vastly reducing developer effort and code complexity. ReLU has been found to converge faster and to be more computationally efficient than saturating nonlinearities such as sigmoid and tanh. However, we argue that it would be remiss not to extend the mechanism to functions such as the logistic sigmoid, tanh, and softmax, which are fundamental for their ability to express outputs as probabilities and for their universal approximation property. Their role in RNNs and in several recent advances makes them all the more relevant.
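As a concrete illustration of the secret-sharing model described above, the sketch below shows additive secret sharing over a ring with fixed-point encoding: each data holder splits its input into shares that are individually uniformly random (so each server learns nothing on its own), and linear functions can be computed by the servers locally on their shares. This is a minimal sketch of the general technique, not S++'s actual protocol; the 64-bit ring, 16-bit fixed-point precision, and three-server setting are illustrative assumptions, and nonlinear activations such as sigmoid or softmax require dedicated interactive protocols of the kind the paper contributes.

```python
import secrets

# Illustrative parameters (assumptions, not S++'s actual choices):
RING = 2 ** 64    # arithmetic over the ring Z_{2^64}
SCALE = 2 ** 16   # fixed-point encoding with 16 fractional bits

def encode(x: float) -> int:
    """Encode a real number as a fixed-point ring element."""
    return int(round(x * SCALE)) % RING

def decode(v: int) -> float:
    """Decode a ring element back to a real number (two's-complement style)."""
    if v >= RING // 2:          # top half of the ring represents negatives
        v -= RING
    return v / SCALE

def share(v: int, n: int = 3) -> list[int]:
    """Split v into n additive shares; any n-1 of them are uniformly random."""
    shares = [secrets.randbelow(RING) for _ in range(n - 1)]
    shares.append((v - sum(shares)) % RING)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine all shares to recover the secret."""
    return sum(shares) % RING

# Each data holder shares its input across the servers. Addition (and any
# linear function) is computed share-wise, with no communication and with
# no single server ever seeing an input in the clear.
x_shares = share(encode(0.25))
y_shares = share(encode(1.50))
z_shares = [(xs + ys) % RING for xs, ys in zip(x_shares, y_shares)]
print(decode(reconstruct(z_shares)))   # -> 1.75
```

Multiplication, comparisons, and the activation functions discussed above cannot be computed locally on additive shares; they are exactly where secure protocols must exchange messages, which is why efficient secret-shared protocols for softmax, sigmoid, and exponentiation are the paper's focus.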


Related research

10/05/2022
Bicoptor: Two-round Secure Three-party Non-linear Computation without Preprocessing for Privacy-preserving Machine Learning
The overhead of non-linear functions dominates the performance of the se...

04/22/2021
CryptGPU: Fast Privacy-Preserving Machine Learning on the GPU
We introduce CryptGPU, a system for privacy-preserving machine learning ...

06/08/2020
ARIANN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing
We propose ARIANN, a low-interaction framework to perform private traini...

11/17/2022
Securer and Faster Privacy-Preserving Distributed Machine Learning
With the development of machine learning, it is difficult for a single s...

06/14/2023
Fast and Private Inference of Deep Neural Networks by Co-designing Activation Functions
Machine Learning as a Service (MLaaS) is an increasingly popular design ...

02/15/2016
Secure Approximation Guarantee for Cryptographically Private Empirical Risk Minimization
Privacy concern has been increasingly important in many machine learning...

06/13/2022
Deploying Convolutional Networks on Untrusted Platforms Using 2D Holographic Reduced Representations
Due to the computational cost of running inference for a neural network,...
