DarKnight: An Accelerated Framework for Privacy and Integrity Preserving Deep Learning Using Trusted Hardware

06/30/2022
by Hanieh Hashemi, et al.

Privacy and security-related concerns are growing as machine learning reaches diverse application domains. Data holders want to train or infer on private data while exploiting cloud-hosted accelerators such as GPUs. Cloud systems, however, are vulnerable to attackers who compromise the privacy of data and the integrity of computations. Tackling such a challenge requires unifying theoretical privacy algorithms with hardware security capabilities. This paper presents DarKnight, a framework for training large DNNs while protecting input privacy and computation integrity. DarKnight relies on cooperative execution between trusted execution environments (TEEs) and accelerators, where the TEE provides privacy and integrity verification while accelerators perform the bulk of the linear algebraic computation to optimize performance. In particular, DarKnight uses a customized data encoding strategy based on matrix masking to obfuscate inputs within a TEE. The obfuscated data is then offloaded to GPUs for fast linear algebraic computation. DarKnight's data obfuscation strategy provides provable data privacy and computation integrity on cloud servers. While prior works tackle only inference privacy and cannot be used for training, DarKnight's encoding scheme is designed to support both training and inference.
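To make the division of labor concrete, the sketch below illustrates how matrix-masking-style encoding lets an untrusted GPU evaluate a linear layer on obfuscated data while only the TEE can decode the result. It is a simplified illustration under assumed names, dimensions, and mixing-matrix choice, not DarKnight's actual protocol, which additionally folds in redundant coded computations for integrity verification.

```python
# Minimal, illustrative sketch of the matrix-masking idea, NOT the authors'
# implementation. Variable names, dimensions, and the orthogonal mixing
# matrix are assumptions made for clarity.
import numpy as np

rng = np.random.default_rng(0)

K, d_in, d_out = 3, 8, 4                      # K private inputs, layer sizes
W = rng.standard_normal((d_out, d_in))        # layer weights, visible to the GPU
X = rng.standard_normal((K, d_in))            # private inputs, kept inside the TEE

# --- Inside the TEE: encode (obfuscate) the inputs ------------------------
r = rng.standard_normal((1, d_in))            # random blinding vector
A, _ = np.linalg.qr(rng.standard_normal((K + 1, K + 1)))  # secret invertible mixing matrix
X_coded = A @ np.vstack([X, r])               # K+1 coded inputs leave the enclave

# --- On the untrusted GPU: bulk linear algebra on obfuscated data ---------
Y_coded = X_coded @ W.T                       # the GPU never sees X or r in the clear

# --- Back inside the TEE: decode the coded results ------------------------
Y = (np.linalg.inv(A) @ Y_coded)[:K]          # recovers W @ x_i for each private input
assert np.allclose(Y, X @ W.T, atol=1e-6)

# The paper's integrity check (not shown) uses redundant coded computations
# so the TEE can detect a misbehaving accelerator.
```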


Related research

05/01/2021  Privacy and Integrity Preserving Training Using Trusted Hardware
Privacy and security-related concerns are growing as machine learning re...

06/01/2020  DarKnight: A Data Privacy Scheme for Training and Inference of Deep Neural Networks
Protecting the privacy of input data is of growing importance as machine...

09/07/2022  SAGE: Software-based Attestation for GPU Execution
With the application of machine learning to security-critical and sensit...

10/04/2021  AsymML: An Asymmetric Decomposition Framework for Privacy-Preserving DNN Training and Inference
Leveraging parallel hardware (e.g. GPUs) to conduct deep neural network ...

05/05/2021  Byzantine-Robust and Privacy-Preserving Framework for FedML
Federated learning has emerged as a popular paradigm for collaboratively...

11/01/2022  Empowering Data Centers for Next Generation Trusted Computing
Modern data centers have grown beyond CPU nodes to provide domain-specif...
