Privacy and Integrity Preserving Training Using Trusted Hardware

05/01/2021
by   Hanieh Hashemi, et al.

Privacy and security-related concerns are growing as machine learning reaches diverse application domains. Data holders want to train on private data while exploiting accelerators, such as GPUs, hosted in the cloud. However, cloud systems are vulnerable to attackers who compromise the privacy of data and the integrity of computations. This work presents DarKnight, a framework for training large DNNs while protecting input privacy and computation integrity. DarKnight relies on cooperative execution between a trusted execution environment (TEE) and accelerators: the TEE provides privacy and integrity verification, while the accelerators perform the computation-heavy linear algebraic operations.
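The core idea behind this style of TEE/accelerator split is that linear operations commute with linear blinding: the TEE can mix private inputs with random noise via an invertible matrix, hand only the blinded data to the untrusted GPU, and invert the mixing on the GPU's output to recover the true results. The following is a minimal NumPy sketch of that blinding pattern, not DarKnight's exact encoding; all names and the specific mixing scheme here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d, m = 3, 8, 4                      # batch size, input dim, output dim

X = rng.standard_normal((K, d))        # private inputs (held inside the TEE)
W = rng.standard_normal((m, d))        # linear-layer weights (assumed public)

# TEE: append a random noise row and mix with a random invertible matrix A,
# so every blinded row is a combination of all inputs plus noise.
r = rng.standard_normal((1, d))
S = np.vstack([X, r])                  # (K+1, d) stacked inputs + noise
A = rng.standard_normal((K + 1, K + 1))
X_blind = A @ S                        # the only data the untrusted GPU sees

# GPU: performs the heavy linear algebra on blinded data only.
Y_blind = X_blind @ W.T                # (K+1, m)

# TEE: unblind; because the layer is linear, A^{-1} recovers true outputs.
Y = (np.linalg.inv(A) @ Y_blind)[:K]

assert np.allclose(Y, X @ W.T)         # matches the unprotected computation
```

In the actual framework the TEE would additionally verify the returned results (e.g., via redundant encoded computations) to detect a misbehaving accelerator; this sketch shows only the privacy side of the offload.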


Related research:

06/30/2022 · DarKnight: An Accelerated Framework for Privacy and Integrity Preserving Deep Learning Using Trusted Hardware
Privacy and security-related concerns are growing as machine learning re...

09/07/2022 · SAGE: Software-based Attestation for GPU Execution
With the application of machine learning to security-critical and sensit...

02/26/2019 · PubSub-SGX: Exploiting Trusted Execution Environments for Privacy-Preserving Publish/Subscribe Systems
This paper presents PUBSUB-SGX, a content-based publish-subscribe system...

06/01/2020 · DarKnight: A Data Privacy Scheme for Training and Inference of Deep Neural Networks
Protecting the privacy of input data is of growing importance as machine...

10/04/2021 · AsymML: An Asymmetric Decomposition Framework for Privacy-Preserving DNN Training and Inference
Leveraging parallel hardware (e.g. GPUs) to conduct deep neural network ...

02/21/2020 · Practical Verification of MapReduce Computation Integrity via Partial Re-execution
Big data processing is often outsourced to powerful, but untrusted cloud...