Privacy and Integrity Preserving Training Using Trusted Hardware

05/01/2021
by Hanieh Hashemi, et al.

Privacy and security-related concerns are growing as machine learning reaches diverse application domains. Data holders want to train on private data while exploiting cloud-hosted accelerators such as GPUs. However, cloud systems are vulnerable to attackers that compromise the privacy of data and the integrity of computations. This work presents DarKnight, a framework for training large DNNs while protecting input privacy and computation integrity. DarKnight relies on cooperative execution between a trusted execution environment (TEE) and accelerators: the TEE provides privacy and integrity verification, while the accelerators perform the computation-heavy linear algebraic operations.
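The abstract does not spell out DarKnight's actual protocol, but the division of labor it describes can be illustrated with two generic building blocks: additive blinding (the TEE masks private inputs with random noise before offloading, then removes the noise term afterward, exploiting linearity) and a Freivalds-style probabilistic check (the TEE verifies the offloaded matrix product with a cheap matrix-vector pass). The NumPy sketch below is a hypothetical illustration of these standard techniques, not DarKnight's published scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer weights (known to the untrusted GPU) and a private input batch.
W = rng.standard_normal((4, 6))
X = rng.standard_normal((6, 8))      # private inputs, visible only inside the TEE

# --- Inside the TEE: blind the batch with random noise R before offloading ---
R = rng.standard_normal((6, 8))
X_blinded = X + R                    # this is all the untrusted GPU observes

# --- On the untrusted GPU: heavy linear algebra on blinded data ---
Y_blinded = W @ X_blinded

# --- Back inside the TEE: unblind using linearity, W@(X+R) = W@X + W@R ---
Y = Y_blinded - W @ R

# --- TEE-side integrity check (Freivalds-style): verify the returned
# product with one cheap matrix-vector multiplication per side.
v = rng.integers(0, 2, size=8).astype(float)
assert np.allclose(W @ (X_blinded @ v), Y_blinded @ v)

# The recovered result matches the plaintext computation.
assert np.allclose(Y, W @ X)
```

The key point is that the GPU never sees `X`, and the TEE's own work (generating `R`, computing `W @ R`, and the spot check) is far cheaper than the full forward pass it offloads.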


