EnclaveTree: Privacy-preserving Data Stream Training and Inference Using TEE

03/02/2022
by   Qifan Wang, et al.

Classification services over data streams are becoming an important offering for cloud providers, but users may hesitate to submit sensitive data due to privacy concerns. While Trusted Execution Environments (TEEs) are a promising solution for protecting private data, they remain vulnerable to side-channel attacks induced by data-dependent access patterns. We propose a Privacy-preserving Data Stream Training and Inference scheme, called EnclaveTree, that provides confidentiality for users' data and the target models against a compromised cloud service provider. We design a matrix-based training and inference procedure that trains the Hoeffding Tree (HT) model and performs inference with the trained model inside the trusted area of TEEs, provably preventing access-pattern-based attacks. The performance evaluation shows that EnclaveTree is practical for processing data streams with a small or medium number of features. With fewer than 63 binary features, EnclaveTree is up to 10× and 9× faster than a naïve oblivious solution for training and inference, respectively.
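At the core of Hoeffding Tree training is the Hoeffding bound, which decides, from the samples seen so far, whether the best split attribute is statistically better than the runner-up. The sketch below is an illustrative minimal implementation of that standard split test, not EnclaveTree's oblivious matrix-based procedure; the function names and parameters are our own.

```python
import math

def hoeffding_bound(value_range, delta, n):
    # Hoeffding bound: eps = sqrt(R^2 * ln(1/delta) / (2n)).
    # With probability 1 - delta, the true mean of a statistic bounded by
    # range R lies within eps of its empirical mean over n samples.
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def should_split(best_gain, second_gain, value_range, delta, n):
    # Split the leaf when the observed advantage of the best attribute
    # over the second-best exceeds the Hoeffding bound.
    return (best_gain - second_gain) > hoeffding_bound(value_range, delta, n)
```

For binary labels the range of information gain is R = log2(2) = 1; with delta = 1e-7 and n = 1000 observed samples, the bound is about 0.09, so a gain gap of 0.2 triggers a split while a gap of 0.05 does not. In EnclaveTree this decision, like all other tree operations, would need to run with data-independent memory accesses to avoid leaking the tree structure through the access pattern.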


