
Confidential Machine Learning on Untrusted Platforms: A Survey

12/15/2020
by Sagar Sharma, et al.

With ever-growing data and the need to develop powerful machine learning models, data owners increasingly depend on untrusted platforms (e.g., public clouds, edges, and machine learning service providers). As a result, sensitive data and models become susceptible to unauthorized access, misuse, and privacy compromises. Recently, a body of research has emerged on training machine learning models over encrypted outsourced data on untrusted platforms. In this survey, we summarize the studies in this emerging area within a unified framework that highlights the major challenges and approaches. We focus on cryptographic approaches for confidential machine learning (CML), while also covering other directions such as perturbation-based approaches and CML in hardware-assisted confidential computing environments. The discussion takes a holistic view, considering the rich context of related threat models, security assumptions, attacks, design philosophies, and the associated trade-offs among data utility, cost, and confidentiality.
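To make the perturbation-based direction mentioned above concrete, the sketch below shows a common technique from that family: releasing a noisy aggregate by clamping inputs to bound sensitivity and adding calibrated Laplace noise. This is a minimal illustration of the general idea, not a method taken from the survey; all function and parameter names here are assumptions for the example.

```python
import random


def laplace_noise(scale, rng=random):
    # Sample Laplace(0, scale) as the difference of two
    # exponential draws (a standard identity).
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)


def private_mean(values, lower, upper, epsilon, rng=random):
    """Release a mean with Laplace-perturbed output.

    Clamping each value to [lower, upper] bounds the influence of
    any single record, so the sensitivity of the mean is
    (upper - lower) / n; noise is then scaled to sensitivity/epsilon.
    """
    clamped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clamped) / len(clamped)
    sensitivity = (upper - lower) / len(clamped)
    return true_mean + laplace_noise(sensitivity / epsilon, rng)
```

A smaller `epsilon` means a larger noise scale and therefore stronger confidentiality at the cost of utility, which is exactly the utility/confidentiality trade-off the survey discusses.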
