
Differentially Private Algorithms for Learning Mixtures of Separated Gaussians

09/09/2019
by Gautam Kamath, et al.

Learning the parameters of a Gaussian mixture model is a fundamental and widely studied problem with numerous applications. In this work, we give new algorithms for learning the parameters of a high-dimensional, well-separated Gaussian mixture model subject to the strong constraint of differential privacy. In particular, we give a differentially private analogue of the algorithm of Achlioptas and McSherry. Our algorithm has two key properties not achieved by prior work: (1) the algorithm's sample complexity matches that of the corresponding non-private algorithm up to lower-order terms in a wide range of parameters, and (2) the algorithm does not require strong a priori bounds on the parameters of the mixture components.
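For context, the (non-private) approach of Achlioptas and McSherry projects the data onto a low-dimensional subspace before clustering, which is enough to separate well-separated components. The sketch below is only an illustrative, non-private baseline of that spectral idea, not the paper's algorithm; the paper's contribution is a differentially private analogue, and the privacy mechanisms are not shown here. Function and variable names are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def spectral_gmm_estimate(X, k):
    """Non-private sketch: project onto the top-k principal subspace,
    cluster in low dimension, then estimate each component's parameters
    from its cluster in the original space."""
    low_dim = PCA(n_components=k).fit_transform(X)
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(low_dim)
    weights = np.bincount(labels, minlength=k) / len(X)
    means = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    covs = np.array([np.cov(X[labels == j], rowvar=False) for j in range(k)])
    return weights, means, covs

# Toy usage: two well-separated spherical Gaussians in 10 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=+5.0, scale=1.0, size=(500, 10)),
    rng.normal(loc=-5.0, scale=1.0, size=(500, 10)),
])
weights, means, covs = spectral_gmm_estimate(X, k=2)
print(weights, means[:, :3])
```

A private version of such a pipeline would additionally need to bound each data point's influence and add calibrated noise to every released quantity, which is where the sample-complexity and boundedness questions addressed in the paper arise.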


05/01/2018

Privately Learning High-Dimensional Distributions

We design nearly optimal differentially private algorithms for learning ...
02/27/2019

Private Center Points and Learning of Halfspaces

We present a private learner for halfspaces over an arbitrary finite dom...
08/16/2022

Private Estimation with Public Data

We initiate the study of differentially private (DP) estimation with acc...
11/30/2020

Robust and Private Learning of Halfspaces

In this work, we study the trade-off between differential privacy and ad...
03/27/2018

Privacy-preserving Prediction

Ensuring differential privacy of models learned from sensitive user data...
08/15/2022

Archimedes Meets Privacy: On Privately Estimating Quantiles in High Dimensions Under Minimal Assumptions

The last few years have seen a surge of work on high dimensional statist...
07/27/2022

Differentially Private Learning of Hawkes Processes

Hawkes processes have recently gained increasing attention from the mach...