
Differentially Private Algorithms for Learning Mixtures of Separated Gaussians

by Gautam Kamath et al.

Learning the parameters of a Gaussian mixture model is a fundamental and widely studied problem with numerous applications. In this work, we give new algorithms for learning the parameters of a high-dimensional, well-separated Gaussian mixture model subject to the strong constraint of differential privacy. In particular, we give a differentially private analogue of the algorithm of Achlioptas and McSherry. Our algorithm has two key properties not achieved by prior work: (1) the algorithm's sample complexity matches that of the corresponding non-private algorithm up to lower-order terms in a wide range of parameters; (2) the algorithm does not require strong a priori bounds on the parameters of the mixture components.
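The abstract does not spell out the paper's algorithm, but differentially private parameter estimators of this kind are typically built from primitives like the Gaussian mechanism: clip each sample to bound the sensitivity of the empirical mean, then add calibrated Gaussian noise. The sketch below illustrates that standard building block only (it is not the paper's method); the function name, clipping scheme, and parameter choices are illustrative assumptions.

```python
import numpy as np

def dp_mean(samples, clip_norm, epsilon, delta, rng=None):
    """(epsilon, delta)-DP estimate of the mean via the Gaussian mechanism.

    Illustrative sketch, not the paper's algorithm. Each sample is clipped
    to L2 norm `clip_norm`, which bounds the sensitivity of the empirical
    mean; Gaussian noise is then calibrated to that sensitivity.
    """
    rng = np.random.default_rng() if rng is None else rng
    X = np.asarray(samples, dtype=float)
    # Clip each row to L2 norm at most clip_norm.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    X = X * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    n, d = X.shape
    # Replacing one clipped sample changes the mean by at most 2*clip_norm/n.
    sensitivity = 2.0 * clip_norm / n
    # Standard Gaussian-mechanism noise scale (valid for epsilon <= 1).
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return X.mean(axis=0) + rng.normal(0.0, sigma, size=d)
```

With many samples the added noise scales as 1/n, so the private estimate concentrates around the true mean; a full private GMM learner would combine such primitives with a (private) clustering step to separate the components before estimating each one.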


Privately Learning High-Dimensional Distributions

We design nearly optimal differentially private algorithms for learning ...

Private Center Points and Learning of Halfspaces

We present a private learner for halfspaces over an arbitrary finite dom...

Private Estimation with Public Data

We initiate the study of differentially private (DP) estimation with acc...

Robust and Private Learning of Halfspaces

In this work, we study the trade-off between differential privacy and ad...

Privacy-preserving Prediction

Ensuring differential privacy of models learned from sensitive user data...

Archimedes Meets Privacy: On Privately Estimating Quantiles in High Dimensions Under Minimal Assumptions

The last few years have seen a surge of work on high dimensional statist...

Differentially Private Learning of Hawkes Processes

Hawkes processes have recently gained increasing attention from the mach...