Regularized EM Algorithms: A Unified Framework and Statistical Guarantees

11/27/2015
by Xinyang Yi et al.

Latent variable models are a fundamental modeling tool in machine learning applications, but they present significant computational and analytical challenges. The EM algorithm and its variants are among the most widely used algorithmic tools for such models, yet our rigorous understanding of their performance is highly incomplete. Recently, Balakrishnan et al. (2014) demonstrated that for an important class of problems, EM exhibits linear local convergence. In the high-dimensional setting, however, the M-step may not be well defined. We address precisely this setting through a unified treatment based on regularization. While regularization for high-dimensional problems is by now well understood, the iterative EM algorithm requires a careful balancing of making progress towards the solution while identifying the right structure (e.g., sparsity or low rank). In particular, regularizing the M-step using state-of-the-art high-dimensional prescriptions (e.g., Wainwright (2014)) is not guaranteed to provide this balance. Our algorithm and analysis are linked in a way that reveals the balance between optimization and statistical errors. We specialize our general framework to sparse Gaussian mixture models, high-dimensional mixed regression, and regression with missing variables, obtaining statistical guarantees for each of these examples.
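To make the idea concrete, the sketch below shows one way a regularized EM iteration can look for the simplest of the three examples, a symmetric two-component sparse Gaussian mixture. The function names (regularized_em_gmm, soft_threshold), the geometric decay schedule for the regularization level, and all default parameter values are illustrative assumptions rather than the paper's exact prescription; the point is only to show the E-step, an l1-regularized M-step, and a regularization parameter that decays toward a floor of statistical order.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid


def soft_threshold(v, lam):
    """Elementwise soft-thresholding: the proximal operator of the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)


def regularized_em_gmm(Y, beta0, sigma=1.0, lam0=1.0, lam_stat=0.1,
                       kappa=0.7, n_iters=50):
    """Illustrative regularized EM for a symmetric sparse Gaussian mixture,
    y ~ 0.5*N(beta, sigma^2 I) + 0.5*N(-beta, sigma^2 I) with beta sparse.

    Y        : (n, p) array of observations.
    beta0    : initial estimate of the mean vector (assumed suitably close).
    lam0     : initial regularization level (optimization-error scale).
    lam_stat : floor of statistical order, e.g. ~ sqrt(log(p)/n).
    kappa    : geometric decay factor for the regularization level.
    """
    beta = beta0.copy()
    for t in range(n_iters):
        # E-step: responsibility of the +beta component for each sample,
        # P(z_i = +1 | y_i, beta) = sigmoid(2 <y_i, beta> / sigma^2).
        w = expit(2.0 * (Y @ beta) / sigma ** 2)
        # Unregularized M-step: a weighted sample mean (2w - 1 lies in [-1, 1]).
        m = ((2.0 * w - 1.0)[:, None] * Y).mean(axis=0)
        # Regularized M-step: soft-threshold at the current level lam_t,
        # i.e. add an l1 penalty to the (quadratic) M-step objective.
        lam_t = kappa ** t * lam0 + lam_stat
        beta = soft_threshold(m, lam_t)
    return beta


# Toy usage: sparse true mean, moderate dimension, coarse initialization.
rng = np.random.default_rng(0)
n, p, s = 500, 200, 5
beta_star = np.zeros(p)
beta_star[:s] = 2.0
z = rng.choice([-1.0, 1.0], size=n)
Y = z[:, None] * beta_star + rng.normal(size=(n, p))
beta_hat = regularized_em_gmm(Y, beta_star + 0.5 * rng.normal(size=p),
                              lam_stat=np.sqrt(np.log(p) / n))
```

The decaying schedule reflects the balance discussed above: early iterations use a larger penalty because the optimization error still dominates, while the floor term is kept at the scale of the statistical error (roughly sqrt(log(p)/n) in the sparse setting).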

Related research

Global Convergence of EM Algorithm for Mixtures of Two Component Linear Regression (10/12/2018)
The Expectation-Maximization algorithm is perhaps the most broadly used ...

The use of the EM algorithm for regularization problems in high-dimensional linear mixed-effects models (08/03/2023)
The EM algorithm is a popular tool for maximum likelihood estimation but...

Statistical guarantees for the EM algorithm: From population to sample-based analysis (08/09/2014)
We develop a general framework for proving rigorous guarantees on the pe...

Statistical Guarantees for Estimating the Centers of a Two-component Gaussian Mixture by EM (08/07/2016)
Recently, a general method for analyzing the statistical accuracy of the...

Differentially Private (Gradient) Expectation Maximization Algorithm with Statistical Guarantees (10/22/2020)
(Gradient) Expectation Maximization (EM) is a widely used algorithm for ...

Horseshoe Regularization for Machine Learning in Complex and Deep Models (04/24/2019)
Since the advent of the horseshoe priors for regularization, global-loca...

A Unified Algorithmic Framework for Multi-Dimensional Scaling (03/02/2010)
In this paper, we propose a unified algorithmic framework for solving ma...
