A view of Estimation of Distribution Algorithms through the lens of Expectation-Maximization

05/24/2019
by David H. Brookes et al.

We show that under mild conditions, Estimation of Distribution Algorithms (EDAs) can be written as a form of variational Expectation-Maximization (EM) that uses a mixture of weighted particles as the approximate posterior. In the infinite-particle limit, EDAs can be viewed as exact EM. Because EM sits on a rigorous statistical foundation and has been thoroughly analyzed, this connection provides a coherent framework with which to reason about EDAs. Importantly, the connection also suggests avenues for possible improvements to EDAs, owing to our ability to leverage both general and EM-specific statistical tools and generalizations. For example, we use known results about EM convergence to propose an adaptive, hybrid EDA-gradient descent algorithm; this hybrid demonstrates better performance than either of its components on several canonical, non-convex test functions. We also demonstrate empirically that although one might hypothesize that reducing the variational gap would prove useful, it actually degrades the performance of EDAs. Finally, we show that the equivalence between EM and EDAs provides a new perspective on why EDAs perform approximate natural gradient descent.
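To make the correspondence concrete, here is a minimal sketch (not the authors' code) of one iteration of a basic Gaussian EDA, annotated with the EM interpretation described above: weighting the sampled particles plays the role of a variational E-step, and refitting the search distribution plays the role of an M-step. The function names, the truncation (elite) weighting scheme, and the Rastrigin test problem are illustrative assumptions, not details taken from the paper.

```python
# Sketch: one Gaussian EDA iteration, read as a variational EM step.
import numpy as np

def eda_em_step(mu, sigma, objective, n_particles=100, elite_frac=0.2, rng=None):
    """One EDA iteration on a minimization problem, viewed as variational EM."""
    rng = np.random.default_rng() if rng is None else rng

    # Sample particles from the current search distribution q(x | mu, sigma).
    particles = rng.normal(mu, sigma, size=(n_particles, mu.shape[0]))

    # "E-step": weight the particles. Truncation (elite) weights are one
    # common EDA choice; other weighting schemes fit the same template.
    scores = np.array([objective(p) for p in particles])
    n_elite = max(1, int(elite_frac * n_particles))
    elite = particles[np.argsort(scores)[:n_elite]]

    # "M-step": refit the search distribution to the weighted particle mixture
    # (maximum-likelihood update for a factorized Gaussian).
    return elite.mean(axis=0), elite.std(axis=0) + 1e-8

# Usage example on a canonical non-convex test function (Rastrigin).
def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

mu, sigma = np.full(2, 3.0), np.ones(2)
for _ in range(50):
    mu, sigma = eda_em_step(mu, sigma, rastrigin)
print(mu)  # should move toward the global minimum at the origin
```

The adaptive EDA-gradient hybrid proposed in the paper goes beyond this basic loop; the sketch only illustrates how a standard EDA iteration decomposes into E- and M-steps.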
