Second order stochastic gradient update for Cholesky factor in Gaussian variational approximation from Stein's Lemma
In stochastic variational inference, the reparametrization trick for the multivariate Gaussian yields efficient updates for the mean and the Cholesky factor of the covariance matrix, which depend on the first-order derivative of the log joint model density. In this article, we show that Stein's Lemma yields an alternative unbiased gradient estimate for the Cholesky factor, one that depends instead on the second-order derivative of the log joint model density. The resulting second-order stochastic gradient update for the Cholesky factor can improve convergence, since its variance is lower than that of the first-order update and becomes almost negligible close to the mode. We also derive a second-order update for the Cholesky factor of the precision matrix, which is useful when the precision matrix has a sparse structure reflecting conditional independence in the true posterior distribution. Our results can further be used to obtain second-order natural gradient updates for the Cholesky factor, which are more robust than updates based on Euclidean gradients.
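To make the contrast concrete, below is a minimal sketch (not the authors' code) of the two unbiased single-sample gradient estimates for the Cholesky factor C in q(theta) = N(mu, C C^T), under the standard reparametrization theta = mu + C z with z ~ N(0, I). The first-order estimate is tril(grad log p(y, theta) z^T) + tril(C^{-T}); the Stein's-Lemma variant replaces grad log p(y, theta) z^T with hess log p(y, theta) C, which has the same expectation. The function `log_joint` and the toy Gaussian target are illustrative assumptions, not part of the paper.

```python
# A minimal sketch, assuming a toy standard-Gaussian log joint density.
import jax
import jax.numpy as jnp


def log_joint(theta):
    # Hypothetical stand-in for log p(y, theta): standard Gaussian,
    # up to an additive constant.
    return -0.5 * jnp.sum(theta ** 2)


def first_order_grad_C(mu, C, z):
    # Reparametrization-trick estimate:
    #   tril( grad log p(theta) z^T ) + tril(C^{-T})
    theta = mu + C @ z
    g = jax.grad(log_joint)(theta)
    entropy_term = jnp.tril(jnp.linalg.inv(C).T)  # = diag(1 / diag(C))
    return jnp.tril(jnp.outer(g, z)) + entropy_term


def second_order_grad_C(mu, C, z):
    # Stein's-Lemma estimate: E[grad log p(theta) z^T] = E[hess log p(theta) C],
    # so the outer product is replaced by a Hessian-times-Cholesky product.
    theta = mu + C @ z
    H = jax.hessian(log_joint)(theta)
    entropy_term = jnp.tril(jnp.linalg.inv(C).T)
    return jnp.tril(H @ C) + entropy_term


key = jax.random.PRNGKey(0)
d = 3
mu, C = jnp.zeros(d), jnp.eye(d)
z = jax.random.normal(key, (d,))
print(first_order_grad_C(mu, C, z))   # fluctuates with z
print(second_order_grad_C(mu, C, z))  # exactly zero here: hess log p C = -C^{-T}
```

In this toy setting q matches the target exactly, so hess log p = -(C C^T)^{-1} and the Stein-type estimate cancels the entropy term identically, while the first-order estimate still varies with z; this illustrates the near-zero variance of the second-order update close to the mode.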