Fast Approximation of Rotations and Hessian Matrices

04/29/2014
by Michael Mathieu, et al.

A new method to represent and approximate rotation matrices is introduced. The method represents approximations of a rotation matrix Q with linearithmic complexity, i.e. with (1/2)·n·log2(n) rotations over pairs of coordinates, arranged in an FFT-like fashion. The approximation is "learned" using gradient descent. It makes it possible to represent symmetric matrices H as QDQ^T, where D is a diagonal matrix. The method can be used to approximate the covariance matrices of Gaussian models in order to speed up inference, or to estimate and track the inverse Hessian of an objective function by relating changes in parameters to changes in gradients along the trajectory followed by the optimization procedure. Experiments were conducted to approximate synthetic matrices, covariance matrices of real data, and Hessian matrices of objective functions involved in machine learning problems.
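
As an illustration of the structure described in the abstract, the following is a minimal NumPy sketch (not the authors' code) of a butterfly-parameterized rotation: Q is a product of (1/2)·n·log2(n) pairwise rotations whose coordinate pairs follow an FFT-like pattern, and a symmetric matrix is applied as Q D Q^T. The pairing scheme, function names, and the random-parameter example are illustrative assumptions; in the paper, the angles and the diagonal would be fitted by gradient descent to match a target matrix.

import numpy as np

def butterfly_pairs(n):
    """Yield, stage by stage, the coordinate pairs of an FFT-like butterfly.
    n must be a power of two; each stage contains n/2 pairs."""
    stages = int(np.log2(n))
    for s in range(stages):
        step = 1 << s
        pairs = []
        for block in range(0, n, 2 * step):
            for i in range(block, block + step):
                pairs.append((i, i + step))
        yield pairs  # n/2 pairs per stage -> (n/2)*log2(n) rotations in total

def apply_Q(x, angles):
    """Apply the product of pairwise (Givens) rotations to x in O(n log n)."""
    x = np.asarray(x, dtype=float).copy()
    k = 0
    for pairs in butterfly_pairs(len(x)):
        for i, j in pairs:
            c, s = np.cos(angles[k]), np.sin(angles[k])
            xi, xj = x[i], x[j]
            x[i], x[j] = c * xi - s * xj, s * xi + c * xj
            k += 1
    return x

def apply_Qt(x, angles):
    """Apply Q^T: the same rotations, transposed, in reverse order."""
    x = np.asarray(x, dtype=float).copy()
    flat = [p for pairs in butterfly_pairs(len(x)) for p in pairs]
    for k in reversed(range(len(flat))):
        i, j = flat[k]
        c, s = np.cos(angles[k]), np.sin(angles[k])
        xi, xj = x[i], x[j]
        x[i], x[j] = c * xi + s * xj, -s * xi + c * xj
    return x

def apply_H(x, angles, diag):
    """Approximate H x as Q D Q^T x using only the angles and the diagonal of D."""
    return apply_Q(diag * apply_Qt(x, angles), angles)

# Example with random parameters (hypothetical; the real parameters would be learned).
n = 8
angles = np.random.randn((n // 2) * int(np.log2(n)))
diag = np.random.randn(n)
x = np.random.randn(n)
print(apply_H(x, angles, diag))

Because both Q and Q^T are applied as a sequence of (1/2)·n·log2(n) two-coordinate rotations, a matrix-vector product with the approximated H costs O(n log n) instead of O(n^2).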

