Fast optimization of common basis for matrix set through Common Singular Value Decomposition

04/18/2022
by Jarek Duda, et al.

SVD (singular value decomposition) is one of the basic tools of machine learning, allowing one to optimize a basis for a given matrix. However, sometimes we instead have a set of matrices {A_k}_k and would like to optimize a single common basis for them: find orthogonal matrices U, V such that the set of matrices {U^T A_k V}_k is somehow simpler. For example, DCT-II is an orthonormal basis of functions commonly used in image/video compression; as discussed here, this kind of basis can be quickly and automatically optimized for a given dataset. While the gradient descent optimization also discussed here can be computationally costly, we propose CSVD (common SVD): a fast general approach based on SVD. Specifically, we choose U as built of eigenvectors of ∑_k (w_k)^q (A_k A_k^T)^p and V of eigenvectors of ∑_k (w_k)^q (A_k^T A_k)^p, where w_k are their weights and p, q > 0 are some chosen powers, e.g. 1/2, optionally with normalization, e.g. A → A - rc^T where r_i = ∑_j A_ij, c_j = ∑_i A_ij.
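A minimal NumPy sketch of this construction, based only on the formulas in the abstract; the function name csvd_common_basis and the defaults p = q = 1/2 are illustrative assumptions, and the optional mean normalization is omitted:

```python
import numpy as np

def csvd_common_basis(matrices, weights=None, p=0.5, q=0.5):
    """CSVD sketch: common orthogonal basis pair (U, V) for matrices {A_k}.

    U = eigenvectors of sum_k w_k^q (A_k A_k^T)^p
    V = eigenvectors of sum_k w_k^q (A_k^T A_k)^p
    """
    if weights is None:
        weights = np.ones(len(matrices))
    m, n = matrices[0].shape
    SU = np.zeros((m, m))
    SV = np.zeros((n, n))
    for A, w in zip(matrices, weights):
        # Fractional matrix powers (A A^T)^p, (A^T A)^p via eigendecomposition
        # of the symmetric PSD Gram matrices (eigenvalues clipped at 0 to
        # avoid NaN from tiny negative values due to rounding).
        lu, Qu = np.linalg.eigh(A @ A.T)
        lv, Qv = np.linalg.eigh(A.T @ A)
        SU += w**q * (Qu * np.clip(lu, 0, None)**p) @ Qu.T
        SV += w**q * (Qv * np.clip(lv, 0, None)**p) @ Qv.T
    # Columns of U, V are eigenvectors, reordered to decreasing eigenvalue.
    _, U = np.linalg.eigh(SU)
    _, V = np.linalg.eigh(SV)
    return U[:, ::-1], V[:, ::-1]
```

With U, V in hand, each U.T @ A_k @ V is the representation of A_k in the common basis; if the construction works as intended, its energy should concentrate toward the top-left corner, analogously to the singular values of a per-matrix SVD.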

