Fast Differentiable Matrix Square Root and Inverse Square Root

01/29/2022
by Yue Song, et al.

Computing the matrix square root and its inverse in a differentiable manner is important in a variety of computer vision tasks. Previous methods either adopt the Singular Value Decomposition (SVD) to explicitly factorize the matrix or use the Newton-Schulz iteration (NS iteration) to derive an approximate solution. However, neither method is computationally efficient enough in both the forward and the backward pass. In this paper, we propose two more efficient variants to compute the differentiable matrix square root and inverse square root. For the forward propagation, one method uses a Matrix Taylor Polynomial (MTP), and the other uses Matrix Padé Approximants (MPA). The backward gradient is computed by iteratively solving the continuous-time Lyapunov equation using the matrix sign function. A series of numerical tests shows that both methods yield considerable speed-ups compared with the SVD or the NS iteration. Moreover, we validate the effectiveness of our methods in several real-world applications, including de-correlated batch normalization, second-order vision transformer, global covariance pooling for large-scale and fine-grained recognition, attentive covariance pooling for video recognition, and neural style transfer. The experimental results demonstrate that our methods can also achieve competitive or even slightly better performance. The PyTorch implementation is available at https://github.com/KingJamesSong/FastDifferentiableMatSqrt.
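To make the forward pass concrete, below is a minimal PyTorch sketch of the MTP idea. It is an illustration under stated assumptions, not the paper's exact implementation: the function name `mtp_matrix_sqrt`, the Frobenius-norm scaling, and the default truncation degree are illustrative choices. The input is assumed symmetric positive definite, so that Z = I - A/||A||_F is a contraction and the binomial series of (I - Z)^{1/2} converges; the inverse square root admits the same treatment with the series of (I - Z)^{-1/2}.

```python
import torch

def mtp_matrix_sqrt(A: torch.Tensor, degree: int = 8) -> torch.Tensor:
    """Truncated Matrix Taylor Polynomial (MTP) for the matrix square root.

    Illustrative sketch: normalize A so that Z = I - A/||A||_F is a
    contraction, sum the binomial series of (I - Z)^{1/2} up to `degree`,
    then rescale. Assumes A is symmetric positive definite.
    """
    norm = torch.linalg.matrix_norm(A, ord="fro")
    I = torch.eye(A.shape[-1], dtype=A.dtype, device=A.device)
    Z = I - A / norm
    # Binomial series: (1 - x)^{1/2} = sum_k binom(1/2, k) (-x)^k
    coeff, term, total = 1.0, I.clone(), I.clone()
    for k in range(1, degree + 1):
        coeff = coeff * (0.5 - (k - 1)) / k    # binom(1/2, k)
        term = term @ Z                        # Z^k
        total = total + ((-1) ** k) * coeff * term
    return torch.sqrt(norm) * total

# Usage: the residual ||S @ S - A|| shrinks as `degree` grows.
W = torch.randn(32, 32, dtype=torch.float64)
A = W @ W.T / 32 + torch.eye(32, dtype=torch.float64)  # well-conditioned SPD
S = mtp_matrix_sqrt(A, degree=16)
print(torch.dist(S @ S, A).item())
```

For the backward pass, the abstract states that the gradient solves a continuous-time Lyapunov equation via the matrix sign function. A hypothetical sketch of that idea, again with illustrative names and a Frobenius-norm scaling heuristic: if S = A^{1/2} and G = dL/dS is the upstream gradient, then dL/dA is the solution X of S X + X S = G, which can be read off the sign of a block embedding.

```python
def lyapunov_sign_solve(B: torch.Tensor, C: torch.Tensor,
                        iters: int = 12) -> torch.Tensor:
    """Solve B X + X B = C via the matrix sign function (illustrative).

    Embed (B, C) in M = [[B, C], [0, -B]]; then sign(M) = [[I, 2X], [0, -I]],
    so Newton-Schulz iterations on a scaled M recover X from the upper-right
    block. Assumes B is symmetric positive definite.
    """
    n = B.shape[-1]
    M = torch.zeros(2 * n, 2 * n, dtype=B.dtype, device=B.device)
    M[:n, :n], M[:n, n:], M[n:, n:] = B, C, -B
    Y = M / torch.linalg.matrix_norm(M, ord="fro")  # scaling heuristic for NS
    I2 = torch.eye(2 * n, dtype=B.dtype, device=B.device)
    for _ in range(iters):
        Y = 0.5 * Y @ (3.0 * I2 - Y @ Y)  # Newton-Schulz sign iteration
    return 0.5 * Y[:n, n:]

# Backward of the square root: with S = A^{1/2} and upstream grad G = dL/dS,
# dL/dA = lyapunov_sign_solve(S, G).
```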
