Faster Rectangular Matrix Multiplication by Combination Loss Analysis

07/13/2023
by François Le Gall, et al.

Duan, Wu and Zhou (FOCS 2023) recently obtained an improved upper bound ω < 2.3719 on the exponent of square matrix multiplication by introducing a new approach that quantifies and compensates for the “combination loss” in prior analyses of powers of the Coppersmith-Winograd tensor. In this paper we show how to use this new approach to improve the exponent of rectangular matrix multiplication as well. Our main technical contribution is showing how to combine this analysis of the combination loss with the analysis of the fourth power of the Coppersmith-Winograd tensor in the context of rectangular matrix multiplication developed by Le Gall and Urrutia (SODA 2018).
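As context (not stated in the abstract itself), a minimal sketch of the standard definitions assumed here: ω denotes the exponent of square matrix multiplication, and ω(1,1,k) the exponent of multiplying an n×n matrix by an n×n^k matrix, i.e.

\[
\omega \;=\; \inf\bigl\{\tau \in \mathbb{R} \;:\; \text{two } n \times n \text{ matrices can be multiplied using } O(n^{\tau}) \text{ arithmetic operations}\bigr\},
\]
\[
\omega(1,1,k) \;=\; \inf\bigl\{\tau \in \mathbb{R} \;:\; \text{an } n \times n \text{ matrix and an } n \times n^{k} \text{ matrix can be multiplied using } O(n^{\tau}) \text{ arithmetic operations}\bigr\}.
\]

In particular ω = ω(1,1,1), so the Duan-Wu-Zhou bound ω < 2.3719 covers the square case, while this paper targets the rectangular exponents ω(1,1,k).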
