Accelerated Algorithms for Nonlinear Matrix Decomposition with the ReLU function

05/15/2023
by Giovanni Seraghiti, et al.

In this paper, we study the following nonlinear matrix decomposition (NMD) problem: given a sparse nonnegative matrix X, find a low-rank matrix Θ such that X ≈ f(Θ), where f is an element-wise nonlinear function. We focus on the case where f(·) = max(0, ·), the rectified linear unit (ReLU) nonlinear activation. We refer to the corresponding problem as ReLU-NMD. We first provide a brief overview of the existing approaches developed to tackle ReLU-NMD. Then we introduce two new algorithms: (1) aggressive accelerated NMD (A-NMD), which uses an adaptive Nesterov extrapolation to accelerate an existing algorithm, and (2) three-block NMD (3B-NMD), which parametrizes Θ = WH and leads to a significant reduction in the computational cost. We also propose an effective initialization strategy based on the nuclear norm as a proxy for the rank function. We illustrate the effectiveness of the proposed algorithms (available on gitlab) on synthetic and real-world data sets.
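As a rough illustration of the problem setup, the sketch below shows the ReLU-NMD residual ‖X − max(0, WH)‖_F and a naive latent-variable alternating scheme under the Θ = WH parametrization. This is only a minimal sketch assuming a standard latent-variable formulation; the names `W`, `H`, `Z`, and `naive_relu_nmd`, as well as the exact update order, are illustrative assumptions and not the authors' A-NMD or 3B-NMD algorithms.

```python
# Minimal sketch of ReLU-NMD with Theta = W H (illustrative, not the paper's algorithm).
import numpy as np

def relu_nmd_residual(X, W, H):
    """Frobenius-norm error of the fit X ~ max(0, W @ H)."""
    return np.linalg.norm(X - np.maximum(0, W @ H), "fro")

def naive_relu_nmd(X, r, iters=200, seed=0):
    """Naive alternating scheme for X ~ max(0, W @ H) with rank r (assumed formulation)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.standard_normal((m, r))
    H = rng.standard_normal((r, n))
    for _ in range(iters):
        Theta = W @ H
        # Latent matrix Z: matches X on its positive support, and is
        # nonpositive (so the ReLU maps it to zero) where X is zero.
        Z = np.where(X > 0, X, np.minimum(Theta, 0))
        # Alternating least-squares updates of the low-rank factors.
        W = Z @ np.linalg.pinv(H)
        H = np.linalg.pinv(W) @ Z
    return W, H

# Usage on a small synthetic instance X = max(0, A B) with rank 5.
rng = np.random.default_rng(1)
A, B = rng.standard_normal((50, 5)), rng.standard_normal((5, 60))
X = np.maximum(0, A @ B)
W, H = naive_relu_nmd(X, r=5)
print(relu_nmd_residual(X, W, H))
```

The paper's contributions modify this basic pattern: A-NMD adds adaptive Nesterov extrapolation on top of an existing iteration, and 3B-NMD works directly with the factors W and H to cut the per-iteration cost.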
