AIR-Net: Adaptive and Implicit Regularization Neural Network for Matrix Completion

10/12/2021
by Zhemin Li, et al.

Conventionally, the matrix completion (MC) model aims to recover a matrix from partially observed entries. Accurate recovery requires a regularization that properly encodes the priors of the unknown matrix/signal. However, encoding priors accurately for complex natural signals is difficult, and even when it succeeds, the resulting model may not generalize well beyond the particular matrix type. This work combines adaptive and implicit low-rank regularization, capturing the prior dynamically from the matrix recovered so far. Furthermore, we aim to answer the question: how does adaptive regularization affect implicit regularization? We use neural networks to represent the Adaptive and Implicit Regularization and name the proposed model AIR-Net. Theoretical analysis shows that the adaptive part of AIR-Net enhances implicit regularization. In addition, the adaptive regularizer vanishes as training proceeds, which avoids saturation issues. Numerical experiments on various data demonstrate the effectiveness of AIR-Net, especially when the locations of missing entries are not chosen at random. With complete flexibility in choosing the neural network used to represent the matrix, AIR-Net can be extended to solve more general inverse problems.
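
The abstract does not spell out the architecture, so the following is only a minimal sketch of the general idea: implicit low-rank regularization from a deep (multi-factor) matrix parameterization, combined with an adaptive penalty that is re-estimated from the current recovery. The three-factor depth, the Laplacian-style adaptive term, and all hyperparameters below are illustrative assumptions, not the AIR-Net model from the paper.

```
# Sketch: matrix completion with implicit low-rank regularization from deep
# matrix factorization, plus a simple adaptive penalty rebuilt from the
# current estimate. Illustrative assumptions only, not the paper's AIR-Net.
import torch

torch.manual_seed(0)
n, rank = 50, 3
X_true = torch.randn(n, rank) @ torch.randn(rank, n)   # ground-truth low-rank matrix
mask = (torch.rand(n, n) < 0.4).float()                 # observed-entry indicator

# Deep factorization X = A @ B @ C: the extra depth is what induces the
# implicit bias toward low-rank solutions.
factors = [(1e-2 * torch.randn(n, n)).requires_grad_() for _ in range(3)]

def adaptive_laplacian(X, sigma=1.0):
    # Row-similarity graph built from the *current* (detached) estimate,
    # so the penalty adapts as the recovery improves.
    d2 = torch.cdist(X, X) ** 2
    W = torch.exp(-d2 / (2 * sigma ** 2))
    return torch.diag(W.sum(1)) - W          # graph Laplacian L = D - W

opt = torch.optim.Adam(factors, lr=1e-3)
for step in range(2000):
    X_hat = factors[0] @ factors[1] @ factors[2]
    data_fit = ((mask * (X_hat - X_true)) ** 2).sum() / mask.sum()
    L = adaptive_laplacian(X_hat.detach())
    adaptive_pen = torch.trace(X_hat.T @ L @ X_hat) / n ** 2
    loss = data_fit + 1e-3 * adaptive_pen
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    err = (X_hat - X_true).norm() / X_true.norm()
    print(f"relative recovery error: {err:.3f}")
```

Detaching the current estimate when building the Laplacian keeps the adaptive prior fixed within each gradient step, so it tracks, rather than drives, the recovery; this is one simple way to realize a regularizer that "captures the prior dynamically according to the current recovered matrix."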


