Simple, Fast, and Flexible Framework for Matrix Completion with Infinite Width Neural Networks

Matrix completion problems arise in many applications including recommendation systems, computer vision, and genomics. Increasingly large neural networks have been successful in many of these applications, but at considerable computational cost. Remarkably, taking the width of a neural network to infinity allows for improved computational performance. In this work, we develop an infinite width neural network framework for matrix completion that is simple, fast, and flexible. Simplicity and speed come from the connection between the infinite width limit of neural networks and kernels known as neural tangent kernels (NTKs). In particular, we derive the NTK for fully connected and convolutional neural networks for matrix completion. The flexibility stems from a feature prior, which allows encoding relationships between coordinates of the target matrix, akin to semi-supervised learning. The effectiveness of our framework is demonstrated through competitive results for virtual drug screening and image inpainting/reconstruction. We also provide an implementation in Python to make our framework accessible on standard hardware to a broad audience.
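To make the approach concrete, below is a minimal sketch of NTK-based matrix completion via kernel ridge regression, not the paper's released implementation. It assumes the closed-form NTK of a one-hidden-layer ReLU network and represents each entry (i, j) by concatenating row and column feature vectors, which is where a feature prior would enter; the names `ntk_relu` and `complete_matrix` and the ridge term are illustrative choices, not the paper's API.

```python
import numpy as np

def ntk_relu(X, Z):
    """Closed-form NTK of an infinite-width ReLU network with one hidden
    layer (both layers trained), up to constant factors.
    X: (n, d), Z: (m, d); returns the (n, m) kernel matrix."""
    nx = np.linalg.norm(X, axis=1, keepdims=True)        # (n, 1)
    nz = np.linalg.norm(Z, axis=1, keepdims=True)        # (m, 1)
    u = np.clip((X @ Z.T) / (nx * nz.T), -1.0, 1.0)      # cosine similarity
    theta = np.arccos(u)
    k0 = (np.pi - theta) / np.pi                         # arc-cosine kernel, order 0
    k1 = (np.sin(theta) + (np.pi - theta) * u) / np.pi   # arc-cosine kernel, order 1
    return (nx * nz.T) * (u * k0 + k1)

def complete_matrix(M, mask, row_feats, col_feats, ridge=1e-6):
    """Fill in M by kernel ridge regression on the observed entries.
    Entry (i, j) is embedded as [row_feats[i], col_feats[j]];
    mask is True where M is observed."""
    rows, cols = np.indices(M.shape)
    X = np.hstack([row_feats[rows.ravel()], col_feats[cols.ravel()]])
    obs = mask.ravel()
    K_tt = ntk_relu(X[obs], X[obs])                      # train-train kernel
    K_at = ntk_relu(X, X[obs])                           # all-train kernel
    alpha = np.linalg.solve(K_tt + ridge * np.eye(obs.sum()), M.ravel()[obs])
    return (K_at @ alpha).reshape(M.shape)

# Toy usage: recover a rank-1 matrix with one-hot row/column features.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(20), rng.standard_normal(15))
mask = rng.random(M.shape) < 0.5
M_hat = complete_matrix(M, mask, np.eye(20), np.eye(15))
print("RMSE on missing entries:", np.sqrt(np.mean((M_hat - M)[~mask] ** 2)))
```

With one-hot features this reduces to pure collaborative filtering; swapping in side information (e.g., chemical descriptors for drug screening or pixel coordinates for inpainting) is the analogue of the feature prior described above.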

Related research

- Deep Maxout Network Gaussian Process (08/08/2022): Study of neural networks with infinite width is important for better und...
- Finiteness of fibers in matrix completion via Plücker coordinates (02/12/2020): Let Ω⊆{1,...,m}×{1,...,n}. We consider fibers of coordinate projections ...
- Extendable Neural Matrix Completion (05/13/2018): Matrix completion is one of the key problems in signal processing and ma...
- Geometric Matrix Completion with Recurrent Multi-Graph Neural Networks (04/22/2017): Matrix completion models are among the most common formulations of recom...
- On the infinite width limit of neural networks with a standard parameterization (01/21/2020): There are currently two parameterizations used to derive fixed kernels c...
- Geometric Matrix Completion via Sylvester Multi-Graph Neural Network (06/19/2022): Despite the success of the Sylvester equation empowered methods on vario...
- Deep Linear Networks for Matrix Completion – An Infinite Depth Limit (10/22/2022): The deep linear network (DLN) is a model for implicit regularization in ...
