Matrix factorization with neural networks

12/05/2022
by Francesco Camilli et al.

Matrix factorization is an important mathematical problem encountered in the contexts of dictionary learning, recommendation systems and machine learning. We introduce a new "decimation" scheme that maps it to neural-network models of associative memory, and we provide a detailed theoretical analysis of its performance, showing that decimation can factorize extensive-rank matrices and denoise them efficiently. We also introduce a decimation algorithm based on a ground-state search of the neural network, whose performance matches the theoretical predictions.
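The abstract's idea of retrieving factors one at a time as ground states of an associative-memory network can be illustrated with a toy sketch. This is a minimal illustration under stated assumptions, not the paper's actual algorithm: the sizes, the Hebbian-style construction of the matrix, and the greedy zero-temperature search with random restarts below are all simplifying choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 3  # illustrative sizes: N spins, P planted binary patterns

# Planted +/-1 patterns and the symmetric matrix to factorize; it has the
# form of a Hebbian coupling matrix, as in associative-memory models.
patterns = rng.choice([-1.0, 1.0], size=(P, N))
J = patterns.T @ patterns / N

def ground_state(J, rng, n_restarts=20, n_sweeps=50):
    """Greedy zero-temperature search for a low-energy configuration of the
    Hopfield energy H(s) = -1/2 s^T J s, using asynchronous spin updates."""
    best_s, best_e = None, np.inf
    n = J.shape[0]
    for _ in range(n_restarts):
        s = rng.choice([-1.0, 1.0], size=n)
        for _ in range(n_sweeps):
            changed = False
            for i in range(n):
                h = J[i] @ s - J[i, i] * s[i]  # local field, no self-coupling
                new = 1.0 if h >= 0 else -1.0
                if new != s[i]:
                    s[i], changed = new, True
            if not changed:  # reached a local energy minimum
                break
        e = -0.5 * s @ J @ s
        if e < best_e:
            best_s, best_e = s, e
    return best_s

# Decimation: retrieve one factor as a ground state, subtract its
# rank-one contribution from the matrix, and repeat.
J_work = J.copy()
estimates = []
for _ in range(P):
    s = ground_state(J_work, rng)
    estimates.append(s)
    J_work = J_work - np.outer(s, s) / N

# Overlap of each estimate with each planted pattern (up to a global sign);
# at this low load each pattern should be recovered almost exactly.
overlaps = np.abs(np.array(estimates) @ patterns.T) / N
```

At this small load (P/N = 0.015) the planted patterns are deep, stable minima of the energy, so each decimation round retrieves one of them and the subtraction removes its basin before the next round.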


Related research

- 07/31/2023, The Decimation Scheme for Symmetric Matrix Factorization: "Matrix factorization is an inference problem that has acquired importanc..."
- 04/16/2018, Binary Matrix Factorization via Dictionary Learning: "Matrix factorization is a key tool in data analysis; its applications in..."
- 11/13/2013, Sparse Matrix Factorization: "We investigate the problem of factorizing a matrix into several sparse m..."
- 02/13/2021, Learning low-rank latent mesoscale structures in networks: "It is common to use networks to encode the architecture of interactions ..."
- 09/01/2016, Understanding Trainable Sparse Coding via Matrix Factorization: "Sparse coding is a core building block in many data analysis and machine..."
- 06/02/2017, Understanding the Learned Iterative Soft Thresholding Algorithm with matrix factorization: "Sparse coding is a core building block in many data analysis and machine..."
- 03/02/2015, A Hebbian/Anti-Hebbian Network for Online Sparse Dictionary Learning Derived from Symmetric Matrix Factorization: "Olshausen and Field (OF) proposed that neural computations in the primar..."
