Dropout as a Low-Rank Regularizer for Matrix Factorization

10/13/2017
by Jacopo Cavazza, et al.

Regularization for matrix factorization (MF) and approximation problems has been carried out in many different ways. Owing to its popularity in deep learning, dropout has also been applied to this class of problems. Despite its solid empirical performance, the theoretical properties of dropout as a regularizer remain elusive in this setting. In this paper, we present a theoretical analysis of dropout for MF, where Bernoulli random variables are used to drop columns of the factors. We demonstrate that dropout is equivalent to a fully deterministic model for MF in which the factors are regularized by the sum over columns of the product of their squared Euclidean norms. Additionally, we examine the case of a variable-sized factorization and prove that dropout attains the global minimum of a convex approximation problem with (squared) nuclear-norm regularization. We conclude that dropout can be used as a low-rank regularizer with data-dependent singular-value thresholding.
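The equivalence stated above can be checked numerically: with a keep probability p and the factorization rescaled by 1/p, the expected dropout objective equals the plain squared error plus ((1-p)/p) times the sum over columns of the product of squared column norms. The sketch below (dimensions, the value of p, and all variable names are illustrative assumptions, not taken from the paper) enumerates all Bernoulli masks for a small factorization and compares the exact expectation with the deterministic regularized objective.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
d, n, r = 5, 4, 3   # small sizes chosen for illustration
p = 0.5             # Bernoulli keep probability (assumed value)

X = rng.standard_normal((d, n))
U = rng.standard_normal((d, r))
V = rng.standard_normal((n, r))

# Exact expected dropout objective: enumerate all 2^r column masks z,
# weighting each by its Bernoulli probability. The factorization is
# rescaled by 1/p so it is unbiased in expectation.
exact = 0.0
for mask in product([0, 1], repeat=r):
    z = np.array(mask)
    prob = np.prod(np.where(z == 1, p, 1 - p))
    exact += prob * np.linalg.norm(X - (U * z) @ V.T / p, 'fro') ** 2

# Deterministic equivalent: squared error plus the sum over columns of
# the product of squared Euclidean column norms, weighted by (1-p)/p.
penalty = sum(np.sum(U[:, j] ** 2) * np.sum(V[:, j] ** 2) for j in range(r))
det = np.linalg.norm(X - U @ V.T, 'fro') ** 2 + (1 - p) / p * penalty

print(exact, det)  # the two objectives coincide up to floating-point error
```

Because the mask acts on whole columns, each rank-one term u_j v_j^T is kept or dropped jointly, which is what makes the cross terms vanish and yields the product-of-norms penalty.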

Related research:

10/10/2017 - An Analysis of Dropout for Matrix Factorization
  Dropout is a simple yet effective algorithm for regularizing neural netw...

05/28/2019 - On Dropout and Nuclear Norm Regularization
  We give a formal and complete characterization of the explicit regulariz...

10/30/2019 - On the Regularization Properties of Structured Dropout
  Dropout and its extensions (eg. DropBlock and DropConnect) are popular h...

03/06/2020 - Dropout: Explicit Forms and Capacity Control
  We investigate the capacity control provided by dropout in various machi...

05/05/2020 - Adaptive Low-Rank Factorization to regularize shallow and deep neural networks
  The overfitting is one of the cursing subjects in the deep learning fiel...

01/08/2020 - A Group Norm Regularized LRR Factorization Model for Spectral Clustering
  Spectral clustering is a very important and classic graph clustering met...

01/23/2022 - Weight Expansion: A New Perspective on Dropout and Generalization
  While dropout is known to be a successful regularization technique, insi...
