Multiplicative Updates for Elastic Net Regularized Convolutional NMF Under β-Divergence

03/14/2018
by Pedro J. Villasana T., et al.

We generalize the convolutional NMF by taking the β-divergence as the loss function, add a regularizer for sparsity in the form of an elastic net, and provide multiplicative update rules for its factors in closed form. The new update rules embed the β-NMF, the standard convolutional NMF, and sparse coding (also known as basis pursuit) as special cases. We demonstrate that the originally published update rules for the convolutional NMF are suboptimal and that their convergence rate depends on the size of the kernel.
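The plain (non-convolutional) β-NMF case embedded in these update rules can be sketched as follows. This is a minimal illustration, not the paper's general convolutional algorithm: it uses the standard multiplicative updates for β-NMF, with hypothetical `l1` and `l2` parameters adding the elastic-net penalty on H as extra terms in the denominator of its update.

```python
import numpy as np

def beta_nmf_elastic(V, rank, beta=2.0, l1=0.0, l2=0.0, n_iter=200, seed=0):
    """Multiplicative updates for beta-NMF with an elastic-net penalty on H.

    A sketch of the plain beta-NMF special case only; the paper's closed-form
    rules additionally cover convolutional kernels.
    """
    rng = np.random.default_rng(seed)
    F, N = V.shape
    W = rng.random((F, rank)) + 1e-6
    H = rng.random((rank, N)) + 1e-6
    eps = 1e-12  # guards against division by zero
    for _ in range(n_iter):
        WH = W @ H + eps
        # H update: numerator W^T((WH)^(beta-2) * V),
        # denominator W^T(WH)^(beta-1) plus the elastic-net terms l1 + l2*H.
        H *= (W.T @ (WH ** (beta - 2) * V)) / (
            W.T @ WH ** (beta - 1) + l1 + l2 * H + eps)
        WH = W @ H + eps
        # W update: the unregularized multiplicative rule.
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)
    return W, H
```

With `beta=2` this reduces to Euclidean NMF, `beta=1` to the KL-divergence variant, and `beta=0` to Itakura-Saito; the multiplicative form keeps both factors nonnegative by construction.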


Related research

11/05/2018  Exact multiplicative updates for convolutional β-NMF in 2D
In this paper, we extend the β-CNMF to two dimensions and derive exact m...

10/30/2020  Multiplicative Updates for NMF with β-Divergences under Disjoint Equality Constraints
Nonnegative matrix factorization (NMF) is the problem of approximating a...

09/04/2016  A Unified Convergence Analysis of the Multiplicative Update Algorithm for Regularized Nonnegative Matrix Factorization
The multiplicative update (MU) algorithm has been extensively used to es...

03/05/2018  Relative Pairwise Relationship Constrained Non-negative Matrix Factorisation
Non-negative Matrix Factorisation (NMF) has been extensively used in mac...

03/02/2012  Fast learning rate of multiple kernel learning: Trade-off between sparsity and smoothness
We investigate the learning rate of multiple kernel learning (MKL) with ...

07/04/2019  Blind Audio Source Separation with Minimum-Volume Beta-Divergence NMF
Considering a mixed signal composed of various audio sources and recorde...
