Learning Sparse Visual Representations with Leaky Capped Norm Regularizers

11/08/2017
by Jianqiao Wangni, et al.

Sparsity-inducing regularization is an important component of learning over-complete visual representations. Despite the popularity of ℓ_1 regularization, in this paper we investigate non-convex regularizers for this problem. Our contribution is threefold. First, we propose the leaky capped norm regularizer (LCNR), which regularizes model weights below a certain threshold more strongly than those above it, thereby imposing strong sparsity while introducing only a controllable estimation bias. We propose a majorization-minimization algorithm to optimize the joint objective function. Second, our experiments on monocular 3D shape recovery and on neural networks show that LCNR outperforms ℓ_1 and other non-convex regularizers, achieving state-of-the-art performance and faster convergence. Third, we prove a global convergence rate for the 3D recovery problem. To the best of our knowledge, this is the first convergence analysis of the 3D recovery problem.
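The abstract describes the penalty qualitatively: weights below a threshold are penalized more strongly than weights above it, and a majorization-minimization (MM) scheme optimizes the resulting non-convex objective. A minimal sketch of one plausible form is below; the exact functional form, the parameter names (`lam`, `theta`, `leak`), and the MM linearization step are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def leaky_capped_l1(w, lam=1.0, theta=0.5, leak=0.1):
    """Hypothetical leaky capped-l1 penalty (illustrative, not the paper's
    exact definition): coordinates with |w| < theta pay the full slope lam,
    while coordinates past the cap pay only a small 'leaky' slope leak*lam,
    so large weights incur a bounded, controllable estimation bias."""
    a = np.abs(w)
    return np.where(a < theta,
                    lam * a,
                    lam * theta + leak * lam * (a - theta))

def mm_weights(w, lam=1.0, theta=0.5, leak=0.1):
    """One MM reweighting step: linearize the concave penalty at the current
    iterate, giving per-coordinate l1 weights for the next convex
    subproblem (below-threshold coordinates keep the strong weight)."""
    return np.where(np.abs(w) < theta, lam, leak * lam)
```

With `theta=0.5`, a small weight of 0.2 pays 0.2 in penalty, while a large weight of 1.0 pays only 0.55 instead of 1.0, matching the intended "strong sparsity, controllable bias" behavior.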


research
06/21/2018

Is the 1-norm the best convex sparse regularization?

The 1-norm is a good convex regularization for the recovery of sparse ve...
research
12/29/2018

Monocular 3D Pose Recovery via Nonconvex Sparsity with Theoretical Analysis

For recovering 3D object poses from 2D images, a prevalent method is to ...
research
07/30/2020

A projected gradient method for αℓ_1-βℓ_2 sparsity regularization

The non-convex αℓ_1 − βℓ_2 (α ≥ β ≥ 0) regularization has attracted attent...
research
04/04/2015

Convex Denoising using Non-Convex Tight Frame Regularization

This paper considers the problem of signal denoising using a sparse tigh...
research
07/06/2020

A Novel Regularization Based on the Error Function for Sparse Recovery

Regularization plays an important role in solving ill-posed problems by ...
research
06/17/2020

Implicit regularization for convex regularizers

We study implicit regularization for over-parameterized linear models, w...
research
08/11/2022

Super-Universal Regularized Newton Method

We analyze the performance of a variant of Newton method with quadratic ...
