Sparse Factorization Layers for Neural Networks with Limited Supervision

12/14/2016
by Parker Koch, et al.

Whereas CNNs have demonstrated immense progress on many vision problems, they depend on monumental amounts of labeled training data. Dictionary learning, on the other hand, is very effective at low-level vision tasks such as denoising and inpainting, but does not scale to the size of problems that CNNs can handle. Recently, interest has grown in adapting dictionary learning methods to supervised tasks such as classification and inverse problems. We propose two new network layers based on dictionary learning: a sparse factorization layer and a convolutional sparse factorization layer, analogous to fully-connected and convolutional layers, respectively. Using our derivations, these layers can be dropped into existing CNNs, trained together end-to-end with back-propagation, and leverage semi-supervision in ways classical CNNs cannot. We experimentally compare networks containing these two new layers against a baseline CNN. Our results demonstrate that networks with either of the sparse factorization layers outperform classical CNNs when supervised data are scarce. They also show performance improvements on certain tasks compared to a CNN with the exact same number of parameters but no sparse factorization layers.
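The abstract does not spell out the layer's exact derivation, so the following is only a minimal sketch of the general idea: a layer that holds a learnable dictionary and produces sparse codes for its input by unrolling a few ISTA iterations, keeping the whole operation differentiable so it can be trained end-to-end with back-propagation alongside ordinary CNN layers. The class name, argument names, and the ISTA-unrolling choice are assumptions for illustration, not the paper's method.

```python
# Hedged sketch of a fully-connected "sparse factorization" layer via unrolled ISTA.
# All names (SparseFactorizationLayer, num_atoms, num_iters, sparsity) are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseFactorizationLayer(nn.Module):
    """Learnable dictionary + a few unrolled ISTA steps (differentiable sparse coding)."""

    def __init__(self, in_features, num_atoms, num_iters=5, sparsity=0.1):
        super().__init__()
        # Learnable dictionary: each column is an atom.
        self.dictionary = nn.Parameter(torch.randn(in_features, num_atoms) * 0.01)
        self.num_iters = num_iters
        self.sparsity = sparsity  # soft-threshold level

    def forward(self, x):
        # x: (batch, in_features) -> sparse codes z: (batch, num_atoms)
        D = self.dictionary
        # Step size from a rough Lipschitz estimate of D^T D (spectral norm squared).
        step = 1.0 / (torch.linalg.norm(D, ord=2) ** 2 + 1e-8)

        z = torch.zeros(x.shape[0], D.shape[1], device=x.device)
        for _ in range(self.num_iters):
            residual = x - z @ D.t()                   # reconstruction error
            z = z + step * (residual @ D)              # gradient step on ||x - D z||^2
            z = F.softshrink(z, lambd=self.sparsity)   # proximal soft-threshold step
        return z


# Usage sketch: the layer can sit in place of a fully-connected layer in a classifier head.
layer = SparseFactorizationLayer(in_features=256, num_atoms=128)
features = torch.randn(32, 256)
codes = layer(features)   # sparse representation, shape (32, 128)
print(codes.shape)
```

A convolutional variant would, by analogy, replace the dense dictionary with convolutional filters and compute the codes per spatial location; the same unrolling idea would apply.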


