Neurogenesis-Inspired Dictionary Learning: Online Model Adaption in a Changing World

by Sahil Garg, et al.

In this paper, we focus on online representation learning in non-stationary environments, which may require continuous adaptation of the model architecture. We propose a novel online dictionary-learning (sparse-coding) framework that incorporates the addition and deletion of hidden units (dictionary elements), inspired by the adult neurogenesis phenomenon in the dentate gyrus of the hippocampus, known to be associated with improved cognitive function and adaptation to new environments. In the online learning setting, where new input instances arrive sequentially in batches, neuronal birth is implemented by adding new units with random initial weights (random dictionary elements); the number of new units is determined by the current performance (representation error) of the dictionary, with higher error causing a higher birth rate. Neuronal death is implemented by imposing l1/l2-regularization (group sparsity) on the dictionary within the block-coordinate descent optimization at each iteration of our online alternating minimization scheme, which alternates between code and dictionary updates. Finally, adaptation of hidden-unit connectivity is facilitated by introducing sparsity in the dictionary elements. Our empirical evaluation on several real-life datasets (images and language) as well as on synthetic data demonstrates that the proposed approach can considerably outperform the state-of-the-art fixed-size (non-adaptive) online sparse coding of Mairal et al. (2009) in the presence of non-stationary data. Moreover, we identify certain properties of the data (e.g., sparse inputs with nearly non-overlapping supports) and of the model (e.g., dictionary sparsity) associated with such improvements.
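To make the abstract's mechanics concrete, here is a minimal sketch of one online step combining the three ingredients described above: error-driven birth of random atoms, a code/dictionary alternating update, and an l1/l2 (group-sparsity) proximal step that prunes dead atoms. This is an illustrative simplification, not the authors' implementation; all function names, thresholds, and hyperparameters (`err_threshold`, `birth`, `group_lam`, etc.) are assumptions for demonstration.

```python
import numpy as np

def sparse_code(D, x, lam=0.1, n_iter=50):
    """ISTA for min_a 0.5*||x - D a||^2 + lam*||a||_1 (the code update)."""
    a = np.zeros(D.shape[1])
    L = np.linalg.norm(D, 2) ** 2 + 1e-8  # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        a = a - grad / L
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft-threshold
    return a

def neurogenic_dict_step(D, batch, err_threshold=0.5, birth=5,
                         lam=0.1, group_lam=0.05, lr=0.1):
    """One online step: neuronal birth on high representation error,
    code update, dictionary gradient step, then an l1/l2 proximal step
    (neuronal death) that removes atoms driven to zero."""
    # --- neuronal birth: add random unit-norm atoms if batch error is high
    errs = [np.linalg.norm(x - D @ sparse_code(D, x, lam)) for x in batch]
    if np.mean(errs) > err_threshold:
        new = np.random.randn(D.shape[0], birth)
        new /= np.linalg.norm(new, axis=0, keepdims=True)
        D = np.hstack([D, new])
    # --- code update with the (possibly grown) dictionary
    A = np.stack([sparse_code(D, x, lam) for x in batch], axis=1)
    X = np.stack(batch, axis=1)
    # --- dictionary update: gradient step on the reconstruction error
    D = D - lr * (D @ A - X) @ A.T / len(batch)
    # --- neuronal death: group-sparsity prox shrinks whole atoms;
    #     atoms whose norm reaches zero are deleted from the dictionary
    norms = np.linalg.norm(D, axis=0)
    scale = np.maximum(1.0 - group_lam / (norms + 1e-12), 0.0)
    D = D * scale
    keep = np.linalg.norm(D, axis=0) > 1e-8
    return D[:, keep]
```

In an online run, `neurogenic_dict_step` would be called once per arriving batch, so the dictionary width grows when the environment shifts (error spikes) and shrinks again as redundant atoms are zeroed out by the group-sparsity prox.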

