Metalearning: Sparse Variable-Structure Automata

by Pedram Fekri, et al.

The dimension of the encoder output (i.e., the code layer) in an autoencoder is a key hyperparameter for representing the input data in a proper space. This dimension must be carefully selected to guarantee the desired reconstruction accuracy. Although an overcomplete representation can address this issue, the computational complexity grows with the dimension. Inspired by non-parametric methods, we propose a metalearning approach that increases the number of basis vectors used in dynamic sparse coding on the fly. An actor-critic algorithm is deployed to automatically choose an appropriate dimension for the feature vectors given the required level of accuracy. The proposed method combines online dictionary learning with the fast iterative shrinkage-thresholding algorithm (FISTA) as the optimizer in the inference phase. It aims to choose the minimum number of bases for the overcomplete representation that satisfies a reconstruction error threshold. This allows both the representation dimension and the reconstruction error to be controlled online within a dynamic framework.
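To make the inference step concrete, the sketch below implements standard FISTA for the sparse-coding objective min_z ½‖x − Dz‖² + λ‖z‖₁, followed by a toy loop that appends basis vectors to the dictionary whenever the reconstruction error exceeds a threshold. This is a minimal illustration of the growing-dictionary idea, not the paper's method: the actor-critic controller is replaced here by a simple threshold rule, and the function names (`fista`, `encode_adaptive`), the λ value, and the random-atom initialization are all assumptions for the example.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of the l1 norm: shrink each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(D, x, lam=0.1, n_iter=200):
    """Solve min_z 0.5*||x - D z||^2 + lam*||z||_1 with FISTA."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    y, t = z.copy(), 1.0
    for _ in range(n_iter):
        grad = D.T @ (D @ y - x)               # gradient of the smooth term at y
        z_next = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = z_next + ((t - 1.0) / t_next) * (z_next - z)  # momentum step
        z, t = z_next, t_next
    return z

def encode_adaptive(D, x, err_threshold=0.05, max_extra=16, seed=None):
    # Toy stand-in for the dimension controller: grow the dictionary
    # one random unit-norm atom at a time until the relative
    # reconstruction error drops below the threshold.
    rng = np.random.default_rng(seed)
    for _ in range(max_extra + 1):
        z = fista(D, x, lam=0.01)
        err = np.linalg.norm(x - D @ z) / np.linalg.norm(x)
        if err <= err_threshold:
            break
        atom = rng.standard_normal((D.shape[0], 1))
        D = np.hstack([D, atom / np.linalg.norm(atom)])  # add one basis vector
    return D, z, err
```

In the paper's framework, the decision of when (and by how much) to enlarge the dictionary is made by the actor-critic agent rather than this fixed threshold rule, and the new atoms are fit by online dictionary learning rather than drawn at random.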






