Metalearning: Sparse Variable-Structure Automata

01/30/2021
by Pedram Fekri, et al.

The dimension of the encoder output (i.e., the code layer) in an autoencoder is a key hyperparameter for representing the input data in a suitable space, and it must be chosen carefully to guarantee the desired reconstruction accuracy. Although an overcomplete representation can address this dimension issue, the computational complexity grows with the dimension. Inspired by non-parametric methods, we propose a metalearning approach that increases the number of basis vectors used in dynamic sparse coding on the fly. An actor-critic algorithm automatically chooses a dimension for the feature vectors appropriate to the required level of accuracy. The proposed method combines online dictionary learning with the fast iterative shrinkage-thresholding algorithm (FISTA) as the optimizer in the inference phase, and aims to choose the minimum number of bases for the overcomplete representation given a reconstruction-error threshold. This allows both the representation dimension and the reconstruction error to be controlled online in a dynamic framework.
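As a rough illustration of the inference step mentioned above, the sketch below runs plain FISTA on the standard sparse-coding objective min_a ½‖x − Da‖² + λ‖a‖₁ for a fixed dictionary D. This is a generic textbook implementation, not the paper's method; the function names, the fixed iteration count, and the choice of the Lipschitz constant are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(D, x, lam, n_iter=200):
    """Sparse code x over dictionary D: min_a 0.5*||x - D a||^2 + lam*||a||_1.

    Generic FISTA sketch (Beck & Teboulle style), not the paper's
    variable-structure variant, which grows the dictionary on the fly.
    """
    # Step size 1/L, with L an upper bound on the gradient's Lipschitz constant.
    L = np.linalg.norm(D, 2) ** 2
    a = np.zeros(D.shape[1])
    y = a.copy()          # momentum (extrapolated) point
    t = 1.0
    for _ in range(n_iter):
        grad = D.T @ (D @ y - x)                     # gradient of the smooth term at y
        a_next = soft_threshold(y - grad / L, lam / L)  # proximal gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = a_next + ((t - 1.0) / t_next) * (a_next - a)  # Nesterov extrapolation
        a, t = a_next, t_next
    return a
```

In the paper's setting, a loop like this would sit inside an outer process that monitors the reconstruction error ‖x − Da‖ and lets the actor-critic controller add basis vectors (columns of D) whenever the error exceeds the chosen threshold.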

