
Incremental ELMVIS for unsupervised learning

by Anton Akusok, et al.

This paper proposes an incremental version of the ELMVIS+ method. It iteratively selects a few best-fitting data samples from a large pool and adds them to the model. The method retains the high speed of ELMVIS+ while supporting much larger sample pools, thanks to lower memory requirements. The extension helps greedy ELMVIS optimization reach a better local optimum, and the data structure can be specified in semi-supervised optimization. The major new application of incremental ELMVIS is not visualization but general dataset processing: the method can learn dependencies from non-organized unsupervised data, either reconstructing a shuffled dataset or learning dependencies in a complex high-dimensional space. The results are interesting and promising, although there is room for improvement.
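The incremental selection loop described above can be illustrated with a toy sketch. This is not the authors' implementation: it assumes a fixed assignment between visualization coordinates and data samples (whereas ELMVIS optimizes that assignment), a basic ELM with a random tanh hidden layer, and hypothetical sizes for the pool and batch. At each step, the pool samples with the lowest reconstruction error under the current model are absorbed and the output weights are refit.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, Y, W, b):
    """Solve the ELM output weights by least squares, with a fixed
    random hidden layer (weights W, biases b)."""
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy setup (hypothetical sizes): visualization coordinates V are
# mapped to data samples drawn from a noisy linear relation.
n, d_vis, d_data, n_hidden = 60, 2, 5, 20
V = rng.normal(size=(n, d_vis))                  # visualization points
true_map = rng.normal(size=(d_vis, d_data))
pool = V @ true_map + 0.05 * rng.normal(size=(n, d_data))  # sample pool

W = rng.normal(size=(d_vis, n_hidden))           # random hidden weights
b = rng.normal(size=n_hidden)

# Incremental greedy selection: seed the model with a few samples, then
# repeatedly add the k pool samples the current model fits best.
seed, k = 10, 5
sel = list(range(seed))                          # indices in the model
rest = list(range(seed, n))                      # remaining pool
beta = elm_fit(V[sel], pool[sel], W, b)

while rest:
    err = np.linalg.norm(
        elm_predict(V[rest], W, b, beta) - pool[rest], axis=1)
    best = set(np.argsort(err)[:k])              # best-fitting candidates
    sel += [rest[i] for i in best]
    rest = [r for i, r in enumerate(rest) if i not in best]
    beta = elm_fit(V[sel], pool[sel], W, b)      # refit on enlarged set

print(len(sel))  # entire pool eventually absorbed
```

Because only the selected subset is held in the model at any time, memory use is bounded by the current selection rather than the full pool, which mirrors the paper's claim of lower memory requirements.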



