
Incremental ELMVIS for unsupervised learning

12/18/2019
by Anton Akusok, et al.

This paper proposes an incremental version of the ELMVIS+ method. It iteratively selects the best-fitting data samples from a large pool and adds them to the model. The method retains the high speed of ELMVIS+ while allowing much larger sample pools thanks to lower memory requirements. The extension is useful for reaching a better local optimum with the greedy optimization of ELMVIS, and the data structure can be specified in semi-supervised optimization. The major new application of incremental ELMVIS is not visualization but general dataset processing. The method is capable of learning dependencies from non-organized, unsupervised data: it can reconstruct a shuffled dataset or learn dependencies in a complex high-dimensional space. The results are interesting and promising, although there is room for improvement.
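
The abstract describes a greedy loop: starting from a small set of assigned samples, the method repeatedly scans the remaining pool and adds the sample that best fits the current ELM model. The sketch below illustrates that idea only; it is not the authors' implementation. The random-feature hidden layer, the toy 2-D visualization coordinates, and the squared reconstruction error criterion are illustrative assumptions, and the fast incremental solution updates that give ELMVIS+ its speed are omitted.

```python
# Minimal sketch of greedy incremental sample selection in the spirit of
# incremental ELMVIS (illustrative assumptions, not the authors' code).
import numpy as np

rng = np.random.default_rng(0)

def elm_hidden(V, W, b):
    """Random-feature hidden layer: tanh(V W + b)."""
    return np.tanh(V @ W + b)

def fit_output(H, X):
    """Least-squares output weights B so that H B approximates X."""
    return np.linalg.pinv(H) @ X

# Toy setup: a large pool of candidate samples and fixed visualization points.
d, n_vis, n_pool, L = 5, 30, 200, 20
pool = rng.normal(size=(n_pool, d))        # pool of unassigned data samples
V = rng.uniform(-1, 1, size=(n_vis, 2))    # fixed 2-D visualization coordinates
W = rng.normal(size=(2, L))
b = rng.normal(size=L)
H = elm_hidden(V, W, b)                    # hidden outputs for all vis points

assigned = []                              # pool indices assigned so far
available = list(range(n_pool))
assigned.append(available.pop(0))          # seed with an arbitrary sample

# Greedily add, one visualization point at a time, the pool sample whose
# inclusion gives the lowest ELM reconstruction error.
for t in range(1, n_vis):
    Ht = H[: t + 1]
    best_err, best_j = np.inf, None
    for j in available:
        Xt = pool[assigned + [j]]          # data matrix if sample j is added
        B = fit_output(Ht, Xt)
        err = np.linalg.norm(Ht @ B - Xt) ** 2
        if err < best_err:
            best_err, best_j = err, j
    assigned.append(best_j)
    available.remove(best_j)

print("reconstruction error after greedy filling:", best_err)
```

In this toy version every candidate is scored by refitting the output weights from scratch, which is the part the actual ELMVIS+ machinery replaces with cheap incremental updates; the memory advantage mentioned in the abstract comes from never forming the model over the full pool at once.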


Related research:

Semi-supervised Embedding Learning for High-dimensional Bayesian Optimization (05/29/2020)
Bayesian optimization is a broadly applied methodology to optimize the e...

Point Location in Incremental Planar Subdivisions (09/27/2018)
We study the point location problem in incremental (possibly disconnecte...

Incremental Semantic Mapping with Unsupervised On-line Learning (07/09/2019)
This paper introduces an incremental semantic mapping approach, with on-...

Generation and frame characteristics of predefined evenly-distributed class centroids for pattern classification (05/02/2021)
Predefined evenly-distributed class centroids (PEDCC) can be widely used...

Speech Augmentation Based Unsupervised Learning for Keyword Spotting (05/28/2022)
In this paper, we investigated a speech augmentation based unsupervised ...

Self Organizing Nebulous Growths for Robust and Incremental Data Visualization (12/09/2019)
Non-parametric dimensionality reduction techniques, such as t-SNE and UM...