On the use of self-organizing maps to accelerate vector quantization

01/04/2007
by Eric De Bodt, et al.

Self-organizing maps (SOM) are widely used for their topology preservation property: neighboring input vectors are quantized (or classified) either at the same location or at neighboring locations on a predefined grid. SOM are also widely used for their more classical vector quantization property. We show in this paper that using SOM instead of the more classical Simple Competitive Learning (SCL) algorithm drastically increases the speed of convergence of the vector quantization process. This fact is demonstrated through extensive simulations on artificial and real examples, with specific SOM (fixed and decreasing neighborhoods) and SCL algorithms.

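To make the comparison concrete, below is a minimal sketch of the two update rules the abstract contrasts: SCL, where only the best-matching unit is moved toward each input, and a SOM used purely as a vector quantizer, where the winner and its grid neighbors are all moved. The 1-D grid, the linearly decaying learning rate, the Gaussian neighborhood with shrinking width, and the test data are illustrative assumptions, not the exact settings used in the paper.

    import numpy as np

    def scl(data, n_units, n_steps, lr0=0.5, rng=None):
        """Simple Competitive Learning: only the winning unit moves."""
        rng = np.random.default_rng(rng)
        codebook = data[rng.choice(len(data), n_units, replace=False)].copy()
        for t in range(n_steps):
            x = data[rng.integers(len(data))]
            lr = lr0 * (1.0 - t / n_steps)                 # decaying step size (assumed schedule)
            winner = np.argmin(((codebook - x) ** 2).sum(axis=1))
            codebook[winner] += lr * (x - codebook[winner])
        return codebook

    def som_1d(data, n_units, n_steps, lr0=0.5, sigma0=None, rng=None):
        """SOM on a 1-D grid: the winner and its grid neighbors move."""
        rng = np.random.default_rng(rng)
        codebook = data[rng.choice(len(data), n_units, replace=False)].copy()
        grid = np.arange(n_units)
        sigma0 = sigma0 if sigma0 is not None else n_units / 2.0
        for t in range(n_steps):
            x = data[rng.integers(len(data))]
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = max(sigma0 * (1.0 - frac), 0.5)        # decreasing neighborhood width
            winner = np.argmin(((codebook - x) ** 2).sum(axis=1))
            h = np.exp(-((grid - winner) ** 2) / (2.0 * sigma ** 2))
            codebook += lr * h[:, None] * (x - codebook)   # neighborhood-weighted update
        return codebook

    def distortion(data, codebook):
        """Mean squared quantization error of a codebook on the data."""
        d = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        return d.min(axis=1).mean()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        data = rng.normal(size=(5000, 2))                  # synthetic example, not the paper's datasets
        for name, fn in (("SCL", scl), ("SOM", som_1d)):
            cb = fn(data, n_units=16, n_steps=2000, rng=0)
            print(f"{name}: distortion = {distortion(data, cb):.4f}")

Comparing the distortion of both codebooks after the same number of training steps is one simple way to observe the kind of convergence-speed difference the paper studies; with the neighborhood width driven to zero, the SOM update reduces to SCL.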