Attention-Based Clustering: Learning a Kernel from Context

10/02/2020
by Samuel Coward, et al.

In machine learning, no data point stands alone. We believe that context is an underappreciated concept in many machine learning methods. We propose Attention-Based Clustering (ABC), a neural architecture based on the attention mechanism, designed to learn latent representations that adapt to context within an input set and inherently agnostic to the input size and the number of clusters. By learning a similarity kernel, our method combines directly with any out-of-the-box kernel-based clustering approach. We present competitive results for clustering Omniglot characters and include analytical evidence of the effectiveness of an attention-based approach to clustering.
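The core idea in the abstract, a similarity kernel computed from context-aware embeddings that can then be handed to any kernel-based clustering method, can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the authors' actual architecture: the weight shapes, the single attention head, the sigmoid similarity, and the simple spectral-partition step are all hypothetical choices made for brevity, and the weights here are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """One attention head over the whole input set: each point's embedding
    depends on the other points, i.e. on its context, not just its own
    features. Works for any number of input points."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[1]), axis=-1)
    return A @ V

def similarity_kernel(X, Wq, Wk, Wv):
    """Pairwise similarity matrix with entries in (0, 1): sigmoid of the
    dot products between unit-normalised contextual embeddings."""
    H = self_attention(X, Wq, Wk, Wv)
    H = H / np.linalg.norm(H, axis=1, keepdims=True)
    return 1.0 / (1.0 + np.exp(-(H @ H.T)))

def spectral_clusters(K):
    """Stand-in for an out-of-the-box kernel clustering step: a two-way
    spectral partition from the normalised Laplacian of K."""
    D = np.diag(1.0 / np.sqrt(K.sum(axis=1)))
    L = np.eye(len(K)) - D @ K @ D
    _, vecs = np.linalg.eigh(L)
    # The sign of the second-smallest eigenvector splits the graph in two.
    return (vecs[:, 1] > 0).astype(int)

# Toy input set: two well-separated 2-D blobs, five points each.
X = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
d = 8  # assumed embedding width
Wq, Wk, Wv = (rng.normal(0, 0.5, (2, d)) for _ in range(3))

K = similarity_kernel(X, Wq, Wk, Wv)  # symmetric (10, 10) kernel
labels = spectral_clusters(K)         # one binary label per input point
```

In the paper's setting the attention weights would be trained so that the kernel reflects true co-cluster membership; with the random weights above, the pipeline only demonstrates the data flow. Because the kernel is built from pairwise interactions, nothing in the code fixes the set size or the number of clusters in advance, which is the size-agnostic property the abstract highlights.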


