
Message Passing Adaptive Resonance Theory for Online Active Semi-supervised Learning

by   Taehyeong Kim, et al.

Active learning is widely used to reduce labeling effort and training time by repeatedly querying only the most beneficial samples from unlabeled data. In real-world problems where data cannot be stored indefinitely due to limited storage or privacy issues, query selection and model updates must be performed as soon as a new data sample is observed. Various online active learning methods have been studied to deal with these challenges; however, they struggle to select representative query samples and update the model efficiently. In this study, we propose Message Passing Adaptive Resonance Theory (MPART) for online active semi-supervised learning. The proposed model learns the distribution and topology of the input data online. It then infers the class of unlabeled data and selects informative and representative samples through message passing between nodes on the topological graph. MPART queries beneficial samples on-the-fly in stream-based selective sampling scenarios, and continuously improves the classification model using both labeled and unlabeled data. We evaluate our model on visual (MNIST, SVHN, CIFAR-10) and audio (NSynth) datasets with comparable query selection strategies and frequencies, showing that MPART significantly outperforms competitive models in online active learning environments.
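The abstract's core idea of inferring classes for unlabeled nodes by passing messages over a topological graph can be illustrated with a generic label-propagation sketch. The graph construction, weights, and update rule below are illustrative assumptions for a plain propagation scheme, not MPART's actual algorithm; the function name and parameters are hypothetical.

```python
import numpy as np

def propagate_labels(adj, labels, n_classes, n_iters=20, alpha=0.9):
    """Infer soft class distributions for unlabeled nodes by message passing.

    adj       : (n, n) symmetric affinity matrix over graph nodes
    labels    : length-n int array; class index for labeled nodes, -1 if unlabeled
    n_classes : number of classes
    alpha     : weight on neighbor messages vs. the node's own seed distribution
    returns   : (n, n_classes) array of per-node class distributions
    """
    n = len(labels)
    # Seed distributions: one-hot for labeled nodes, uniform for unlabeled.
    y = np.full((n, n_classes), 1.0 / n_classes)
    labeled = labels >= 0
    y[labeled] = np.eye(n_classes)[labels[labeled]]

    # Row-normalize so each node averages its neighbors' messages.
    w = adj / np.maximum(adj.sum(axis=1, keepdims=True), 1e-12)

    f = y.copy()
    for _ in range(n_iters):
        f = alpha * (w @ f) + (1 - alpha) * y  # mix incoming messages with seeds
        f[labeled] = y[labeled]                # clamp labeled nodes to their labels
    return f

# Tiny example: a 4-node chain 0-1-2-3 with node 0 labeled class 0
# and node 3 labeled class 1; the interior nodes lean toward their
# nearer labeled endpoint.
adj = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    adj[i, j] = adj[j, i] = 1.0
f = propagate_labels(adj, np.array([0, -1, -1, 1]), n_classes=2)
```

In MPART the nodes of such a graph summarize the learned input topology, and the resulting per-node distributions could also drive query selection (e.g., querying nodes with high-entropy distributions), though the paper's specific scoring rule is not reproduced here.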

