Sequential Maximum Margin Classifiers for Partially Labeled Data

03/07/2018
by Elizabeth Hou, et al.

In many real-world applications, data is not collected as one batch but arrives sequentially over time, and it is often impossible or undesirable to wait until the data is completely gathered before analyzing it. We therefore propose a framework for sequentially updating a maximum margin classifier by taking advantage of the Maximum Entropy Discrimination principle. Our maximum margin classifier admits a kernel representation to handle large numbers of features, and it can also be regularized with respect to a smooth sub-manifold, allowing it to incorporate unlabeled observations. We compare the performance of our classifier to its non-sequential equivalents on both simulated and real datasets.


Related research

- 10/21/2016 — Robust training on approximated minimal-entropy set
  In this paper, we propose a general framework to learn a robust large-ma...
- 02/16/2017 — Latent Laplacian Maximum Entropy Discrimination for Detection of High-Utility Anomalies
  Data-driven anomaly detection methods suffer from the drawback of detect...
- 07/26/2021 — Transductive Maximum Margin Classifier for Few-Shot Learning
  Few-shot learning aims to train a classifier that can generalize well wh...
- 03/29/2012 — Corrected Kriging update formulae for batch-sequential data assimilation
  Recently, a lot of effort has been paid to the efficient computation of ...
- 12/07/2022 — Tight bounds for maximum ℓ_1-margin classifiers
  Popular iterative algorithms such as boosting methods and coordinate des...
- 07/29/2014 — Estimating the Accuracies of Multiple Classifiers Without Labeled Data
  In various situations one is given only the predictions of multiple clas...
- 01/20/2017 — Stability Enhanced Large-Margin Classifier Selection
  Stability is an important aspect of a classification procedure because u...
