Covariance-engaged Classification of Sets via Linear Programming

06/26/2020
by   Zhao Ren, et al.

Set classification aims to classify a set of observations as a whole, as opposed to classifying individual observations separately. To formally understand the unfamiliar concept of binary set classification, we first investigate the optimal decision rule under the normal distribution, which utilizes the empirical covariance of the set to be classified. We show that the number of observations in the set plays a critical role in bounding the Bayes risk. Under this framework, we further propose new methods of set classification. For the case where only a few parameters of the model drive the difference between the two classes, we propose a computationally efficient approach to parameter estimation using linear programming, leading to the Covariance-engaged LInear Programming Set (CLIPS) classifier. Its theoretical properties are investigated for both the independent case and various (short-range and long-range dependent) time series structures among observations within each set. The convergence rates of the estimation errors and the risk of the CLIPS classifier are established to show that having multiple observations in a set leads to faster convergence rates, compared to the standard classification situation in which there is only one observation in the set. The applicable domains in which CLIPS performs better than competitors are highlighted in a comprehensive simulation study. Finally, we illustrate the usefulness of the proposed methods in the classification of real image data in histopathology.
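To give a concrete sense of how a set's empirical covariance can enter a classification rule, the sketch below applies a Gaussian likelihood-ratio (QDA-style) decision to an entire set, assuming known class means and covariances. This is only a minimal illustration of the covariance-engaged idea described in the abstract; the function names and the known-parameter assumption are ours, and it is not the CLIPS estimator itself, which estimates sparse parameter differences via linear programming.

```python
import numpy as np

def set_log_likelihood(X, mu, Sigma):
    """Gaussian log-likelihood of a whole set X (n x d) under N(mu, Sigma),
    expressed through the set's empirical mean and covariance."""
    n, d = X.shape
    xbar = X.mean(axis=0)
    S = (X - xbar).T @ (X - xbar) / n            # empirical (MLE) covariance of the set
    Sigma_inv = np.linalg.inv(Sigma)
    _, logdet = np.linalg.slogdet(Sigma)
    # sum_i (x_i - mu)' Sigma^{-1} (x_i - mu) = n tr(Sigma^{-1} S) + n (xbar - mu)' Sigma^{-1} (xbar - mu)
    quad = n * np.trace(Sigma_inv @ S) + n * (xbar - mu) @ Sigma_inv @ (xbar - mu)
    return -0.5 * (n * d * np.log(2 * np.pi) + n * logdet + quad)

def classify_set(X, params0, params1, prior1=0.5):
    """Likelihood-ratio rule applied to the set as a whole: returns 0 or 1."""
    ll0 = set_log_likelihood(X, *params0) + np.log(1 - prior1)
    ll1 = set_log_likelihood(X, *params1) + np.log(prior1)
    return int(ll1 > ll0)

# Toy usage: two classes share the mean and differ only in covariance,
# so only the covariance term can separate them.
rng = np.random.default_rng(0)
mu = np.zeros(3)
Sigma0, Sigma1 = np.eye(3), np.diag([1.0, 2.0, 0.5])
X = rng.multivariate_normal(mu, Sigma1, size=20)   # a set of 20 observations
print(classify_set(X, (mu, Sigma0), (mu, Sigma1)))  # typically prints 1
```

Because the rule depends on the set only through its empirical mean and covariance, a larger set size sharpens both summaries, which is consistent with the faster convergence rates discussed in the abstract.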


