Wald-Kernel: Learning to Aggregate Information for Sequential Inference

08/31/2015
by Diyan Teng et al.

Sequential hypothesis testing is a desirable decision-making strategy in any time-sensitive scenario. Compared with fixed-sample-size testing, sequential testing can meet the same probability-of-error requirements using fewer samples on average. For a binary detection problem with known density functions, it is well known that accumulating the likelihood ratio statistic is time-optimal under a fixed error-rate constraint. This paper considers the problem of learning a binary sequential detector from training samples when the density functions are unavailable. We formulate the problem as constrained likelihood ratio estimation, which can be solved efficiently through convex optimization by imposing a Reproducing Kernel Hilbert Space (RKHS) structure on the log-likelihood ratio function. In addition, we provide a computationally efficient approximate solution for large-scale data sets. The proposed algorithm, Wald-Kernel, is tested on a synthetic data set and two real-world data sets, alongside previous approaches to likelihood ratio estimation. Our empirical results show that the classifier trained with the proposed technique achieves a smaller average sampling cost than previous approaches in the literature at the same error rate.
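For reference, here is a minimal Python sketch of Wald's sequential probability ratio test, the accumulation-of-likelihood-ratio rule that the abstract describes as time-optimal when the densities are known. The Gaussian example, the thresholds derived from the target error rates, and the function name `sprt` are illustrative assumptions rather than the paper's code; in the Wald-Kernel setting, the analytic log-likelihood ratio used below would be replaced by one learned from training samples in an RKHS.

```python
import numpy as np

def sprt(samples, log_lr, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test (SPRT).

    Accumulates the log-likelihood ratio of incoming samples and stops
    as soon as it crosses one of two thresholds set by the target error
    probabilities (alpha: false alarm, beta: miss). Returns a pair
    (decision, samples_used); decision is 1 for H1, 0 for H0, or None
    if the sample sequence ends before a threshold is crossed.
    """
    upper = np.log((1.0 - beta) / alpha)   # accept H1 above this
    lower = np.log(beta / (1.0 - alpha))   # accept H0 below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += log_lr(x)                   # aggregate evidence sample by sample
        if llr >= upper:
            return 1, n
        if llr <= lower:
            return 0, n
    return None, len(samples)

# Illustrative example with known Gaussians: H0 ~ N(0, 1) vs H1 ~ N(1, 1),
# whose analytic log-likelihood ratio at x is x - 0.5. A learned estimate of
# the log-likelihood ratio (e.g. an RKHS model) would be plugged in instead.
rng = np.random.default_rng(0)
decision, n_used = sprt(rng.normal(1.0, 1.0, size=1000),
                        log_lr=lambda x: x - 0.5)
print(decision, n_used)
```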

Related research

Kernel Two-Sample Hypothesis Testing Using Kernel Set Classification (06/18/2017)
The two-sample hypothesis testing problem is studied for the challenging...

Adaptive learning of density ratios in RKHS (07/30/2023)
Estimating the ratio of two probability densities from finitely many obs...

Universal Neyman-Pearson Classification with a Known Hypothesis (06/23/2022)
We propose a universal classifier for binary Neyman-Pearson classificati...

Deep Neural Networks for the Sequential Probability Ratio Test on Non-i.i.d. Data Series (06/10/2020)
Classifying sequential data as early as and as accurately as possible is...

An analytic formulation for positive-unlabeled learning via weighted integral probability metric (01/28/2019)
We consider the problem of learning a binary classifier from only positi...

Likelihood-free hypothesis testing (11/02/2022)
Consider the problem of testing Z ∼ ℙ^⊗m vs Z ∼ ℚ^⊗m from m samples. Gen...

Training Neural Networks for Likelihood/Density Ratio Estimation (11/01/2019)
Various problems in Engineering and Statistics require the computation o...
