Uncertainty quantification for multiclass data description

08/29/2021
by Leila Kalantari, et al.

In this manuscript, we propose a multiclass data description model based on kernel Mahalanobis distance (MDD-KM) with a self-adapting hyperparameter setting. MDD-KM provides uncertainty quantification and can be deployed to build classification systems for the realistic scenario in which out-of-distribution (OOD) samples are present among the test data. Given a test signal, a quantity related to the empirical kernel Mahalanobis distance between the signal and each of the training classes is computed. Since these quantities correspond to the same reproducing kernel Hilbert space, they are commensurable and can therefore be treated directly as classification scores without further application of fusion techniques. To set the kernel parameters, we exploit the fact that the predictive variance of a Gaussian process (GP) equals the empirical kernel Mahalanobis distance when a centralized kernel is used, and we propose to use the GP's negative likelihood function as the cost function. We conduct experiments on the real problem of avian note classification. We report a prototypical classification system based on a hierarchical linear dynamical system with MDD-KM as a component. Our classification system does not require sound event detection as a preprocessing step, and it is able to find instances of the training avian notes, with varying lengths, among OOD samples (corresponding to unknown notes that are not of interest) in the test audio clip. Domain knowledge is leveraged to make crisp decisions from the raw classification scores. We demonstrate the superior performance of MDD-KM over possibilistic K-nearest neighbor.
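To make the scoring idea concrete, below is a minimal Python sketch (not the authors' code) of the core mechanism: for each training class, the GP predictive variance of a test point, computed from a kernel over that class's samples, serves as a commensurable score, and a point that scores poorly against every class can be rejected as OOD. The kernel choice, the function names (rbf_kernel, class_score, classify) and the rejection threshold tau are illustrative assumptions; explicit kernel centering and the negative-likelihood hyperparameter fitting described in the abstract are omitted for brevity.

# Sketch: per-class GP predictive variance as a classification/OOD score.
# Smaller variance means the test point is better described by that class.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between row-sample arrays A (n,d) and B (m,d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def class_score(X_class, x_test, lengthscale=1.0, noise=1e-3):
    # GP predictive variance of x_test given one class's training samples:
    # k(x,x) - k_x^T (K + noise*I)^{-1} k_x, which plays the role of an
    # empirical kernel Mahalanobis distance to that class.
    K = rbf_kernel(X_class, X_class, lengthscale)
    k_star = rbf_kernel(X_class, x_test[None, :], lengthscale)[:, 0]
    k_ss = rbf_kernel(x_test[None, :], x_test[None, :], lengthscale)[0, 0]
    alpha = np.linalg.solve(K + noise * np.eye(len(X_class)), k_star)
    return k_ss - k_star @ alpha

def classify(classes, x_test, tau=0.9):
    # classes: dict mapping label -> (n_i, d) array of training samples.
    # Scores live in the same RKHS, so they can be compared directly;
    # tau is a hypothetical rejection threshold for flagging OOD points.
    scores = {c: class_score(X, x_test) for c, X in classes.items()}
    label, best = min(scores.items(), key=lambda kv: kv[1])
    return "OOD" if best > tau else label

In a full system, the lengthscale (and any other kernel hyperparameters) would be chosen by minimizing the GP's negative likelihood rather than fixed by hand as in this sketch.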

Related research

03/02/2021 - A Kernel Framework to Quantify a Model's Local Predictive Uncertainty under Data Distributional Shifts
Traditional Bayesian approaches for model uncertainty quantification rel...

06/07/2021 - BayesIMP: Uncertainty Quantification for Causal Data Fusion
While causal models are becoming one of the mainstays of machine learnin...

05/26/2023 - Vecchia Gaussian Process Ensembles on Internal Representations of Deep Neural Networks
For regression tasks, standard Gaussian processes (GPs) provide natural ...

02/27/2022 - Hierarchical Linear Dynamical System for Representing Notes from Recorded Audio
We seek to develop simultaneous segmentation and classification of notes...

02/12/2021 - Two-sample Test with Kernel Projected Wasserstein Distance
We develop a kernel projected Wasserstein distance for the two-sample te...

09/22/2021 - Quantifying Model Predictive Uncertainty with Perturbation Theory
We propose a framework for predictive uncertainty quantification of a ne...

12/09/2020 - KNN Classification with One-step Computation
KNN classification is a query triggered yet improvisational learning mod...
