Description of a Tracking Metric Inspired by KL-divergence

05/09/2018
by Terrence Adams et al.

A unified metric is given for the evaluation of tracking systems. The metric is inspired by KL-divergence, or relative entropy, which is commonly used to evaluate clustering techniques. Since tracking problems are fundamentally different from clustering, the components of KL-divergence are recast to handle various types of tracking errors (i.e., false alarms, missed detections, merges, and splits). Preliminary scoring results are given on a standard tracking benchmark, the Oxford Town Centre Dataset. In the final section, prospective advantages of the metric are listed, along with ideas for improving it. We end with a couple of open questions concerning tracking metrics.
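
As a rough illustration of the quantity the proposed metric builds on, the sketch below computes the standard KL-divergence (relative entropy) between two discrete distributions in Python. The distributions, the normalization step, and the example values are hypothetical; this is not the paper's recasting of KL-divergence for tracking errors, only the baseline quantity it starts from.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Standard KL divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize both inputs to valid probability distributions.
    p = p / p.sum()
    q = q / q.sum()
    # eps guards against log(0) when a bin is empty.
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Hypothetical example: the distribution of ground-truth identities versus
# the distribution induced by a tracker's output after matching tracks to
# ground-truth objects (e.g., on the Oxford Town Centre Dataset).
ground_truth = [0.25, 0.25, 0.25, 0.25]
tracker_output = [0.40, 0.30, 0.20, 0.10]
print(kl_divergence(ground_truth, tracker_output))  # 0.0 only for a perfect match
```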

