Jensen-Shannon Divergence as a Goodness-of-Fit Measure for Maximum Likelihood Estimation and Curve Fitting

09/28/2018
by   Mark Levene, et al.

The coefficient of determination, known as R^2, is commonly used as a goodness-of-fit criterion for fitting linear models. R^2 is somewhat controversial when fitting nonlinear models, although it may be generalised on a case-by-case basis to deal with specific models such as the logistic model. Assume we are fitting a parametric distribution to a data set using, say, the maximum likelihood estimation method. A general approach to measure the goodness-of-fit of the fitted parameters, which we advocate herein, is to use a nonparametric measure for model comparison between the raw data and the fitted model. In particular, for this purpose we put forward the Jensen-Shannon divergence (JSD) as a metric, which is bounded and has an intuitive information-theoretic interpretation. We demonstrate, via a straightforward procedure making use of the JSD, that it can be used as part of maximum likelihood estimation or curve fitting as a measure of goodness-of-fit, including the construction of a confidence interval for the fitted parametric distribution. We also propose that the JSD can be used more generally in nonparametric hypothesis testing for model selection.
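The procedure the abstract advocates can be sketched as follows: fit a parametric distribution by maximum likelihood, discretise both the raw data and the fitted model onto a common set of bins, and compute the JSD between the two resulting probability vectors. This is a minimal illustrative sketch, not the authors' exact procedure; the exponential model, sample size, and bin count are arbitrary choices, and it uses SciPy's `jensenshannon`, which returns the JS *distance* (the square root of the divergence).

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=5000)  # synthetic sample for illustration

# Maximum likelihood fit of an exponential distribution (location fixed at 0)
loc, scale = stats.expon.fit(data, floc=0)

# Discretise the data and the fitted model onto the same bins
counts, edges = np.histogram(data, bins=30)
p = counts / counts.sum()                        # empirical bin probabilities
q = np.diff(stats.expon.cdf(edges, loc=loc, scale=scale))
q = q / q.sum()                                  # fitted-model bin probabilities

# SciPy returns the JS distance; square it to get the divergence,
# which with base 2 is bounded in [0, 1]
jsd = jensenshannon(p, q, base=2) ** 2
print(f"JSD between data and fitted model: {jsd:.4f}")
```

A value near 0 indicates a close match between the empirical and fitted distributions; because the JSD is bounded, it is directly comparable across candidate models, which is what makes it usable as a nonparametric goodness-of-fit criterion. A confidence interval for the fit could then be obtained, for example, by bootstrapping this statistic over resampled data.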

