
Composite Goodness-of-fit Tests with Kernels

by   Oscar Key, et al.
Universidad Adolfo Ibáñez

Model misspecification can create significant challenges for the implementation of probabilistic models, and this has led to the development of a range of inference methods which directly account for this issue. However, whether these more involved methods are required depends on whether the model is really misspecified, and there is a lack of generally applicable methods to answer this question. One set of tools which can help are goodness-of-fit tests, where we test whether a dataset could have been generated by a fixed distribution. Kernel-based tests have been developed for this problem, and these are popular due to their flexibility, strong theoretical guarantees, and ease of implementation in a wide range of scenarios. In this paper, we extend this line of work to the more challenging composite goodness-of-fit problem, where we are instead interested in whether the data comes from any distribution in some parametric family. This is equivalent to testing whether a parametric model is well-specified for the data.
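To make the composite setting concrete, here is a minimal sketch of one common construction: fit the parametric family to the data, compute a kernel discrepancy (here the maximum mean discrepancy, MMD) between the data and samples from the fitted model, and calibrate the test with a parametric bootstrap that re-fits on each synthetic dataset. This is an illustrative sketch under simplifying assumptions (a Gaussian family, an RBF kernel with a fixed bandwidth, moment-based fitting), not the specific estimators or calibration scheme proposed in the paper.

```python
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    # Gaussian (RBF) kernel Gram matrix between two sets of 1-d samples.
    d = x[:, None] - y[None, :]
    return np.exp(-d**2 / (2 * bandwidth**2))

def mmd2(x, y, bandwidth=1.0):
    # Biased (V-statistic) estimate of the squared MMD between x and y;
    # it is a squared RKHS norm, hence nonnegative.
    return (rbf_kernel(x, x, bandwidth).mean()
            - 2 * rbf_kernel(x, y, bandwidth).mean()
            + rbf_kernel(y, y, bandwidth).mean())

def composite_gof_test(data, n_boot=200, n_model=500, seed=0):
    # Composite question: does the data come from *some* N(mu, sigma^2)?
    rng = np.random.default_rng(seed)
    # Step 1: fit the parametric family to the observed data.
    mu, sigma = data.mean(), data.std()
    # Step 2: test statistic = MMD between data and the fitted model.
    stat = mmd2(data, rng.normal(mu, sigma, n_model))
    # Step 3: parametric bootstrap. Re-fitting on each synthetic dataset
    # accounts for the estimated parameters -- the composite aspect that
    # a fixed-distribution goodness-of-fit test ignores.
    null_stats = []
    for _ in range(n_boot):
        boot = rng.normal(mu, sigma, len(data))
        b_mu, b_sigma = boot.mean(), boot.std()
        null_stats.append(mmd2(boot, rng.normal(b_mu, b_sigma, n_model)))
    p_value = np.mean(np.array(null_stats) >= stat)
    return stat, p_value
```

A small p-value indicates that no member of the family fits the data well; the bootstrap step is what distinguishes this from testing against a single fixed distribution.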
