Maximum information divergence from linear and toric models

08/29/2023
by Yulia Alexandr et al.

We study the problem of maximizing information divergence from a new perspective using logarithmic Voronoi polytopes. We show that for linear models, the maximum is always achieved at the boundary of the probability simplex. For toric models, we present an algorithm that combines the combinatorics of the chamber complex with numerical algebraic geometry. We pay special attention to reducible models and models of maximum likelihood degree one.
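The underlying optimization problem can be sketched as follows; the divergence is assumed here to be the Kullback–Leibler divergence (the standard choice for "information divergence"), with $\mathcal{M}$ a statistical model inside the probability simplex $\Delta_{n-1}$:

```latex
% Sketch of the maximization problem (assumes D is Kullback--Leibler divergence):
% find the distribution p farthest from the model \mathcal{M} \subseteq \Delta_{n-1}.
\max_{p \in \Delta_{n-1}} D(p \,\|\, \mathcal{M}),
\qquad\text{where}\qquad
D(p \,\|\, \mathcal{M}) \;=\; \inf_{q \in \mathcal{M}} \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}.
```

For linear models the abstract's claim is that this maximum is attained on the boundary of $\Delta_{n-1}$, i.e. at distributions with at least one zero coordinate.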


