# A Maximum Entropy Procedure to Solve Likelihood Equations

In this article we present initial findings on solving likelihood equations by means of a maximum entropy approach. Unlike standard procedures, which require setting the score function of the maximum-likelihood problem to zero, we propose an alternative strategy in which the score is instead used as an external informative constraint on the maximization of the concave Shannon entropy function. The problem involves re-parameterizing the score parameters as expected values of discrete probability distributions whose probabilities need to be estimated. This leads to a simpler situation in which parameters are searched for in a smaller (hyper-)simplex space. We assessed our proposal by means of empirical case studies and a simulation study, the latter involving the most critical case of logistic regression under data separation. The results suggest that the maximum entropy re-formulation of the score problem solves the likelihood equations. Likewise, when maximum-likelihood estimation is difficult, as in the case of logistic regression under separation, the maximum entropy proposal achieved results numerically comparable to those obtained by Firth's bias-corrected approach. Overall, these first findings indicate that a maximum entropy solution can be considered an alternative technique for solving the likelihood equations.
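As a minimal illustration of the general idea (a sketch, not the authors' actual procedure), the snippet below estimates a normal mean by maximizing Shannon entropy over a discrete support, with the likelihood score imposed as an equality constraint. The support grid `z` and the choice of SciPy's SLSQP solver are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(loc=1.5, scale=1.0, size=50)  # sample from N(1.5, 1)

# Discrete support for the parameter: the mean is re-parameterized as
# mu = sum_k p_k z_k, so the search happens over the simplex of p.
z = np.linspace(x.min(), x.max(), 21)

def neg_entropy(p):
    # Minimizing sum p*log(p) maximizes Shannon entropy H(p).
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

# Constraints: p lies on the simplex, and the normal-mean score
# sum_i (x_i - mu) vanishes at mu = z @ p.
cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    {"type": "eq", "fun": lambda p: np.sum(x - z @ p)},
]

p0 = np.full(z.size, 1.0 / z.size)  # start from the uniform distribution
res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=[(0.0, 1.0)] * z.size, constraints=cons)
mu_me = z @ res.x  # maximum entropy estimate of the mean
```

Because the score constraint forces `z @ p` to equal the sample mean, the constrained maximum entropy solution coincides with the maximum-likelihood estimate in this simple case, while the optimization runs over probabilities on the simplex rather than over the raw parameter.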
