
A Maximum Entropy Procedure to Solve Likelihood Equations
In this article we provide initial findings on the problem of solving likelihood equations by means of a maximum entropy approach. Unlike standard procedures, which require setting the score function of the maximum-likelihood problem to zero, we propose an alternative strategy in which the score is instead used as an external informative constraint in the maximization of the concave Shannon entropy function. The problem involves reparameterizing the score parameters as expected values of discrete probability distributions whose probabilities need to be estimated. This leads to a simpler situation in which parameters are searched in a smaller (hyper-)simplex space. We assessed our proposal by means of empirical case studies and a simulation study, the latter involving the most critical case of logistic regression under data separation. The results suggest that the maximum entropy reformulation of the score problem solves the likelihood equations. Similarly, when maximum-likelihood estimation is difficult, as in the case of logistic regression under separation, the maximum entropy proposal achieved results numerically comparable to those obtained by Firth's bias-corrected approach. Overall, these first findings reveal that a maximum entropy solution can be considered an alternative technique for solving the likelihood equations.
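The core idea can be illustrated with a minimal sketch (an illustration under assumptions, not the authors' exact algorithm): each logistic coefficient is rewritten as the expected value of a discrete probability distribution over a fixed support grid, and Shannon entropy is maximized while the score equations are enforced through a quadratic penalty. The toy data, support grid `z`, and penalty weight are all illustrative choices; the data are completely separated, so ordinary maximum likelihood would diverge.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data with complete separation: x < 0 -> y = 0, x > 0 -> y = 1.
x = np.array([-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0])
X = np.column_stack([np.ones_like(x), x])   # intercept + slope
y = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)

# Support grid for each coefficient (illustrative: bounded, symmetric).
z = np.linspace(-5.0, 5.0, 7)               # K support points
n_par, K = X.shape[1], z.size

def unpack(v):
    # Softmax keeps each coefficient's probability row on the simplex.
    w = v.reshape(n_par, K)
    p = np.exp(w - w.max(axis=1, keepdims=True))
    return p / p.sum(axis=1, keepdims=True)

def objective(v):
    p = unpack(v)
    beta = p @ z                             # coefficients as expected values
    mu = 1.0 / (1.0 + np.exp(-X @ beta))     # logistic mean
    score = X.T @ (y - mu)                   # score equations of the MLE problem
    neg_entropy = np.sum(p * np.log(p + 1e-12))
    # Maximize entropy while pushing the score toward zero (penalized form).
    return neg_entropy + 50.0 * np.sum(score ** 2)

res = minimize(objective, np.zeros(n_par * K), method="L-BFGS-B")
beta_hat = unpack(res.x) @ z
print("finite coefficients under separation:", beta_hat)
```

Because each coefficient is a convex combination of the support points, the estimates are bounded by construction, which is what prevents the divergence that separation causes in plain maximum likelihood.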