 # Study on estimators of the PDF and CDF of the one parameter polynomial exponential distribution

In this article, we consider the one parameter polynomial exponential (OPPE) distribution, of which the exponential, Lindley, length-biased Lindley and Sujatha distributions are particular cases. Two estimators, the maximum likelihood estimator (MLE) and the uniformly minimum variance unbiased estimator (UMVUE), of the PDF and the CDF of the OPPE distribution are discussed. The estimation problem for the length-biased Lindley and Sujatha distributions is treated in detail. The estimators are compared in the mean squared error (MSE) sense, and Monte Carlo simulations and real data analysis are performed to compare their performances.


## 1 Introduction

Statistical models are very useful in describing and predicting real-world phenomena. Recent developments focus on defining new families that extend well-known distributions while at the same time providing greater flexibility for modelling data in practice. Many well known lifetime distributions for modelling lifetime data, such as the exponential, gamma and Weibull, have been extensively studied.

Let $X$ be a random variable taking values in $(0,\infty)$; its distribution may be absolutely continuous or discrete. The probability density function (PDF) and the cumulative distribution function (CDF) of the Lindley distribution [see Lindley (1958)] are given by

$$f(x)=\frac{\theta^2}{1+\theta}(1+x)e^{-\theta x},\quad x,\theta>0 \tag{1.1}$$

and

$$F(x)=1-\frac{1+\theta+\theta x}{1+\theta}e^{-\theta x},\quad x,\theta>0, \tag{1.2}$$

respectively.

The exponential distribution is close in form to the Lindley distribution given in (1.1)-(1.2). The PDF and the CDF of the exponential distribution are given by

$$f(x) = \theta e^{-\theta x},\quad x,\theta>0 \tag{1.3}$$

and

$$F(x) = 1-e^{-\theta x},\quad x,\theta>0, \tag{1.4}$$

respectively.

Many mathematical properties of the Lindley distribution (e.g., the mode, moments, skewness and kurtosis measures, cumulants, failure rate and mean residual life, mean deviation, entropies, etc.) are more flexible than those of the exponential distribution. For the exponential distribution some of these properties are constant, which is usually not an appropriate assumption in reality, whereas the Lindley distribution allows them to vary [see Ghitany et al. (2008)].

The Lindley distribution is one way to describe the lifetime of a process or device. It can be used in a wide variety of fields, including biology, engineering and medicine. Ghitany et al. (2008) fitted this distribution to data on waiting times of bank customers. Ghitany et al. (2011) stated that it is especially useful for modeling in mortality studies. Mazucheli and Achcar (2011) discussed applications of the Lindley distribution to competing risk lifetime data. Mukherjee and Maiti (2014) used this distribution to construct an acceptance sampling plan by variable. Maiti et al. (2014) applied it in describing a new process capability index.

It has been generalized by a host of authors; to mention a few, Zakerzadeh and Dolati (2009), Bakouch et al. (2012), Shanker and Ghebretsadik (2013), Elbatal et al. (2013) and Ghitany et al. (2013), among others. Bouchahed and Zeghdoudi (2018) proposed a new and unified approach to generalizing the Lindley distribution. They investigated structural properties such as moments, skewness, kurtosis, median, mean deviations, the Lorenz curve, entropies and the limiting distribution of extreme order statistics; reliability properties such as the reliability function, hazard rate, stress-strength reliability and stochastic ordering; and estimation methods such as the method of moments and maximum likelihood. We call the distribution proposed by Bouchahed and Zeghdoudi (2018) the one parameter polynomial exponential (OPPE) distribution.

The PDF of a random variable $X$ following the OPPE distribution can be written as

$$f(x)=h(\theta)p(x)e^{-\theta x},\quad x,\theta>0, \tag{1.5}$$

where $p(x)=\sum_{k=0}^{r}a_k x^k$ and $h(\theta)=\left(\sum_{k=0}^{r}a_k\frac{\Gamma(k+1)}{\theta^{k+1}}\right)^{-1}$.

The distribution can also be written as

$$f(x) = h(\theta)\sum_{k=0}^{r}a_k x^k e^{-\theta x} = \frac{\sum_{k=0}^{r}a_k\frac{\Gamma(k+1)}{\theta^{k+1}}f_{GA}(x;k+1,\theta)}{\sum_{k=0}^{r}a_k\frac{\Gamma(k+1)}{\theta^{k+1}}}, \tag{1.6}$$

where $f_{GA}(x;k+1,\theta)$ is the PDF of a gamma distribution with shape parameter $k+1$ and rate parameter $\theta$, and the $a_k$'s are non-negative constants. The distribution is thus a finite mixture of gamma distributions.

The CDF is given by

$$F(x)=1-\frac{\sum_{k=0}^{r}a_k\frac{\Gamma(k+1,\theta x)}{\theta^{k+1}}}{\sum_{k=0}^{r}a_k\frac{k!}{\theta^{k+1}}},\quad x,\theta>0, \tag{1.7}$$

where $\Gamma(s,z)=\int_{z}^{\infty}u^{s-1}e^{-u}\,du$ is the upper incomplete gamma function.

Some special cases are as follows:

1. $r=0$, $a_0=1$ gives the exponential distribution,

2. $r=1$, $a_0=a_1=1$ gives the Lindley distribution,

3. $r=2$, $a_0=1$, $a_1=0$, $a_2=1$ gives the Akash distribution [c.f. Shanker (2015)],

4. $r=2$, $a_0=1$, $a_1=2$, $a_2=1$ gives the Aradhana distribution [c.f. Shanker (2016b)],

5. $r=2$, $a_0=a_1=a_2=1$ gives the Sujatha distribution [c.f. Shanker (2016e)],

6. $r=2$, $a_0=0$, $a_1=a_2=1$ gives the length-biased Lindley distribution [c.f. Ayesha (2017)],

7. $r=3$, $a_0=a_1=a_2=a_3=1$ gives the Amarendra distribution [c.f. Shanker (2016a)],

8. $r=4$, $a_0=\cdots=a_4=1$ gives the Devya distribution [c.f. Shanker (2016c)],

9. $r=5$, $a_0=\cdots=a_5=1$ gives the Shambhu distribution [c.f. Shanker (2016d)].
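Equation (1.6) says the OPPE density is a finite mixture of gamma densities, which gives a direct way to evaluate the PDF and the CDF from the coefficients $a_0,\ldots,a_r$. Below is a minimal sketch in stdlib Python (the function names are ours, not from the paper; the regularized lower incomplete gamma is computed by its standard power series), checked against the Lindley closed forms (1.1)-(1.2):

```python
import math

def oppe_weights(a, theta):
    """Mixture weights w_k proportional to a_k * Gamma(k+1) / theta^(k+1)."""
    raw = [ak * math.gamma(k + 1) / theta ** (k + 1) for k, ak in enumerate(a)]
    s = sum(raw)
    return [w / s for w in raw]

def gamma_pdf(x, shape, rate):
    return rate ** shape * x ** (shape - 1) * math.exp(-rate * x) / math.gamma(shape)

def gamma_cdf(x, shape, rate):
    # Regularized lower incomplete gamma P(shape, rate*x) via its power series.
    z = rate * x
    term = total = 1.0 / shape
    k = 1
    while abs(term) > 1e-15 * abs(total):
        term *= z / (shape + k)
        total += term
        k += 1
    return total * z ** shape * math.exp(-z) / math.gamma(shape)

def oppe_pdf(x, a, theta):
    w = oppe_weights(a, theta)
    return sum(wk * gamma_pdf(x, k + 1, theta) for k, wk in enumerate(w))

def oppe_cdf(x, a, theta):
    w = oppe_weights(a, theta)
    return sum(wk * gamma_cdf(x, k + 1, theta) for k, wk in enumerate(w))
```

With $a_0=a_1=1$ (the Lindley case) the mixture evaluation reproduces (1.1) and (1.2) exactly, which is a convenient sanity check on any implementation of (1.6)-(1.7).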

Statisticians are most often interested in inferring the parameter(s) of a distribution. Authors have focused on the maximum likelihood estimator (MLE) and the Bayes estimator of the parameter. Hardly any unbiased estimator of the parameter has been studied so far, and finding the uniformly minimum variance unbiased estimator (UMVUE) of the parameter seems to be intractable; consequently, no comparison with an unbiased class of estimators has been made. However, instead of studying estimators of the parameter(s), there is scope to find an unbiased estimator of the PDF and the CDF, as well as biased estimators of the same, and to compare these estimators. That is why we shift our focus from estimation of the parameter(s) to estimation of the PDF and the CDF.

The estimators for the PDF, CDF or both can be used to estimate various functions, like differential entropy, Rényi entropy, Kullback-Leibler divergence, Fisher information, reliability function, cumulative residual entropy, the quantile function, Bonferroni curve, Lorenz curve, probability weighted moments, hazard rate function, mean deviation about mean etc.

A few works are available on estimation of the PDF and the CDF of different probability distributions. References include: Asrabadi (1990), Dixit and Jabbari (2010) and Dixit and Jabbari (2011) for the Pareto distribution; Alizadeh et al. (2015) and Mukherjee et al. (2016) for the generalized exponential distribution; Bagheri et al. (2014) for the generalized exponential-Poisson distribution; Bagheri et al. (2016b) for the Weibull extension model; Bagheri et al. (2016a) for the exponentiated Gumbel distribution; Jabbari and Jabbari (2010) for the exponentiated Pareto distribution; Maiti and Mukherjee (2018) for the Lindley distribution; Tripathi et al. (2017b) for the generalized logistic distribution; Tripathi et al. (2017a) for the exponentiated moment exponential distribution; and Mukherjee and Maiti (2019) for the lognormal distribution.

The organization of the article is as follows. Section 2 deals with the MLE of the PDF and the CDF of the OPPE distribution. Section 3 is devoted to finding the UMVUE of the PDF and the CDF and their MSEs (in this case the variances). Particular cases, the length-biased Lindley and Sujatha distributions, are discussed in Section 4. In Section 5, simulation study results are reported and comparisons are made. Real-life data sets are analyzed in Section 6. In Section 7, concluding remarks are made based on the findings of this article.

## 2 MLE of the PDF and the CDF

Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ drawn from the PDF in (1.5). Here we try to find the MLE of $\theta$, denoted by $\hat{\theta}$. The log-likelihood of $\theta$ is given by

$$l(\theta) = \ln L(\theta|X) = n\ln h(\theta)+\sum_{i=1}^{n}\ln p(X_i)-\theta\sum_{i=1}^{n}X_i.$$

Now,

$$\frac{dl(\theta)}{d\theta} = 0 \;\;\text{i.e.}\;\; n\frac{d}{d\theta}\left(\ln h(\theta)\right)-\sum_{i=1}^{n}X_i = 0 \;\;\text{i.e.}\;\; \frac{\sum_{k=0}^{r}a_k\frac{\Gamma(k+2)}{\theta^{k+2}}}{\sum_{k=0}^{r}a_k\frac{\Gamma(k+1)}{\theta^{k+1}}}-\bar{X} = 0. \tag{2.8}$$

Since the MLE of $\theta$ is not available in closed form, we have to solve (2.8) numerically to obtain it. Theoretical expressions for the MSE of the MLEs are not available, so the MSE will be studied through simulation.
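The left-hand side of (2.8) is the population mean of the OPPE distribution, which decreases monotonically in $\theta$, so the likelihood equation can be solved by simple bisection. A sketch in stdlib Python (function names are ours):

```python
import math

def oppe_mean(theta, a):
    """Left-hand side of (2.8): E[X] for the OPPE distribution with coefficients a."""
    num = sum(ak * math.gamma(k + 2) / theta ** (k + 2) for k, ak in enumerate(a))
    den = sum(ak * math.gamma(k + 1) / theta ** (k + 1) for k, ak in enumerate(a))
    return num / den

def mle_theta(xbar, a, lo=1e-6, hi=1e6, tol=1e-12):
    """Bisection solve of oppe_mean(theta) = xbar; the mean decreases in theta."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if oppe_mean(mid, a) > xbar:
            lo = mid          # mean too large, so a larger theta is needed
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

Feeding the sample mean $\bar X$ and the coefficient vector $(a_0,\ldots,a_r)$ to `mle_theta` returns $\hat\theta$; the quantities $\hat f(x)=f(x;\hat\theta)$ and $\hat F(x)=F(x;\hat\theta)$ then give the MLEs of the PDF and the CDF by the invariance property.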

## 3 UMVUE of the PDF and the CDF

In this section, we obtain the UMVUE of the PDF and the CDF of the OPPE distribution. Also, we obtain the MSEs of these estimators.

To derive the UMVUE of the PDF and the CDF (Theorem 3.2), we will use Theorem 3.1 and Lemma 3.1.

###### Theorem 3.1.

Let $T=\sum_{i=1}^{n}X_i$. Then the distribution of $T$ is

$$f(t) = h^n(\theta)\sum_{q_0}\sum_{q_1}\ldots\sum_{q_r}c(n,q_0,q_1,\ldots,q_r)\,t^{\sum_{k=0}^{r}(k+1)q_k-1}e^{-\theta t},\quad t>0,$$

with $\sum_{k=0}^{r}q_k=n$ and $c(n,q_0,q_1,\ldots,q_r)=\frac{n!}{q_0!q_1!\cdots q_r!}\prod_{k=0}^{r}\left(a_k\Gamma(k+1)\right)^{q_k}\frac{1}{\Gamma\left(\sum_{k=0}^{r}(k+1)q_k\right)}$.

###### Proof.

The mgf of $T$ is

$$\begin{aligned} M_T(t) &= h^n(\theta)\left[\sum_{k=0}^{r}a_k\frac{\Gamma(k+1)}{\theta^{k+1}}\left(1-\frac{t}{\theta}\right)^{-(k+1)}\right]^n\\ &= h^n(\theta)\sum_{q_0}\sum_{q_1}\ldots\sum_{q_r}\frac{n!}{q_0!q_1!\cdots q_r!}\prod_{k=0}^{r}\left(a_k\Gamma(k+1)\right)^{q_k}\theta^{-\sum_{k=0}^{r}(k+1)q_k}\left(1-\frac{t}{\theta}\right)^{-\sum_{k=0}^{r}(k+1)q_k},\end{aligned}$$

by the multinomial theorem. Since $(1-t/\theta)^{-m}$ is the mgf of a gamma distribution with shape $m$ and rate $\theta$, the distribution of $T$ is

$$\begin{aligned} f(t) &= h^n(\theta)\sum_{q_0}\sum_{q_1}\ldots\sum_{q_r}\frac{n!}{q_0!q_1!\cdots q_r!}\prod_{k=0}^{r}\left(a_k\Gamma(k+1)\right)^{q_k}\theta^{-\sum_{k=0}^{r}(k+1)q_k}f_{GA}\left(t;\sum_{k=0}^{r}(k+1)q_k,\theta\right)\\ &= h^n(\theta)\sum_{q_0}\sum_{q_1}\ldots\sum_{q_r}c(n,q_0,q_1,\ldots,q_r)\,t^{\sum_{k=0}^{r}(k+1)q_k-1}e^{-\theta t},\end{aligned}$$

where $c(n,q_0,q_1,\ldots,q_r)$ is as defined in the statement of the theorem. ∎

###### Lemma 3.1.

The conditional distribution of $X_1$ given $T=t$ is

$$f_{X_1|T}(x|t) = \frac{p(x)}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)(t-x)^{\sum_{k=0}^{r}(k+1)y_k-1},\quad 0<x<t,$$

where

$$A_n(t)=\sum_{q_0}\sum_{q_1}\ldots\sum_{q_r}c(n,q_0,q_1,\ldots,q_r)\,t^{\sum_{k=0}^{r}(k+1)q_k-1}$$

and

$$c(n-1,y_0,y_1,\ldots,y_r)=\frac{(n-1)!}{y_0!y_1!\cdots y_r!}\prod_{k=0}^{r}\left(a_k\Gamma(k+1)\right)^{y_k}\frac{1}{\Gamma\left(\sum_{k=0}^{r}(k+1)y_k\right)},$$

with $\sum_{k=0}^{r}y_k=n-1$.

###### Proof.

Writing $T'=\sum_{i=2}^{n}X_i$ and applying Theorem 3.1 to both $T'$ and $T$,

$$f_{X_1|T}(x|t) = \frac{f_{X_1}(x)f_{T'}(t-x)}{f_T(t)} = \frac{p(x)}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)(t-x)^{\sum_{k=0}^{r}(k+1)y_k-1}.$$

∎

###### Theorem 3.2.

Let $T=t$ be given. Then

$$\hat{f}(x) = \frac{p(x)}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)(t-x)^{\sum_{k=0}^{r}(k+1)y_k-1},\quad 0<x<t, \tag{3.9}$$

is the UMVUE of $f(x)$ and

$$\hat{F}(x) = 1-\frac{1}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)\,t^{\sum_{k=0}^{r}(k+1)y_k}\sum_{k=0}^{r}a_k t^k I_{x/t}\left(k+1,\sum_{k=0}^{r}(k+1)y_k\right),\quad 0<x<t, \tag{3.10}$$

is the UMVUE of $F(x)$, where $I_{z}(a,b)=\int_{z}^{1}u^{a-1}(1-u)^{b-1}\,du$ is an incomplete beta function and $T=\sum_{i=1}^{n}X_i$.

###### Proof.

By Lemma 3.1, $\hat{f}(x)=f_{X_1|T}(x|t)$ is unbiased for $f(x)$; since it is a function of the complete sufficient statistic $T$, it is the UMVUE of the PDF by the Lehmann-Scheffé theorem. For the CDF,

$$\begin{aligned}\hat{F}(x) &= 1-\int_{x}^{t}\hat{f}(m)\,dm\\ &= 1-\int_{x}^{t}\frac{p(m)}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)(t-m)^{\sum_{k=0}^{r}(k+1)y_k-1}\,dm\\ &= 1-\frac{1}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)\,t^{\sum_{k=0}^{r}(k+1)y_k}\sum_{k=0}^{r}a_k t^k I_{x/t}\left(k+1,\sum_{k=0}^{r}(k+1)y_k\right),\end{aligned}$$

on substituting $m=tu$ in each term. ∎

The MSE of $\hat{f}(x)$ is given by

$$\mathrm{MSE}(\hat{f}(x)) = E(\hat{f}^2(x))-f^2(x) = \int_{x}^{\infty}\left[\frac{p(x)}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)(t-x)^{\sum_{k=0}^{r}(k+1)y_k-1}\right]^2 f(t)\,dt-f^2(x). \tag{3.11}$$

Using the density of $T$ from Theorem 3.1 in (3.11), we can get the value of the MSE of the UMVUE of the PDF. And the MSE of $\hat{F}(x)$ is given by

$$\mathrm{MSE}(\hat{F}(x)) = E(\hat{F}^2(x))-F^2(x) = \int_{x}^{\infty}\left[1-\frac{1}{A_n(t)}\sum_{y_0}\sum_{y_1}\ldots\sum_{y_r}c(n-1,y_0,y_1,\ldots,y_r)\,t^{\sum_{k=0}^{r}(k+1)y_k}\sum_{k=0}^{r}a_k t^k I_{x/t}\left(k+1,\sum_{k=0}^{r}(k+1)y_k\right)\right]^2 f(t)\,dt-F^2(x). \tag{3.12}$$

Similarly, using Theorem 3.1 in (3.12), we can get the value of the MSE of the UMVUE of the CDF.
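To make the nested sums of Section 3 concrete, the sketch below (stdlib Python; all names are ours) evaluates the UMVUE of the PDF by enumerating the compositions $(y_0,\ldots,y_r)$ with $\sum_k y_k=n-1$ and $(q_0,\ldots,q_r)$ with $\sum_k q_k=n$. Because the estimator is the conditional density of $X_1$ given $T=t$, it must integrate to one over $(0,t)$, which serves as a numerical check of the implementation:

```python
import math

def compositions(m, parts):
    """All tuples of `parts` non-negative integers summing to m."""
    if parts == 1:
        yield (m,)
        return
    for first in range(m + 1):
        for rest in compositions(m - first, parts - 1):
            yield (first,) + rest

def c_coef(m, y, a):
    """c(m, y_0, ..., y_r) as defined in Theorem 3.1 / Lemma 3.1."""
    s = sum((k + 1) * yk for k, yk in enumerate(y))
    val = math.factorial(m) / math.gamma(s)
    for k, yk in enumerate(y):
        val /= math.factorial(yk)
        val *= (a[k] * math.gamma(k + 1)) ** yk
    return val

def umvue_pdf(x, t, n, a):
    """UMVUE of the OPPE pdf at x, given T = sum of the sample = t."""
    p_x = sum(ak * x ** k for k, ak in enumerate(a))
    A_n = sum(c_coef(n, q, a) * t ** (sum((k + 1) * qk for k, qk in enumerate(q)) - 1)
              for q in compositions(n, len(a)))
    tot = sum(c_coef(n - 1, y, a)
              * (t - x) ** (sum((k + 1) * yk for k, yk in enumerate(y)) - 1)
              for y in compositions(n - 1, len(a)))
    return p_x * tot / A_n
```

Note that the estimator depends on the data only through $t$, and no knowledge of $\theta$ is needed, as expected of a UMVUE built from a complete sufficient statistic.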

## 4 Particular case

In this section, we study in detail the length-biased Lindley and Sujatha distributions, which are particular cases of the OPPE distribution. The estimators of the PDF and the CDF are written explicitly, and their MSEs are compared. Another particular case of the OPPE distribution is the Lindley distribution; estimation of its PDF and CDF has been studied in detail in Maiti and Mukherjee (2018).

### 4.1 Length-biased Lindley distribution

Substituting $r=2$, $a_0=0$ and $a_1=a_2=1$ in (1.5) and (1.7), we obtain the PDF and the CDF of the length-biased Lindley distribution, respectively.
The PDF is

$$f(x)=\frac{\theta^3}{2+\theta}\,x(1+x)e^{-\theta x};\quad x,\theta>0 \tag{4.13}$$

and the CDF is

$$F(x)=1-\left[1+\frac{\theta x}{2+\theta}(2+\theta x+\theta)\right]e^{-\theta x};\quad x,\theta>0. \tag{4.14}$$

#### 4.1.1 MLE of the PDF and the CDF

If we substitute $r=2$, $a_0=0$ and $a_1=a_2=1$ in (2.8), we get

$$\frac{2\theta+6}{\theta(\theta+2)}-\bar{X}=0.$$

Solving the above equation numerically, we can get the MLE of $\theta$ for the length-biased Lindley distribution.
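As a side check on the numerical solution: clearing denominators in $(2\theta+6)/(\theta(\theta+2))=\bar X$ gives the quadratic $\bar X\theta^2+2(\bar X-1)\theta-6=0$, so in this particular case the MLE is also available in closed form as the positive root. A sketch (the function name is ours):

```python
import math

def lbl_mle(xbar):
    """MLE of theta for the length-biased Lindley distribution: the positive
    root of xbar*theta^2 + 2*(xbar - 1)*theta - 6 = 0, which is the likelihood
    equation (2*theta + 6)/(theta*(theta + 2)) = xbar after clearing denominators."""
    b = 2.0 * (xbar - 1.0)
    disc = b * b + 24.0 * xbar      # always positive for xbar > 0
    return (-b + math.sqrt(disc)) / (2.0 * xbar)
```

Since the discriminant is positive for any $\bar X>0$, a unique positive root always exists, matching the monotonicity of the mean in $\theta$.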

#### 4.1.2 UMVUE of the PDF and the CDF

In this section, we obtain the UMVUE of the PDF and the CDF of the length-biased Lindley distribution. Also, we obtain the MSEs of these estimators.

To derive the UMVUE of the PDF and the CDF, we substitute $r=2$, $a_0=0$ and $a_1=a_2=1$ in Theorem 3.2.

###### Theorem 4.1.

Let $T=t$ be given. Then the UMVUE of the PDF is

$$\hat{f}(x) = \frac{p(x)}{A_n(t)}\sum_{y_1}\sum_{y_2}c(n-1,y_1,y_2)(t-x)^{2y_1+3y_2-1},\quad 0<x<t,$$

and the UMVUE of the CDF is

$$\hat{F}(x) = 1-\frac{1}{A_n(t)}\sum_{y_1}\sum_{y_2}c(n-1,y_1,y_2)\,t^{2y_1+3y_2}\left[t\,I_{x/t}(2,2y_1+3y_2)+t^2 I_{x/t}(3,2y_1+3y_2)\right],\quad 0<x<t.$$

The MSE of the UMVUE of the PDF is given by

$$\mathrm{MSE}(\hat{f}(x)) = E(\hat{f}^2(x))-f^2(x) = \int_{x}^{\infty}\left[\frac{p(x)}{A_n(t)}\sum_{y_1}\sum_{y_2}c(n-1,y_1,y_2)(t-x)^{2y_1+3y_2-1}\right]^2 f(t)\,dt-f^2(x). \tag{4.15}$$

Using the density of $T$ from Theorem 3.1 in (4.15), we can get the value of the MSE of the UMVUE of the PDF. And the MSE of the UMVUE of the CDF is given by

$$\mathrm{MSE}(\hat{F}(x)) = E(\hat{F}^2(x))-F^2(x) = \int_{x}^{\infty}\left[1-\frac{1}{A_n(t)}\sum_{y_1}\sum_{y_2}c(n-1,y_1,y_2)\,t^{2y_1+3y_2}\left[t\,I_{x/t}(2,2y_1+3y_2)+t^2 I_{x/t}(3,2y_1+3y_2)\right]\right]^2 f(t)\,dt-F^2(x). \tag{4.16}$$

Similarly, using Theorem 3.1 in (4.16), we can get the value of the MSE of the UMVUE of the CDF.

The theoretical MSE of the UMVUE for the length-biased Lindley distribution is presented in Figure 1.

Figure 1: Graph of theoretical MSE of UMVUE of the PDF and the CDF of the length-biased Lindley distribution for θ=0.1, x=2 and r=2.

### 4.2 Sujatha distribution

By substituting $r=2$ and $a_0=a_1=a_2=1$ in (1.5) and (1.7), we get the PDF and the CDF of the Sujatha distribution, respectively.
The PDF is

$$f(x)=\frac{\theta^3}{2+\theta+\theta^2}(1+x+x^2)e^{-\theta x};\quad x,\theta>0 \tag{4.17}$$

and the CDF is

$$F(x)=1-\left[1+\frac{\theta x}{2+\theta+\theta^2}(2+\theta x+\theta)\right]e^{-\theta x};\quad x,\theta>0. \tag{4.18}$$

#### 4.2.1 MLE of the PDF and the CDF

If we substitute $r=2$ and $a_0=a_1=a_2=1$ in (2.8), we get

$$\frac{\theta^2+2\theta+6}{\theta(\theta^2+\theta+2)}-\bar{X}=0.$$

Solving the above equation numerically, we can get the MLE of $\theta$ for the Sujatha distribution.
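Clearing denominators here gives a cubic in $\theta$, so a numerical solution is the practical route; since the left-hand term (the Sujatha mean) decreases monotonically in $\theta$, plain bisection is enough. A minimal sketch in stdlib Python (function names are ours):

```python
def sujatha_score(theta, xbar):
    """Left-hand side of the Sujatha likelihood equation."""
    return (theta**2 + 2 * theta + 6) / (theta * (theta**2 + theta + 2)) - xbar

def sujatha_mle(xbar, lo=1e-6, hi=1e6):
    """Bisection root of the likelihood equation; the mean decreases in theta."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if sujatha_score(mid, xbar) > 0:
            lo = mid          # mean still above xbar, move up
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Any bracketing root finder (e.g. Newton's method with this bisection as a fallback) would serve equally well; bisection is used only because monotonicity makes it foolproof.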

#### 4.2.2 UMVUE of the PDF and the CDF

In this section, we obtain the UMVUE of the PDF and the CDF of the Sujatha distribution. Also, we obtain the MSEs of these estimators.

To derive the UMVUE of the PDF and the CDF, we substitute $r=2$ and $a_0=a_1=a_2=1$ in Theorem 3.2.

###### Theorem 4.2.

Let $T=t$ be given. Then the UMVUE of the PDF is

$$\hat{f}(x) = \frac{p(x)}{A_n(t)}\sum_{y_0}\sum_{y_1}\sum_{y_2}c(n-1,y_0,y_1,y_2)(t-x)^{y_0+2y_1+3y_2-1},\quad 0<x<t,$$

and the UMVUE of the CDF is

$$\hat{F}(x) = 1-\frac{1}{A_n(t)}\sum_{y_0}\sum_{y_1}\sum_{y_2}c(n-1,y_0,y_1,y_2)\,t^{y_0+2y_1+3y_2}\left[I_{x/t}(1,y_0+2y_1+3y_2)+t\,I_{x/t}(2,y_0+2y_1+3y_2)+t^2 I_{x/t}(3,y_0+2y_1+3y_2)\right],\quad 0<x<t.$$

The MSE of the UMVUE of the PDF is given by

$$\mathrm{MSE}(\hat{f}(x)) = E(\hat{f}^2(x))-f^2(x) = \int_{x}^{\infty}\left[\frac{p(x)}{A_n(t)}\sum_{y_0}\sum_{y_1}\sum_{y_2}c(n-1,y_0,y_1,y_2)(t-x)^{y_0+2y_1+3y_2-1}\right]^2 f(t)\,dt-f^2(x). \tag{4.19}$$

Using the density of $T$ from Theorem 3.1 in (4.19), we can get the value of the MSE of the UMVUE of the PDF. And the MSE of the UMVUE of the CDF is given by

$$\mathrm{MSE}(\hat{F}(x)) = E(\hat{F}^2(x))-F^2(x) = \int_{x}^{\infty}\left[1-\frac{1}{A_n(t)}\sum_{y_0}\sum_{y_1}\sum_{y_2}c(n-1,y_0,y_1,y_2)\,t^{y_0+2y_1+3y_2}\left[I_{x/t}(1,y_0+2y_1+3y_2)+t\,I_{x/t}(2,y_0+2y_1+3y_2)+t^2 I_{x/t}(3,y_0+2y_1+3y_2)\right]\right]^2 f(t)\,dt-F^2(x). \tag{4.20}$$

Similarly, using Theorem 3.1 in (4.20), we can get the value of the MSE of the UMVUE of the CDF.

The theoretical MSE of the UMVUE for the Sujatha distribution is presented in Figure 2.

Figure 2: Graph of theoretical MSE of UMVUE of the PDF and the CDF of the Sujatha distribution for θ=0.1, x=2 and r=2.

## 5 Simulation

Direct application of the inverse-transform Monte Carlo technique fails for generating random samples from the OPPE distribution, since the equation

$$F(x)=u,\quad u\in(0,1)$$

cannot be solved explicitly for $x$. On the other hand, one can use the fact that the distribution is a finite mixture of gamma distributions, as given in (1.6).

For the OPPE distribution, a random sample can be generated by the following algorithm:

1. Generate $U\sim U(0,1)$.

2. If $U\le w_0$, set $X\sim\mathrm{Gamma}(1,\theta)$; if $w_0<U\le w_0+w_1$, set $X\sim\mathrm{Gamma}(2,\theta)$; and so on, where $w_k=\frac{a_k\Gamma(k+1)/\theta^{k+1}}{\sum_{j=0}^{r}a_j\Gamma(j+1)/\theta^{j+1}}$, $k=0,1,\ldots,r$, are the mixture weights in (1.6).

A Monte Carlo simulation is carried out with repeated samples. We choose $\theta=0.1$, $x=2$ and $r=2$ for both distributions, and compute the MSE of the MLE and the UMVUE of the PDF and the CDF. From Figures 3 and 4, it is clear that the MSE decreases with increasing sample size, which shows the consistency property of the estimators.

Figure 3: Graph of simulated MSE of the MLE of the PDF and the CDF of the length-biased Lindley distribution for θ=0.1, x=2 and r=2.

Figure 4: Graph of simulated MSE of the MLE of the PDF and the CDF of the Sujatha distribution for θ=0.1, x=2 and r=2.
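The two-step algorithm above is the standard composition method for finite mixtures. A minimal sketch in stdlib Python (the function name is ours; note that `random.gammavariate` takes shape and scale, so a component with rate $\theta$ has scale $1/\theta$):

```python
import math
import random

def oppe_sample(n, a, theta, rng=random):
    """Draw n variates from the OPPE distribution via its gamma-mixture form (1.6)."""
    raw = [ak * math.gamma(k + 1) / theta ** (k + 1) for k, ak in enumerate(a)]
    total = sum(raw)
    cum, acc = [], 0.0
    for w in raw:
        acc += w / total
        cum.append(acc)          # cumulative mixture weights w_0, w_0 + w_1, ...
    cum[-1] = 1.0                # guard against floating-point shortfall
    out = []
    for _ in range(n):
        u = rng.random()                                   # step 1: U ~ Uniform(0, 1)
        k = next(i for i, c in enumerate(cum) if u <= c)   # step 2: pick component k
        # Component k is Gamma(shape = k + 1, rate = theta), i.e. scale = 1/theta.
        out.append(rng.gammavariate(k + 1, 1.0 / theta))
    return out
```

Passing a seeded `random.Random` instance as `rng` makes the draws reproducible across simulation repetitions.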

## 6 Data Analysis

In this section, we provide the analysis of two real data sets for comparing the performances of MLE and UMVUE for the PDF and the CDF. Table 1 represents the survival times (in days) of 72 guinea pigs infected with virulent tubercle bacilli, observed and reported by Elbatal et al. (2013). Table 2 represents the failure times of the air conditioning system of an airplane and it is obtained from Linhart and Zucchini (1986).

We fit the length-biased Lindley distribution to data set-I (Table 1). The histogram together with the estimated PDF and CDF is shown in Figure 5, from which we observe that the length-biased Lindley distribution fits data set-I well.

The Sujatha distribution has been fitted to data set-II (Table 2). Here, for computational ease, the data have been rescaled. The histogram together with the estimated PDF and CDF is shown in Figure 6, from which we observe that the Sujatha distribution fits data set-II well.

Tables 3 and 4 give the estimates of the negative log-likelihood values; a lower negative log-likelihood indicates a better fit. From Tables 3 and 4, we see that the UMVUE is better for data set-I and the MLE for data set-II in the negative log-likelihood sense.

Figure 5: Graph of the estimated PDF and the CDF of the length-biased Lindley distribution fitted to data set-I.

Figure 6: Graph of the estimated PDF and the CDF of the Sujatha distribution fitted to data set-II.

## 7 Concluding Remarks

Two estimators, the MLE and the UMVUE, have been obtained for the PDF and the CDF of the length-biased Lindley and Sujatha distributions. The estimators have been compared in the MSE sense, both theoretically and through a simulation study. The UMVUE is better than the MLE, in the MSE sense, for both the PDF and the CDF of the length-biased Lindley and Sujatha distributions.

## References

•  Alizadeh, M., Razezaei, S., Bagheri, S. F., and Nadarajah, S. (2015). Efficient estimation for the generalized exponential distribution, Statistical Papers, 54(4), 1015-1031.
•  Asrabadi, B. R. (1990). Estimation in the Pareto distribution, Metrika, 37, 199-205.
•  Ayesha, A. (2017). Size biased Lindley distribution and its properties a special case of weighted distribution, Applied Mathematics, 8, 808-819.
•  Bagheri, S. F., Alizadeh, M., Baloui, J. E., and Nadarajah, S. (2014). Evaluation and comparison of estimations in the generalized exponential-Poisson distribution, Journal of Statistical Computation and Simulation, 84(11), 2345-2360.
•  Bagheri, S. F., Alizadeh, M., and Nadarajah, S. (2016a). Efficient estimation of the PDF and the CDF of the exponentiated Gumbel distribution, Communications in Statistics-Simulation and Computation, 45, 339-361.
•  Bagheri, S. F., Alizadeh, M., Nadarajah, S., and Deiri, E. (2016b). Efficient estimation of the PDF and the CDF of the Weibull extension model, Communications in Statistics-Simulation and Computation, 45(6), 2191-2207.
•  Bakouch, H. S., Al-Zahrani, M. B., Al-Shomrani, A. A., Marchi, A. A. V., and Louzada, F. (2012). An extended Lindley distribution, Journal of the Korean Statistical Society, 41, 75-85.
•  Bouchahed, L. and Zeghdoudi, H. (2018). A new and unified approach in generalizing the Lindley’s distribution with applications, Statistics in Transition, 19(1), 61-74.
•  Dixit, U. J. and Jabbari, N. M. (2010). Efficient estimation in the Pareto distribution, Statistical Methodology, 7, 687-691.
•  Dixit, U. J. and Jabbari, N. M. (2011). Efficient estimation in the Pareto distribution with the presence of outliers, Statistical Methodology, 8, 340-355.
•  Elbatal, I., Merovci, F., and Elgarhy, M. (2013). A new generalized Lindley distribution, Mathematical Theory and Modeling, 3(13), 30-47.
•  Ghitany, E. M., Al-Mutairi, K. D., Balakrishnan, N., and Al-Enezi, J. L. (2013). Power Lindley distribution and associated inference, Computational Statistics & Data Analysis, 64, 20-33.
•  Ghitany, M. E., Alqallaf, F., Al-Mutairi, D. K., and Hussain, H. A. (2011). A two parameter weighted Lindley distribution and its applications to survival data, Mathematics and Computers in Simulation, 81, 1190-1201.
•  Ghitany, M. E., Atieh, B., and Nadarajah, S. (2008). Lindley distribution and its application, Mathematics and Computers in Simulation, 78, 493-506.
•  Jabbari, N. M. and Jabbari, N. H. (2010). Efficient estimation of PDF, CDF and rth moment for the exponentiated Pareto distribution in the presence of outliers, Statistics: A Journal of Theoretical and Applied Statistics, 44(4), 1-20.
•  Lindley, D. V. (1958). Fiducial distributions and Bayes' theorem, Journal of the Royal Statistical Society. Series B (Methodological), 20(1), 102-107.
•  Linhart, H. and Zucchini, W. (1986). Model Selection, John Wiley, USA, New York.
•  Maiti, S. S., Bhattacharya, A., and Saha, M. (2014). A new process capability index and its application to Lindley distributed characteristics, Proceedings of Institute for Mathematics, Bio-informatics, Information-technology and Computer-science, 3, 202-212.
•  Maiti, S. S. and Mukherjee, I. (2018). On estimation of the PDF and CDF of the Lindley distribution, Communication in Statistics-Simulation and Computation, 47(5),1370-1381.
•  Mazucheli, J. and Achcar, J. A. (2011). The Lindley distribution applied to competing risk life time data, Computer Methods and Programs in Biomedicine, 104(2), 188-192.
•  Mukherjee, I., Dey, S., and Maiti, S. S. (2016). Comparison of estimators of the PDF and CDF of the generalized exponential distribution, Proceedings of Institute for Mathematics, Bio-informatics, Information-technology and Computer-Science, 5, 266-281.
•  Mukherjee, I. and Maiti, S. S. (2019). A note on estimation of the PDF and CDF of the lognormal distribution, Proceedings of Institute for Mathematics, Bio-informatics, Information- technology and Computer-Science, 8, 163-174.
•  Mukherjee, S. and Maiti, S. S. (2014). Sampling inspection plan by variable for Lindley distributed quality characteristics, Proceedings of Institute for Mathematics, Bio-informatics, Information-technology and Computer-science, 3, 213-223.
•  Shanker, R. (2015). Akash distribution and its applications, International Journal of Probability and Statistics, 4(3), 65-75.
•  Shanker, R. (2016a). Amarendra distribution and its applications, American Journal of Mathematics and Statistics, 6(1), 44-56.
•  Shanker, R. (2016b). Aradhana distribution and its applications, International Journal of Statistics and Applications, 6(1), 23-34.
•  Shanker, R. (2016c). Devya distribution and its applications, International Journal of Statistics and Applications, 6(4), 189-202.
•  Shanker, R. (2016d). Shambhu distribution and its applications, International Journal of Probability and Statistics, 5(2), 48-63.
•  Shanker, R. (2016e). Sujatha distribution and its applications, Statistics in Transition, 17(3), 391-410.
•  Shanker, R. and Ghebretsadik, A. H. (2013). A new quasi Lindley distribution, International Journal of Statistics and Systems, 8(2), 143-156.
•  Tripathi, Y. M., Kayal, T., and Dey, S. (2017a). Estimation of the PDF and the CDF of a exponentiated moment exponential distribution, International Journal of System Assurance Engineering and Management, 8(2), 1282-1296.
•  Tripathi, Y. M., Mahato, A. K., and Dey, S. (2017b). Efficient estimation of the PDF and the CDF of a generalized logistic distribution, Annals of Data Science, 4(1), 63-81.
•  Zakerzadeh, H. and Dolati, A. (2009). Generalized Lindley distribution, Journal of Mathematical Extension, 3(2), 1-17.