Quantile Version of Mathai-Haubold Entropy of Order Statistics

2021-11-08

Ibrahim M. Almanjahie, Javid Gani Dar, Amer Ibrahim Al-Omari and Aijaz Mir

1 Department of Mathematics, College of Science, King Khalid University, Abha, 62529, Saudi Arabia

2 Statistical Research and Studies Support Unit, King Khalid University, Abha, 62529, Saudi Arabia

3 Department of Mathematical Sciences, IUST, Kashmir, 192231, India

4 Department of Mathematics, Faculty of Science, Al al-Bayt University, Mafraq, 25113, Jordan

5 Department of Mathematics, Govt. Degree College Kilam, Higher Education, J&K, 192231, India

ABSTRACT Many researchers measure the uncertainty of a random variable using quantile-based entropy techniques. These techniques are useful in engineering applications and have some advantages over their distribution-function counterparts. Considering order statistics, the key focus of this article is to propose a new quantile-based Mathai-Haubold entropy and investigate its characteristics. The Mathai-Haubold divergence measure is also considered and some of its properties are established. Further, based on order statistics, we propose the residual entropy of the quantile-based Mathai-Haubold and prove some of its property results. The performance of the proposed quantile-based Mathai-Haubold entropy is investigated by simulation studies. Finally, a real data application is used to compare our proposed quantile-based entropy to the existing quantile entropies. The results reveal that our proposed entropy outperforms the other entropies.

KEYWORDS Shannon entropy; Mathai-Haubold entropy; quantile function; residual entropy; order statistics; failure time; reliability measures

1 Introduction

Order statistics arise in a wide range of complicated problems, including characterization of a probability distribution, quality control, robust statistical estimation, identification of outliers, analysis of censored samples, goodness-of-fit tests, etc. The usage of recurrence relationships for moments of order statistics is well recognized by many researchers (see, for instance, Arnold et al. [1], Malik et al. [2]). Moreover, many recurrence relations and identities for the moments of order statistics arising from various particular continuous probability distributions (e.g., gamma, Cauchy, normal, logistic, and exponential) have been reviewed by Samuel et al. [3] and Arnold et al. [1].

Based on a random sample X1, X2, ..., Xn, let the corresponding order statistics be X1:n ≤ X2:n ≤ ... ≤ Xn:n. Then, as in David [4] and Arnold et al. [1], the density of Xr:n, 1 ≤ r ≤ n, is

f(r,n)(x) = Cr:n [F(x)]^(r-1) [1 - F(x)]^(n-r) f(x), (1)

with Cr:n = n!/((r-1)!(n-r)!). Eq. (1) can be used to determine the probability density functions of the smallest (when r = 1) and the largest (when r = n) order statistics; they are, respectively, f(1,n)(x) = n[1-F(x)]^(n-1) f(x) and f(n,n)(x) = n[F(x)]^(n-1) f(x). The corresponding distribution functions are, respectively, F(1,n)(x) = 1-[1-F(x)]^n and F(n,n)(x) = [F(x)]^n.
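These extreme-order distribution functions can be checked by simulation. The following sketch is our own illustration (not code from the paper); the Exp(1) parent, sample size n = 5 and evaluation point t = 0.3 are arbitrary choices.

```python
import numpy as np

# Monte Carlo check of F(1,n)(t) = 1 - [1 - F(t)]^n and F(n,n)(t) = [F(t)]^n
# for the minimum and maximum of n i.i.d. Exp(1) variables.
rng = np.random.default_rng(0)
n, reps, t = 5, 200_000, 0.3
samples = rng.exponential(scale=1.0, size=(reps, n))

F = 1 - np.exp(-t)                           # Exp(1) cdf at t
emp_min = np.mean(samples.min(axis=1) <= t)  # empirical F(1,n)(t)
emp_max = np.mean(samples.max(axis=1) <= t)  # empirical F(n,n)(t)

print(abs(emp_min - (1 - (1 - F) ** n)) < 0.01)  # True
print(abs(emp_max - F ** n) < 0.01)              # True
```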

Shannon [5] was the first author to introduce the entropy idea for a random variable (r.v.) X in the field of information theory, defining it as

H(X) = -∫ fX(x) log fX(x) dx. (2)

Here, fX(x) indicates the pdf of the r.v. X. Based on the Shannon entropy measure, Mathai et al. [6] (henceforth, M-H entropy) developed a generalized version of Eq. (2), defined as

Mα(X) = (1/(α-1)) [∫ (fX(x))^(2-α) dx - 1], α ≠ 1, α < 2. (3)

When α → 1, the M-H entropy measure Mα(X) reduces to the Shannon entropy measure defined in Eq. (2).
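This limit can be checked numerically. The sketch below is ours (not from the paper) and assumes the form Mα(X) = (∫ f^(2-α) dx - 1)/(α-1); for X ~ Exp(1) the Shannon entropy is 1 and the M-H entropy has the closed form 1/(2-α), which tends to 1 as α → 1.

```python
import numpy as np

# Exp(1) example: Shannon entropy = 1, M_alpha = 1/(2 - alpha).
x = np.linspace(1e-6, 40.0, 400_000)
dx = x[1] - x[0]
f = np.exp(-x)                                  # Exp(1) density

def integral(y):                                # composite trapezoid rule
    return dx * (y.sum() - 0.5 * (y[0] + y[-1]))

def mh_entropy(alpha):
    return (integral(f ** (2 - alpha)) - 1) / (alpha - 1)

shannon = -integral(f * np.log(f))
print(round(shannon, 3))           # ≈ 1.0
print(round(mh_entropy(0.9), 3))   # ≈ 0.909 = 1/1.1
print(round(mh_entropy(1.1), 3))   # ≈ 1.111 = 1/0.9
```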

Mathai et al. [6] and Sebastian [7] discussed the main properties associated with Eq. (3). In particular, maximizing this entropy in its normalized version under energy constraints yields the well-recognized pathway model provided by Mathai [8]; this model contains many familiar probability distributions as special cases.

Theoretical studies and applications employing information measures depend on the distribution function, and they may not be appropriate in circumstances where the distribution is analytically intractable. Hence, the quantile function is used as an alternative method, where

Q(u) = F^(-1)(u) = inf{x : F(x) ≥ u}, 0 ≤ u ≤ 1.

We refer the readers to Nair et al. [9] and Sunoj et al. [10] and references therein for more details about the quantile function. Recently, Sunoj et al. [11] studied the Shannon entropy as well as its residual version and introduced quantile versions of them, defined as

H = ∫_0^1 log q(u) du

and

H(u) = log(1-u) + (1/(1-u)) ∫_u^1 log q(p) dp,

respectively, where q(u) = dQ(u)/du denotes the quantile density function. If we define fQ(u) = f(Q(u)), then we obtain

fQ(u) q(u) = 1.

For the Shannon past entropy, Sunoj et al. [11] also introduced its quantile version, defined as

H̄(u) = log u + (1/u) ∫_0^u log q(p) dp.
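The quantile form of the Shannon entropy can be verified numerically. The sketch below is our own illustration (the Exp(2) example is an arbitrary choice): for Exp(λ), Q(u) = -log(1-u)/λ and q(u) = 1/(λ(1-u)), so ∫_0^1 log q(u) du should reproduce the classical value 1 - log λ.

```python
import numpy as np

# Quantile Shannon entropy of Exp(lam) vs. its classical closed form.
lam = 2.0
u = np.linspace(1e-8, 1 - 1e-8, 1_000_000)
q = 1.0 / (lam * (1 - u))              # quantile density of Exp(lam)

H_quantile = np.mean(np.log(q))        # uniform grid on (0,1): mean ≈ integral
H_classical = 1 - np.log(lam)
print(round(H_quantile, 3), round(H_classical, 3))   # both ≈ 0.307
```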

In the present paper, we work with order statistics, propose the quantile-based version of the M-H entropy and discuss its properties. The M-H divergence measure is also considered, and we establish some of its distribution-free properties. In addition, we introduce the quantile-based residual version of the M-H entropy and prove some characterization results. To the best of our knowledge, the results presented here treat a research gap that has not been addressed or studied systematically by others, which was the primary motivation of our paper.

The paper is outlined as follows. Section 2 is devoted to the construction of our quantile-based M-H entropy and its properties. Next, expressions for the quantile-based version of the M-H entropy for some lifetime distributions are presented in Section 3. A quantile-based generalized divergence measure of the rth order statistic is given in Section 4. The M-H quantile residual entropy for the rth order statistic, and for some lifetime models, is introduced in Section 5. Characterization theorems based on the M-H quantile residual entropy are presented in Section 6. In Section 7, simulation studies investigating the performance of our proposed quantile entropy and a real-life data application are presented. Our conclusion is stated in Section 8.

2 Quantile Based M-H Entropy of rth Order Statistics

Wong et al. [12], Park [13], Ebrahimi et al. [14] and Baratpour et al. [15] are among the authors who discuss in detail information-theoretic aspects of order statistics. Paul et al. [16] considered the M-H entropy and, based on record values, studied some of its essential properties. For the rth order statistic Xr:n, the M-H entropy is defined as

Mα(Xr:n) = (1/(α-1)) [∫ (fr:n(x))^(2-α) dx - 1],

where fr:n(x) is given in Eq. (1). Now, since F(Q(u)) = u, the pdf of the rth order statistic becomes

fr:n(Q(u)) = gr(u) f(Q(u)) = gr(u)/q(u),

where gr(u) denotes the beta-distribution density with r and (n-r+1) as its parameters. The quantile-based M-H entropy of Xr:n is determined by

Mα(Xr:n) = (1/(α-1)) [∫_0^1 (gr(u))^(2-α) (q(u))^(α-1) du - 1]. (6)

Remark 2.1: For α → 1, Eq. (6) reduces to

H(Xr:n) = -∫_0^1 gr(u) log gr(u) du + ∫_0^1 gr(u) log q(u) du,

which is the quantile entropy of the rth order statistic investigated by Sunoj et al. [10].
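Eq. (6) can be cross-checked against the direct definition Mα(Xr:n) = (∫ fr:n^(2-α) dx - 1)/(α-1). The sketch below is ours (not from the paper) and uses an arbitrary test case: the smallest order statistic (r = 1) of an Exp(1) sample of size n = 3, at α = 0.5, for which both routes share the closed-form integral 3^1.5/4.5.

```python
import numpy as np
from math import factorial

alpha, r, n = 0.5, 1, 3
C = factorial(n) // (factorial(r - 1) * factorial(n - r))

# Quantile route, Eq. (6): g_r(u) = C u^(r-1) (1-u)^(n-r), q(u) = 1/(1-u).
u = np.linspace(0.0, 1.0 - 1e-9, 1_000_000)
du = u[1] - u[0]
g_r = C * u ** (r - 1) * (1 - u) ** (n - r)
q = 1.0 / (1 - u)
h = g_r ** (2 - alpha) * q ** (alpha - 1)
mh_quantile = (du * (h.sum() - 0.5 * (h[0] + h[-1])) - 1) / (alpha - 1)

# Direct route: f_{1:3}(x) = 3 e^{-3x} for the Exp(1) minimum.
x = np.linspace(0.0, 30.0, 1_000_000)
dx = x[1] - x[0]
p = (n * np.exp(-n * x)) ** (2 - alpha)
mh_direct = (dx * (p.sum() - 0.5 * (p[0] + p[-1])) - 1) / (alpha - 1)

print(round(mh_quantile, 3), round(mh_direct, 3))   # both ≈ -0.309
```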

3 Expressions for Some Distributions

In the following, we provide expressions for the quantile-based M-H entropy of order statistics for some lifetime distributions:

(i) Govindarajulu's Distribution: The quantile function and the corresponding quantile density function, respectively, are

Using Eq. (6), we can easily obtain the quantile-based M-H entropy of the rth order statistic for the Govindarajulu distribution as

Similarly, based on the quantile and quantile density functions, we obtain the quantile-based M-H entropy Mα(Xr:n) of the rth order statistic for the following distributions.

(ii) Uniform Distribution:

(iii) Pareto-I Distribution:

(iv) Exponential distribution:

(v) Power distribution

Figs. 1-3 give the quantile version of the M-H entropy plots of the smallest order statistic under the exponential, Pareto-I and uniform distributions, respectively.

Figure 1: Quantile M-H entropy plots of smallest order statistics (Exponential distribution)

For increasing values of the parameters α and λ, the entropy plot based on the exponential distribution increases. In the case of the entropy plot under the Pareto-I distribution, the plot shows increasing (decreasing) behaviour for different parameter combinations. The entropy plot under the uniform distribution also shows increasing behaviour for different parameter values. Tabs. 1-3 give entropy values when the parameters α and λ are varied.

Clearly, we see from Tabs. 1-3 that the entropy values under the exponential, Pareto-I and uniform distributions portray the same behaviour as discussed for the graphical plots.

4 Quantile-Based Generalized Divergence Measure of rth Order Statistics

Several measures deal with the dissimilarity, or distance, between two probability distributions. These measures are essential in theory, inferential statistics, applied statistics and data-processing sciences, for tasks such as comparison, classification and estimation.

Assume f and g are the density functions of the non-negative r.v.s X and Y, respectively. The directed divergence of f from g, introduced by Kullback et al. [17], is

KL(f, g) = ∫ f(x) log (f(x)/g(x)) dx. (7)

Figure 2: Quantile M-H entropy plots of smallest order statistics (Pareto-I distribution)

Figure 3: Quantile M-H entropy plots of smallest order statistics (Uniform distribution)

With order α, the M-H divergence measure, or the relative entropy of g with respect to f, is obtained by

where, for α → 1, the expression in Eq. (8) reduces to Eq. (7) (see Kullback et al. [17]).

Table 1: Quantile M-H entropy values for the smallest order statistic, r = 1, n = 10 (Exponential distribution)

Table 2: Quantile M-H entropy values for the smallest order statistic, r = 1, n = 10 (Pareto-I distribution)

Theorem 4.1: The quantile-based generalized divergence measure between the distribution of the rth order statistic and the parent distribution is distribution-free.

Proof: From Eq. (8), we have

Now, using the value of fr:n(x) in Eq. (9), we obtain

Table 3: Quantile M-H entropy values for the smallest order statistic, r = 1, n = 10 (Uniform distribution)

Using the fact that q(u) f(Q(u)) = 1, we determine the quantile-based generalized divergence measure between the distribution of the rth order statistic and the parent distribution as

which is distribution-free. Hence, the theorem is proved.
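Theorem 4.1 can be checked numerically. The sketch below is our own illustration (Exp(1) and Uniform(0,1) are arbitrary test cases), assuming the divergence integrand (fr:n(x))^(2-α) (f(x))^(α-1), analogous to Eq. (3); after the quantile substitution this integral reduces to ∫_0^1 (gr(u))^(2-α) du, which does not depend on the parent distribution.

```python
import numpy as np

alpha, n = 0.5, 3          # r = 1 (sample minimum) throughout

# Exp(1): f(x) = e^{-x}, f_{1:3}(x) = 3 e^{-3x}
x = np.linspace(0.0, 40.0, 1_000_000)
dx = x[1] - x[0]
y = (3 * np.exp(-3 * x)) ** (2 - alpha) * np.exp(-x) ** (alpha - 1)
I_exp = dx * (y.sum() - 0.5 * (y[0] + y[-1]))      # trapezoid rule

# Uniform(0,1): f(x) = 1, f_{1:3}(x) = 3 (1-x)^2
v = np.linspace(0.0, 1.0, 1_000_000)
dv = v[1] - v[0]
w = (3 * (1 - v) ** 2) ** (2 - alpha)
I_uni = dv * (w.sum() - 0.5 * (w[0] + w[-1]))

# Distribution-free value: ∫_0^1 (3(1-u)^2)^(3/2) du = 3**1.5 / 4
print(round(I_exp, 4), round(I_uni, 4), round(3 ** 1.5 / 4, 4))  # all ≈ 1.299
```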

5 M-H Quantile Residual Entropy for rth Order Statistics

Entropy functions are very popular in applications in finance, tectonophysics, machine learning, reliability theory, etc. However, in reliability and real-life applications, the life-test time is truncated at a specific time, and in such situations Eq. (2) is not an appropriate measure. That is, Shannon's entropy is not adequate when we have knowledge about the component's current age, which should be used when determining its uncertainty. Ebrahimi [14] describes a more practical approach that takes the age into account, defined as

H(X; t) = -∫_t^∞ (f(x)/F̄(t)) log (f(x)/F̄(t)) dx,

where F̄(t) = 1 - F(t) is the survival function.

The M-H residual entropy for the rth order statistic is given by

Mα(Xr:n; t) = (1/(α-1)) [∫_t^∞ (fr:n(x)/F̄r:n(t))^(2-α) dx - 1].

Considering the rth order statistic, the quantile residual entropy function of the M-H entropy is given by

Mα(Xr:n, u) = (1/(α-1)) [(Ḡr(u))^(α-2) ∫_u^1 (gr(p))^(2-α) (q(p))^(α-1) dp - 1], (10)

where Ḡr(u) = 1 - Gr(u) and Gr denotes the beta distribution function with parameters r and (n-r+1).

The following theorem states an important result.

Theorem 5.1: For the rth order statistic, the quantile residual entropy function of the M-H entropy uniquely determines the underlying distribution.

Proof: Using Eq. (10), we obtain

Differentiating both sides with respect to (w.r.t.) u, we obtain

where ′ denotes differentiation w.r.t. u. This equation provides a direct connection between q(u) and Mα(Xr:n, u), which implies that the quantile residual entropy function of the M-H entropy of the rth order statistic uniquely determines the underlying distribution.

Next, we derive the quantile form of the M-H residual entropy of the rth order statistic for some lifetime models.

(i) Govindarajulu's Distribution

The quantile function of the Govindarajulu distribution is

and the corresponding quantile density function is

The quantile residual entropy function of the M-H entropy of the rth order statistic for the Govindarajulu distribution is

Similarly, based on the quantile and quantile density functions, we obtain the quantile-based residual M-H entropy of the rth order statistic for the following distributions.

(ii) Uniform Distribution

Q(u) = a + (b - a)u and q(u) = (b - a), 0 ≤ u ≤ 1; a < b.

(iii) Pareto-I Distribution

(iv) Exponential distribution

(v) Power distribution

Based on the residual M-H quantile entropy of order statistics Mα(Xr:n, u), the following nonparametric classes of life distributions are defined.

Definition 5.1: X is said to have an increasing (a decreasing) M-H quantile entropy of order statistics if Mα(Xr:n, u) is increasing (decreasing) in u ≥ 0.

The following lemma is useful in proving the monotonicity results for Mα(Xr:n, u).

Lemma 5.1: Let f(u, x): R²₊ → R₊ and g: R₊ → R₊ be any two functions. If ∫ f(u, x) dx is increasing and g(u) is increasing (decreasing) in u, then ∫ f(u, x) g(x) dx is increasing (decreasing) in u, provided the integrals exist.

Theorem 5.2: Let X be a non-negative and continuous r.v. with quantile function QX(·) and quantile density qX(·). Define Y = φ(X), where φ(·) is a non-negative, increasing and convex (concave) function. Then,

(i) For 1 < α < 2, Mα(Yr:n, u) increases (or decreases) in u whenever Mα(Xr:n, u) increases (or decreases) in u.

(ii) For 0 < α < 1, Mα(Yr:n, u) increases (or decreases) in u whenever Mα(Xr:n, u) increases (or decreases) in u.

Proof: (i) The quantile density function of Y is given by

Thus, we have

From the given condition, Mα(Xr:n, u) is increasing in u; therefore,

is increasing in u.

Since 1 < α < 2 and φ is a non-negative, increasing and convex (concave) function, (φ′(QX(p)))^(α-1) increases (decreases) and is also non-negative. Consequently, using Lemma 5.1, Eq. (11) is increasing (decreasing), which proves part (i) of the theorem. Similarly, for 0 < α < 1, (φ′(QX(p)))^(α-1) decreases (increases) in p, because φ is increasing and convex (concave). Consequently, Eq. (11) is decreasing (increasing) in u, which proves part (ii) of the theorem. An immediate application of Theorem 5.2 is given below.

Let X be an r.v. following the exponential distribution with failure rate λ. Also, let Y = X^α. Therefore, Y follows a Weibull distribution with shape parameter 1/α. The function φ(x) = x^α, x > 0, α > 0, is convex (concave) if 1 < α < 2 (0 < α < 1). Hence, based on Theorem 5.2, the Weibull distribution has increasing (decreasing) M-H quantile entropy of order statistics if 1 < α < 2 (0 < α < 1).

6 Characterization Theorems Based on M-H Quantile Residual Entropy

This section provides some characterizations for the quantile M-H residual entropies of the smallest and the largest order statistics. The corresponding quantile M-H residual entropies can be determined by substituting r = 1 (for the smallest) and r = n (for the largest) in Eq. (10); they are, respectively, given by

Now, we define the hazard and reversed hazard quantile functions, corresponding to the well-recognized hazard rate and reversed hazard rate functions, respectively, as

K(u) = 1/[(1-u) q(u)] and Λ(u) = 1/[u q(u)].

In numerous practical circumstances, uncertainty does not necessarily relate to the future; it can likewise refer to the past. This idea motivated Di Crescenzo et al. [19] to develop the concept of past entropy on (0, t). If X denotes the lifetime of a component, the past entropy of X is given by

H̄(X; t) = -∫_0^t (f(x)/F(t)) log (f(x)/F(t)) dx, (13)

where F(t) is the cumulative distribution function. As t → ∞, Eq. (13) reduces to Eq. (2).

The quantile form of the past M-H entropy of the rth order statistic is determined by

M̄α(Xr:n, u) = (1/(α-1)) [(Gr(u))^(α-2) ∫_0^u (gr(p))^(2-α) (q(p))^(α-1) dp - 1]. (14)

For the sample maximum Xn:n, the past quantile M-H entropy is given by

Next, we state some properties based on the quantile M-H residual entropy of the smallest order statistic.

Theorem 6.1: Let X1:n represent the first order statistic with survival quantile function and hazard quantile function KX1:n(u). Then, Mα(X1:n, u) is determined by

if and only if:

(i) X follows an exponential distribution, if

(ii) X follows a Pareto distribution with quantile density function q(u) =

(iii) X follows a finite range distribution with quantile density function q(u) =

Proof: Assume that the conditions in Eq. (15) hold. Then, using Eq. (12), we have

Therefore, using the above in Eq. (16) and then differentiating w.r.t. u, we obtain

This implies q(u) = , where A is a constant. Thus, X has an exponential distribution, a Pareto distribution or a finite range distribution, respectively.

Theorem 6.2: For the exponential distribution, the difference between the quantile-based M-H residual entropy of the lifetime of the series system, Mα(X1:n, u), and the M-H residual quantile entropy of the lifetime of each component, Mα(X; u), is independent of u and depends only on α and the number of components of the system.

Proof: For the exponential distribution, we have

Therefore,

which completes the proof of Theorem 6.2.

Theorem 6.3: Let Xn:n represent the largest order statistic with hazard and survival quantile functions. Then, for the sample maximum Xn:n, the past quantile M-H entropy is

if and only if X follows the power distribution.

Proof: The quantile and quantile density functions of the power distribution are, respectively,

Conversely, let Eq. (17) be valid. Then, using Eq. (14), we determine

Taking the derivative w.r.t. u yields

The latter gives

where A denotes a constant. Hence, this characterizes the power distribution.

7 Simulation Study and Application to Real Life Data

In this paper, the quantile-based M-H entropy is proposed for some distributions. However, given the available real data, and to keep the simulation study related to the application part, we investigate the performance of the quantile-based M-H entropy for the exponential distribution.

7.1 Simulation Study

We conducted simulation studies to investigate the efficiency of the quantile-based M-H entropy estimator of the smallest order statistic for the exponential distribution, in terms of the average bias (Bias), variance and mean squared error (MSE), based on sample sizes 10, 25, 100, 200 and 500 for different parameter combinations. The parameter λ was estimated by maximum likelihood (ML) and the process was repeated 2000 times.
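A minimal sketch of this simulation workflow is given below. It is our own illustration, not the authors' code: we assume a rate parameterization Exp(λ) and the closed form Mα(X1:n) = ((nλ)^(1-α)/(2-α) - 1)/(α-1) for the smallest order statistic, which follows from f(1,n)(x) = nλe^(-nλx); the paper's own parameterization and tabulated values may differ.

```python
import numpy as np

rng = np.random.default_rng(42)

def mh_min_entropy(lam, n, alpha):
    # Closed form for the smallest order statistic of n i.i.d. Exp(lam).
    return ((n * lam) ** (1 - alpha) / (2 - alpha) - 1) / (alpha - 1)

lam_true, alpha, n_order, reps = 0.5, 0.2, 10, 2000
true_val = mh_min_entropy(lam_true, n_order, alpha)

for m in (10, 100, 500):                 # sample sizes, as in Section 7.1
    est = np.empty(reps)
    for i in range(reps):
        data = rng.exponential(scale=1 / lam_true, size=m)
        est[i] = mh_min_entropy(1 / data.mean(), n_order, alpha)  # ML plug-in
    bias = est.mean() - true_val
    mse = np.mean((est - true_val) ** 2)
    print(m, round(bias, 4), round(mse, 5))   # bias and MSE shrink with m
```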

From the results of the simulation study (see Tabs. 4 and 5), the following conclusions are drawn regarding the behaviour of the entropy estimator:

(1) The ML estimates of M̂α(X1:n) approach the true value as the sample size n increases.

(2) As the sample size n increases, the MSE and variance of M̂α(X1:n) decrease.

Table 4: Average estimates, Bias, Variance and Mean Squared Error for M̂α(X1:n) under the exponential distribution for different values of λ and fixed value of α = 0.2

n = 10                λ = 0.5      λ = 0.8      λ = 1.5
E[M̂α(X1:n)]          -4.071545     0.529750     0.513037
Bias                  -0.460029    -0.047345    -0.064059
Variance               9.852729     0.175090     0.183407
MSE                   10.064355     0.177332     0.187510

Table 5: Average estimates, Bias, Variance and Mean Squared Error for M̂α(X1:n) under the exponential distribution for different values of α and fixed value of λ = 1.3

n     Criterion        α = 0.1      α = 0.9      α = 1.5
10    E[M̂α(X1:n)]     0.322104     1.655315    -0.099460
10    Bias            -0.034907    -0.010507    -0.028988
10    Variance         0.231985     7.787576     0.091822
10    MSE              0.233204     7.787687     0.092662
25    E[M̂α(X1:n)]     0.387186     2.185596    -0.189670
25    Bias            -0.020296    -0.007890    -0.013950
25    Variance         0.082536     3.056875     0.033740
25    MSE              0.082948     3.056938     0.033935
100   E[M̂α(X1:n)]     0.425782     2.430850    -0.232748
100   Bias            -0.003539    -0.000279    -0.004406
100   Variance         0.017327     0.737979     0.007372
100   MSE              0.017340     0.737979     0.007392

7.2 Application to Real Life Data

The real data in this section represent the failure times of 20 mechanical components, used previously by Murthy et al. [20] to investigate some Weibull models. The data values are: 0.067, 0.068, 0.076, 0.081, 0.084, 0.085, 0.085, 0.086, 0.089, 0.098, 0.098, 0.114, 0.114, 0.115, 0.121, 0.125, 0.131, 0.149, 0.160, 0.485. We use these data for two main purposes: (i) to investigate the performance of our quantile-based M-H entropy (M̂α(X1:n)) in the exponential distribution case, and (ii) to compare M̂α(X1:n) to the quantile-based Tsallis entropy (Ĥα(X1:n)) proposed by Kumar [21].

Based on these data, we first used the maximum likelihood method to estimate the exponential distribution parameter, obtaining 0.122. Then, for different values of α, varied from 0.1 to 0.9, we calculated the estimated values of M̂α(X1:n) and Ĥα(X1:n) for the smallest order statistic under the exponential distribution. The results are displayed in Tab. 6.
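As a quick check (ours, not from the paper): assuming a scale parameterization Exp(θ), the ML estimate is simply the sample mean, which reproduces the reported value of 0.122 for these failure times.

```python
import numpy as np

# Failure times of 20 mechanical components (Murthy et al. [20]).
times = np.array([0.067, 0.068, 0.076, 0.081, 0.084, 0.085, 0.085, 0.086,
                  0.089, 0.098, 0.098, 0.114, 0.114, 0.115, 0.121, 0.125,
                  0.131, 0.149, 0.160, 0.485])
theta_hat = times.mean()     # MLE of the exponential scale (mean) parameter
print(theta_hat)             # ≈ 0.1216, i.e., 0.122 to three decimals
```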

Table 6: Estimates of M̂α(X1:n) and Ĥα(X1:n) for different values of α

α             0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9
M̂α(X1:n)     0.948  1.024  1.209  1.205  1.314  1.439  1.582  1.747  1.937
Ĥα(X1:n)     9.625  6.846  5.546  4.678  4.023  3.502  3.074  2.716  2.415

It should be noted that the estimated values of M̂α(X1:n) generally increase for 0 < α < 1. Also, the results in Tab. 6 indicate clearly that the estimated entropy values based on M̂α(X1:n) are smaller than those given by Ĥα(X1:n).

8 Conclusion

The key focus of this article was to propose a new quantile-based Mathai-Haubold entropy and investigate its characteristics. We also considered the Mathai-Haubold divergence measure and established some of its properties. Further, based on order statistics, we proposed the residual entropy of the quantile-based Mathai-Haubold and proved some of its property results. The performance of the proposed quantile-based Mathai-Haubold entropy was investigated by simulation studies and a real data application example. We found that the ML estimates of M̂α(X1:n) approach the true value as the sample size n increases. For the application part, we compared our proposed quantile-based entropy to the existing quantile entropies, and the results showed that our proposed entropy outperforms the others. Our proposed quantile-based Mathai-Haubold entropy is useful for many future engineering applications such as reliability and mechanical component analysis.

Quantile functions are efficient and equivalent alternatives to distribution functions in the modeling and analysis of statistical data. The scope of these functions and of probability distributions is essential in studying and analyzing real lifetime data. One reason is that they convey the same information about the distribution of the underlying random variable X. However, even though sufficient literature is available on characterizations of probability distributions employing different statistical measures, little work has been done on modeling lifetime data using quantile versions of order statistics. Therefore, future work is necessary to enrich this area, and for this reason we give precise recommendations for future research. First, the results obtained in this article are general because they reduce to the corresponding results for the quantile-based Shannon entropy of order statistics as the parameter approaches unity. Recently, a quantile version of a generalized entropy measure for order statistics for residual and past lifetimes was proposed by Kumar et al. [22]. Nisa et al. [23] presented a quantile version of a two-parameter generalized entropy of order statistics for residual and past lifetimes and derived some characterization results. Moreover, Qiu [24] studied further results on quantile entropy in the past lifetime and gave quantile entropy bounds in the past lifetime for some ageing classes. The ideas presented in these papers can be combined with our results to produce more results and properties for the quantile-based Mathai-Haubold entropy. Second, Krishnan [25] recently introduced a quantile-based cumulative residual Tsallis entropy (CRTE) and extended the quantile-based CRTE to the context of order statistics. Based on these new results and our proposed quantile M-H entropy, one can follow Krishnan [25] and derive a quantile-based cumulative residual M-H entropy and extend it to the context of order statistics. Finally, Krishnan [25] also proposed a cumulative Tsallis entropy in a past lifetime based on the quantile function. As an extension, the cumulative M-H entropy in a past lifetime based on the quantile function can also be derived.

Acknowledgement: The authors would like to thank the anonymous reviewers and the editor for their useful suggestions and comments, which improved the quality of this paper.

Funding Statement: The authors thank and appreciate the Deanship of Scientific Research at King Khalid University for funding this work through the Research Groups Program under Grant No. R.G.P. 2/82/42.

Conflicts of Interest:The authors declare that they have no conflicts of interest to report regarding the present study.