Article

Bayesian Inference for the Kumaraswamy Distribution under Generalized Progressive Hybrid Censoring

Department of Mathematics, Beijing Jiaotong University, Beijing 100044, China
* Author to whom correspondence should be addressed.
Submission received: 24 July 2020 / Revised: 9 September 2020 / Accepted: 12 September 2020 / Published: 15 September 2020
(This article belongs to the Special Issue Bayesian Inference and Computation)

Abstract
Incomplete data are unavoidable in survival analysis and life testing, so more and more researchers are studying censored data. This paper considers the estimation of the unknown parameters of the Kumaraswamy distribution under a generalized progressive hybrid censoring scheme; estimation of the reliability function is also considered. First, the maximum likelihood estimators are derived. In addition, Bayesian estimators are offered under both symmetric and asymmetric loss functions, namely the squared error, linex and general entropy loss functions. Since the Bayesian estimates cannot be computed explicitly, the Lindley approximation as well as the Tierney and Kadane method is employed to obtain them. A simulation study is conducted to compare the performance of the proposed estimators, and a real-life example is presented for illustration.

1. Introduction

1.1. Kumaraswamy Distribution

Given that classical probability distributions like the beta, normal, log-normal, Student-t (Contreras-Reyes et al. [1]) and other empirical distributions cannot fit hydrological data well, Kumaraswamy [2] came up with a new two-parameter distribution specifically applicable to hydrological problems. The cumulative distribution function (cdf) of the Kumaraswamy distribution is
$$ F(x) = 1 - (1 - x^{\alpha})^{\beta}, \quad 0 \le x \le 1, \tag{1} $$
where both $\alpha$ and $\beta$ are positive shape parameters; the distribution is denoted by $K(\alpha, \beta)$ in this paper. The corresponding probability density function (pdf) is
$$ f(x) = \alpha\beta\, x^{\alpha-1}(1 - x^{\alpha})^{\beta-1}, \quad 0 \le x \le 1. \tag{2} $$
In addition, we examine the reliability function of the Kumaraswamy distribution, which is shown below:
$$ R(t) = (1 - t^{\alpha})^{\beta}, \quad t > 0. \tag{3} $$
An increasing number of statisticians have begun to study the Kumaraswamy distribution, since it shares the flexible shape properties of the beta distribution. Depending on the parameter values, its pdf can take diverse shapes: it is uniantimodal (U-shaped) if $\alpha, \beta < 1$; unimodal if $\alpha, \beta > 1$; decreasing if $\alpha \le 1$ and $\beta > 1$; increasing if $\alpha > 1$ and $\beta \le 1$; and constant if $\alpha = \beta = 1$. Figure 1 and Figure 2 illustrate the pdf and cdf, respectively, for the above five cases, namely $\alpha = 2, 0.5, 5, 1, 1$ and $\beta = 5, 0.5, 1, 3, 1$. One may refer to Mitnik [3] for more information. Further, for particular combinations of $(\alpha, \beta)$, the Kumaraswamy distribution reduces to other distributions, such as the uniform, exponential and beta distributions.
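To make the shape cases concrete, here is a minimal Python sketch (the helper names are ours, not part of the paper) that evaluates the pdf (2) and cdf (1) at the five parameter pairs listed above.

```python
import numpy as np

def kuma_pdf(x, a, b):
    """f(x) = a*b*x^(a-1)*(1 - x^a)^(b-1) on (0, 1)."""
    return a * b * x ** (a - 1) * (1 - x ** a) ** (b - 1)

def kuma_cdf(x, a, b):
    """F(x) = 1 - (1 - x^a)^b on (0, 1)."""
    return 1 - (1 - x ** a) ** b

x = np.linspace(0.01, 0.99, 99)
# (2, 5) unimodal, (0.5, 0.5) uniantimodal, (5, 1) increasing,
# (1, 3) decreasing, (1, 1) constant
for a, b in [(2, 5), (0.5, 0.5), (5, 1), (1, 3), (1, 1)]:
    f = kuma_pdf(x, a, b)
    print((a, b), round(f.min(), 3), round(f.max(), 3))
```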
In comparison, the Kumaraswamy distribution is superior to the beta distribution in tractability: the cdf of the beta distribution contains an integral that cannot be simplified, whereas the cdf of the Kumaraswamy distribution has an explicit expression.
What is more, the Kumaraswamy distribution fits many natural phenomena well, such as daily rainfall and water flows (see Fletcher and Ponnambalam [4], Sundar and Subbiah [5], Ponnambalam et al. [6] as well as Seifi et al. [7]), and it is especially suitable for outcomes with upper and lower bounds, like the heights of people, test scores, air temperatures, economic data, etc. Estimation for the Kumaraswamy distribution has gradually attracted the attention of scholars in recent years. Lemonte [8] obtained modified maximum likelihood estimators that are second-order unbiased, and studied a bias-corrected method using a parametric bootstrap. Statisticians then began to study the Kumaraswamy distribution under different censoring schemes, combining classical and Bayesian methods. Ghosh and Nadarajah [9] discussed Bayesian estimation under two loss functions for three types of censoring schemes, left censoring, single Type-II censoring and double Type-II censoring, with one parameter known. Sultana et al. [10] estimated the parameters of the Kumaraswamy distribution under a hybrid censoring scheme, and recently Sultana et al. [11] combined hybrid and progressive Type-I censoring schemes to explore parameter estimation for the same distribution.

1.2. Generalized Progressive Hybrid Censoring Scheme

Life testing experiments are widely used in engineering, biology, machinery and other fields, and can be formulated as mathematical and probabilistic models in survival analysis. In reality, several restrictions, such as time and cost, prevent us from observing the failure times of all units, and it is common to stop the experiment before all units have failed. Such limitations result in censored data. Among all censoring schemes, the two most typical are the Type-I and Type-II censoring schemes. Plenty of authors have discussed this aspect; one may consult Meeker and Escobar [12], which includes methods of handling Type-I and Type-II censored data.
An accidental pause or an unavoidable loss of experimental units may also happen before the final termination, yet neither of these two schemes permits the removal of units during the experiment. To resolve this inflexibility, Cohen [13] first introduced the progressive censoring scheme. A progressive Type-II censored sample is obtained as follows. Suppose that $n$ independent units with a common lifetime distribution, denoted by $X_1, X_2, \ldots, X_n$, are placed on test at $t = 0$. When the first failure occurs, $R_1$ of the $n - 1$ surviving units are removed from the experiment at random. Similarly, when the second failure happens, $R_2$ units are randomly removed from the $n - 2 - R_1$ survivors. The procedure is repeated until the $m$th failure is observed, at which point all remaining $R_m = n - m - R_1 - R_2 - \cdots - R_{m-1}$ survivors are removed. The progressive censoring scheme $(R_1, R_2, \ldots, R_m)$ is prefixed and satisfies $\sum_{i=1}^{m} R_i + m = n$, and the $m$ ordered failure times are written as $X_1 < X_2 < \cdots < X_m$.
A weakness of the progressive Type-II censoring scheme is that, if the experimental units are highly reliable, the experiment may last very long. Hence, the progressive hybrid censoring scheme was introduced by Kundu and Joarder [14]. In this scheme, $n$ independent, identically distributed units are placed on test, and the experimenter stops at $\min\{T, X_m\}$, where the time $T$ and the integer $m$ ($1 \le m \le n$) are determined ahead of time. In contrast with the progressive Type-II scheme, the experiment then lasts no longer than $T$.
However, when the prefixed termination time $T$ is small, the number of observed failures may be insufficient. Therefore, a new censoring scheme, the generalized progressive hybrid censoring scheme, was proposed by Cho and Sun [15]; it enables us to guarantee a predetermined minimum number of failures. How generalized progressive hybrid censored data are obtained is described below and illustrated graphically in Figure 3.
Assume that our research group possesses $n$ independent units with a common lifetime distribution, whose lifetimes are denoted by $X_1, X_2, \ldots, X_n$. The integers $k$ and $m$ ($k < m$) are predetermined between zero and $n$, and $R_1, R_2, \ldots, R_m$ are preplanned integers satisfying $\sum_{i=1}^{m} R_i + m = n$. On the occurrence of the first failure $X_1$, we randomly remove $R_1$ units. When the second failure $X_2$ happens, we randomly remove $R_2$ units from the $n - 2 - R_1$ survivors. The process is repeated and terminated at $T^{*} = \max\{\min\{T, X_m\}, X_k\}$, when all remaining survivors are removed. This greatly modifies the previous schemes, in that the experiment may continue past the prefixed cut-off time $T$ when the number of observed failures is still insufficient. Under the generalized progressive hybrid censoring scheme, researchers would like to obtain $m$ failures, but they also accept $k$ failures as the bare minimum. We denote the scheme by $(R_1, R_2, \ldots, R_m)$. Let $J$ be the number of failures observed before the predetermined time $T$. The generalized progressive hybrid censoring scheme can then be classified into the following cases:
$$ \text{Case I: } X_1, \ldots, X_J, \ldots, X_k, \quad \text{for } T < X_k < X_m; $$
$$ \text{Case II: } X_1, \ldots, X_k, \ldots, X_J, \quad \text{for } X_k < T < X_m; $$
$$ \text{Case III: } X_1, \ldots, X_k, \ldots, X_m, \quad \text{for } X_k < X_m < T. $$
Case III reduces to the progressive Type-II censoring scheme, and the mixture of Cases II and III is the progressive hybrid censoring scheme; Case I is thus the genuine modification introduced by this scheme. Letting $R_1 = R_2 = \cdots = R_m = 0$, we obtain a complete sample, while setting $R_1 = \cdots = R_{m-1} = 0$, $R_m = n - m$ yields a Type-II censored sample.
Under the generalized progressive hybrid censoring scheme, the likelihood functions for the three cases are
$$ \text{Case I: } L_1 = Q_1 \prod_{j=1}^{k-1} f(x_j)\,[1-F(x_j)]^{R_j}\, f(x_k)\,[1-F(x_k)]^{R_k^{*}}, $$
$$ \text{Case II: } L_2 = Q_2 \prod_{j=1}^{J} f(x_j)\,[1-F(x_j)]^{R_j}\,[1-F(T)]^{R_{J+1}^{*}}, $$
$$ \text{Case III: } L_3 = Q_3 \prod_{j=1}^{m} f(x_j)\,[1-F(x_j)]^{R_j}, \tag{4} $$
where $Q_1 = \prod_{j=1}^{k}\sum_{i=j}^{m}(R_i+1)$, $Q_2 = \prod_{j=1}^{J}\sum_{i=j}^{m}(R_i+1)$, $Q_3 = \prod_{j=1}^{m}\sum_{i=j}^{m}(R_i+1)$, $R_k^{*} = n - k - \sum_{i=1}^{k-1}R_i$ and $R_{J+1}^{*} = n - J - \sum_{i=1}^{J}R_i$.
To the best of our knowledge, estimation for the Kumaraswamy distribution under the generalized progressive hybrid censoring scheme has not been studied in the previous literature. Hence, this paper discusses the estimation of the parameters and the reliability of the Kumaraswamy distribution on the basis of this model.
The rest of the article is organized as follows. In the next section, the maximum likelihood estimators of the two unknown parameters and of the reliability function are derived. In Section 3, Bayes estimators of all unknown quantities are obtained under three diverse loss functions, employing Lindley's approximation as well as the Tierney and Kadane method. In Section 4, a simulation experiment is carried out based on the conclusions of Section 2 and Section 3. Data analysis is demonstrated in Section 5. Finally, Section 6 concludes the paper.

2. Maximum Likelihood Estimation

Maximum likelihood estimation (MLE) is an effective and classical approach widely employed by statisticians for reliability problems and survival analysis. With this method, the estimators of the two unknown parameters are derived, from which the MLE of $R(t)$ follows. Plugging the pdf and cdf of the Kumaraswamy distribution, i.e., (2) and (1), into the likelihood formula (4), the likelihood functions of $\alpha$ and $\beta$, after neglecting the constants, are
$$ \text{Case I: } L_1 \propto (\alpha\beta)^{k}\prod_{j=1}^{k} x_j^{\alpha-1}(1-x_j^{\alpha})^{\beta(1+R_j)-1}, $$
$$ \text{Case II: } L_2 \propto (\alpha\beta)^{J}(1-T^{\alpha})^{\beta R_{J+1}^{*}}\prod_{j=1}^{J} x_j^{\alpha-1}(1-x_j^{\alpha})^{\beta(1+R_j)-1}, $$
$$ \text{Case III: } L_3 \propto (\alpha\beta)^{m}\prod_{j=1}^{m} x_j^{\alpha-1}(1-x_j^{\alpha})^{\beta(1+R_j)-1}. $$
Disregarding the constant, the log-likelihood functions are
Case I:
$$ l_1 \propto k\log(\alpha\beta) + (\alpha-1)\sum_{j=1}^{k}\log x_j + \beta\sum_{j=1}^{k}(1+R_j)\log(1-x_j^{\alpha}) - \sum_{j=1}^{k}\log(1-x_j^{\alpha}), $$
Case II:
$$ l_2 \propto J\log(\alpha\beta) + (\alpha-1)\sum_{j=1}^{J}\log x_j + \beta\Big[\sum_{j=1}^{J}(1+R_j)\log(1-x_j^{\alpha}) + R_{J+1}^{*}\log(1-T^{\alpha})\Big] - \sum_{j=1}^{J}\log(1-x_j^{\alpha}), $$
Case III:
$$ l_3 \propto m\log(\alpha\beta) + (\alpha-1)\sum_{j=1}^{m}\log x_j + \beta\sum_{j=1}^{m}(1+R_j)\log(1-x_j^{\alpha}) - \sum_{j=1}^{m}\log(1-x_j^{\alpha}). $$
To simplify the above expressions, we combine Cases I, II and III and obtain the log-likelihood function
$$ l \propto D\log(\alpha\beta) + (\alpha-1)\sum_{j=1}^{D}\log x_j + \beta\Big[\sum_{j=1}^{D}(1+R_j)\log(1-x_j^{\alpha}) + E(\alpha)\Big] - \sum_{j=1}^{D}\log(1-x_j^{\alpha}), \tag{5} $$
where $D = k$, $E(\alpha) = 0$ for Case I; $D = J$, $E(\alpha) = R_{J+1}^{*}\log(1-T^{\alpha})$ for Case II; and $D = m$, $E(\alpha) = 0$ for Case III.
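For later use, a minimal Python sketch of the combined log-likelihood (5) follows; the function and argument names are ours, and the Case II arguments T and R_star correspond to $T$ and $R_{J+1}^{*}$.

```python
import numpy as np

def loglik(alpha, beta, x, R, case='III', T=None, R_star=None):
    """Combined log-likelihood (5); x holds the D observed failures,
    R the removal numbers R_1, ..., R_D."""
    x = np.asarray(x, float)
    R = np.asarray(R, float)
    D = len(x)
    E = R_star * np.log(1 - T ** alpha) if case == 'II' else 0.0
    return (D * np.log(alpha * beta)
            + (alpha - 1) * np.sum(np.log(x))
            + beta * (np.sum((1 + R) * np.log(1 - x ** alpha)) + E)
            - np.sum(np.log(1 - x ** alpha)))
```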
Taking the partial derivatives of (5) with respect to $\alpha$ and $\beta$, we obtain the set of likelihood equations
$$ \frac{\partial l}{\partial \beta} = \frac{D}{\beta} + \sum_{j=1}^{D}(1+R_j)\log(1-x_j^{\alpha}) + E(\alpha) = 0 \tag{6} $$
and
$$ \frac{\partial l}{\partial \alpha} = \frac{D}{\alpha} + \sum_{j=1}^{D}\log x_j - \beta\Big[\sum_{j=1}^{D}\frac{(1+R_j)\,x_j^{\alpha}\log x_j}{1-x_j^{\alpha}} + E^{(1)}(\alpha)\Big] + \sum_{j=1}^{D}\frac{x_j^{\alpha}\log x_j}{1-x_j^{\alpha}} = 0, \tag{7} $$
where
$$ E^{(1)}(\alpha) = \begin{cases} 0 & \text{for Cases I and III}, \\ R_{J+1}^{*}\,T^{\alpha}\log T/(1-T^{\alpha}) & \text{for Case II}. \end{cases} $$
Solving this set of equations yields the MLEs of the two parameters. From Equation (6), we obtain the maximum likelihood estimate of $\beta$ as
$$ \hat{\beta}(\alpha) = -\frac{D}{\sum_{j=1}^{D}(1+R_j)\log(1-x_j^{\alpha}) + E(\alpha)}. \tag{8} $$
Substituting this estimate of $\beta$ into Equation (7), we get
$$ g(\alpha) = \alpha, \tag{9} $$
where
$$ g(\alpha) = D\Big[-\sum_{j=1}^{D}\log x_j + \hat{\beta}(\alpha)\Big(\sum_{j=1}^{D}\frac{(1+R_j)\,x_j^{\alpha}\log x_j}{1-x_j^{\alpha}} + E^{(1)}(\alpha)\Big) - \sum_{j=1}^{D}\frac{x_j^{\alpha}\log x_j}{1-x_j^{\alpha}}\Big]^{-1}. $$
Since both (8) and (9) are nonlinear, solutions in closed form are hard to obtain. In this situation, updating the estimate iteratively is an effective way to approximate $\alpha$; this iterative algorithm was proposed by Kundu [16], and we give a brief description here. Begin with an initial guess $\alpha^{(0)}$, compute $\alpha^{(1)} = g(\alpha^{(0)})$, and repeat the iteration $\alpha^{(n+1)} = g(\alpha^{(n)})$ until the tolerance set beforehand is met, i.e., $|\alpha^{(n+1)} - \alpha^{(n)}| < \varepsilon$. Once the MLE of $\alpha$, denoted $\hat{\alpha}$, is obtained, the MLE of $\beta$ follows as $\hat{\beta} = \hat{\beta}(\hat{\alpha})$. Substituting the two estimates into (3) then gives the MLE of $R(t)$:
$$ \hat{R}(t) = (1-t^{\hat{\alpha}})^{\hat{\beta}}, \quad t > 0. $$
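A sketch of this iteration in Python is given below, written for Case III (progressive Type-II censoring) so that $E(\alpha) = E^{(1)}(\alpha) = 0$; the function names, starting value and tolerance are ours, chosen only for illustration.

```python
import numpy as np

def beta_hat(alpha, x, R):
    """Equation (8) with E(alpha) = 0 (Case III)."""
    return -len(x) / np.sum((1 + R) * np.log(1 - x ** alpha))

def g(alpha, x, R):
    """The fixed-point map of Equation (9) with E^(1)(alpha) = 0 (Case III)."""
    s = x ** alpha * np.log(x) / (1 - x ** alpha)
    return len(x) / (-np.sum(np.log(x))
                     + beta_hat(alpha, x, R) * np.sum((1 + R) * s)
                     - np.sum(s))

def mle(x, R, alpha0=1.0, eps=1e-8, max_iter=500):
    """Iterate alpha_(n+1) = g(alpha_n) until |alpha_(n+1) - alpha_n| < eps."""
    x = np.asarray(x, float)
    R = np.asarray(R, float)
    alpha = alpha0
    for _ in range(max_iter):
        alpha_new = g(alpha, x, R)
        if abs(alpha_new - alpha) < eps:
            break
        alpha = alpha_new
    return alpha, beta_hat(alpha, x, R)
```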

3. Bayesian Estimation

Bayesian estimation, which combines prior information with sample information, is an alternative to maximum likelihood estimation and can be more precise when appropriate prior knowledge is available. In this section, Bayesian estimation of the model parameters $\alpha$ and $\beta$ and of the reliability function is discussed.

3.1. Symmetric and Asymmetric Loss Functions

In statistics, we usually estimate a parameter by minimizing a loss function, and there are many diverse symmetric and asymmetric loss functions. Here we describe the three typical loss functions taken into consideration. In all of the following cases, $d(\eta)$ stands for the true value of the unknown quantity and $\hat{d}(\eta)$ is the corresponding estimate. The symmetric one is the squared error loss (SEL) function; it is the most prevalent choice and is easily justified by its connection with minimum-variance unbiased estimation. Its definition and the corresponding Bayes estimator are
$$ L_S\big(d(\eta), \hat{d}(\eta)\big) = \big(\hat{d}(\eta) - d(\eta)\big)^2, $$
$$ \hat{d}_{SEL} = E_{\eta}(\eta \mid \underline{x}). $$
However, due to its symmetry, the SEL function gives overestimation the same weight as underestimation of the same magnitude, which has given rise to a large number of asymmetric loss functions. The linex loss (LL) function [17], an extensively adopted asymmetric loss function, is the second loss function discussed in this paper. Its definition and the corresponding Bayes estimator are
$$ L_L\big(d(\eta), \hat{d}(\eta)\big) = e^{p(\hat{d}(\eta) - d(\eta))} - p\big(\hat{d}(\eta) - d(\eta)\big) - 1, \quad p \neq 0, $$
$$ \hat{d}_{LL} = -\frac{1}{p}\log E_{\eta}\big(e^{-p\eta} \mid \underline{x}\big). $$
The sign of $p$ determines the direction of the asymmetry, and its magnitude reflects the degree of asymmetry. When $p < 0$, underestimation incurs a greater loss than overestimation of the same magnitude, and the opposite holds when $p > 0$. As $p$ tends to zero, the linex loss function reduces to the SEL function.
In addition, an asymmetric loss function, the general entropy loss (EL) function, is also considered; its definition and the corresponding Bayes estimator are
$$ L_E\big(d(\eta), \hat{d}(\eta)\big) = \Big(\frac{\hat{d}(\eta)}{d(\eta)}\Big)^{q} - q\log\frac{\hat{d}(\eta)}{d(\eta)} - 1, \quad q \neq 0, $$
$$ \hat{d}_{EL} = \big[E_{\eta}(\eta^{-q} \mid \underline{x})\big]^{-1/q}. $$
Here, a positive error incurs a greater loss than a negative one of the same magnitude when $q > 0$, and the opposite holds when $q < 0$.
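The three estimators are easy to illustrate numerically. The following Python sketch (our own illustration, not code from the paper) applies the three formulas to draws from a stand-in gamma posterior, showing how $p$ and $q$ pull the estimate away from the posterior mean.

```python
import numpy as np

rng = np.random.default_rng(0)
eta = rng.gamma(shape=9.0, scale=1 / 3.0, size=100_000)  # stand-in posterior draws

d_sel = eta.mean()                                  # SEL: posterior mean
p = 0.8
d_ll = -np.log(np.mean(np.exp(-p * eta))) / p       # LL: -(1/p) log E[exp(-p*eta)]
q = 0.8
d_el = np.mean(eta ** (-q)) ** (-1 / q)             # EL: E[eta^(-q)]^(-1/q)
print(d_sel, d_ll, d_el)  # d_ll and d_el fall below the mean for p, q > 0
```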

3.2. Prior and Posterior Distributions

Since a natural conjugate bivariate prior distribution for $\alpha$ and $\beta$ does not exist, we adopt the same assumption as Kundu and Pradhan [18], supposing that $\alpha$ and $\beta$ independently follow gamma distributions, because the gamma distribution can adapt to various shapes depending on its parameter values. Ignoring the constant coefficient, the joint prior distribution is of the form
$$ \pi(\alpha, \beta) \propto \alpha^{a-1}\beta^{c-1}e^{-b\alpha}e^{-d\beta}, \quad \alpha > 0, \ \beta > 0. $$
Here, the positive hyperparameters $a, b, c, d$ embody the prior knowledge and information about $\alpha$ and $\beta$. On this basis, the joint distribution of the parameters and the data is
$$ \pi(\alpha, \beta, X) \propto \alpha^{D+a-1}\beta^{D+c-1}e^{\beta(E(\alpha)-d) - b\alpha}\prod_{j=1}^{D} x_j^{\alpha-1}(1-x_j^{\alpha})^{\beta(1+R_j)-1}, $$
where $X$ denotes the observations $X_1, X_2, \ldots$. The posterior distribution is then
$$ \pi(\alpha, \beta \mid X) = \frac{\pi(\alpha, \beta, X)}{\int_{0}^{\infty}\int_{0}^{\infty}\pi(\alpha, \beta, X)\,d\alpha\,d\beta}. $$
Under the SEL function, the Bayes estimators can be gained as
$$ \hat{\psi}_S = \frac{\int_{0}^{\infty}\int_{0}^{\infty}\psi\,\pi(\alpha, \beta, X)\,d\alpha\,d\beta}{\int_{0}^{\infty}\int_{0}^{\infty}\pi(\alpha, \beta, X)\,d\alpha\,d\beta}, $$
where $\psi$ stands for $\alpha$, $\beta$ or the reliability function.
Under the LL function, the Bayes estimators are obtained as
$$ \hat{\psi}_L = -\frac{1}{p}\log\Bigg\{\frac{\int_{0}^{\infty}\int_{0}^{\infty}e^{-p\psi}\,\pi(\alpha, \beta, X)\,d\alpha\,d\beta}{\int_{0}^{\infty}\int_{0}^{\infty}\pi(\alpha, \beta, X)\,d\alpha\,d\beta}\Bigg\}. $$
Under the EL function, the Bayes estimators are written as
$$ \hat{\psi}_E = \Bigg[\frac{\int_{0}^{\infty}\int_{0}^{\infty}\psi^{-q}\,\pi(\alpha, \beta, X)\,d\alpha\,d\beta}{\int_{0}^{\infty}\int_{0}^{\infty}\pi(\alpha, \beta, X)\,d\alpha\,d\beta}\Bigg]^{-1/q}. $$
The above Bayes estimators are all ratios of two integrals for which explicit expressions are hard to acquire. Therefore, proper methods are needed to approximate these integrals, and we introduce Lindley's approximation as well as the Tierney and Kadane method to obtain approximate estimators in closed form.

3.3. Lindley’s Approximation

Lindley [19] deduced a general formula by developing asymptotic expansions for ratios of integrals. To apply this method, we write $\mu = (\mu_1, \mu_2)$ and consider a function $g(\mu_1, \mu_2)$ whose posterior expectation is the ratio of integrals
$$ E\big(g(\mu_1, \mu_2)\big) = \frac{\int_{0}^{\infty}\int_{0}^{\infty} g(\mu_1, \mu_2)\,\pi(\mu_1, \mu_2, X)\,d\mu_1\,d\mu_2}{\int_{0}^{\infty}\int_{0}^{\infty}\pi(\mu_1, \mu_2, X)\,d\mu_1\,d\mu_2} = \frac{\int_{0}^{\infty}\int_{0}^{\infty} g(\mu_1, \mu_2)\,e^{l(\mu_1, \mu_2 \mid X) + \kappa(\mu_1, \mu_2)}\,d\mu_1\,d\mu_2}{\int_{0}^{\infty}\int_{0}^{\infty} e^{l(\mu_1, \mu_2 \mid X) + \kappa(\mu_1, \mu_2)}\,d\mu_1\,d\mu_2}, $$
where $g(\mu_1, \mu_2)$ is a certain function of $\mu_1$ and $\mu_2$, $l(\mu_1, \mu_2 \mid X)$ is the log-likelihood function and $\kappa(\mu_1, \mu_2) = \log\pi(\mu_1, \mu_2)$. Employing the Lindley method, the posterior expectation of $g(\mu_1, \mu_2)$ can be approximated as
$$ \hat{g} = g(\hat{\mu}_1, \hat{\mu}_2) + 0.5\big[A + l_{03}B_{21} + l_{30}B_{12} + l_{12}C_{21} + l_{21}C_{12}\big] + \kappa_1 A_{12} + \kappa_2 A_{21}, \tag{14} $$
where
$$ A = \sum_{i=1}^{2}\sum_{j=1}^{2} w_{ij}\tau_{ij}, \quad l_{ij} = \frac{\partial^{\,i+j} l(\mu_1, \mu_2)}{\partial \mu_1^{i}\,\partial \mu_2^{j}} \ (i, j = 0, 1, 2, 3;\ i + j = 3), \quad \kappa_i = \frac{\partial \kappa}{\partial \mu_i}, \quad w_i = \frac{\partial g}{\partial \mu_i}, \quad w_{ij} = \frac{\partial^2 g}{\partial \mu_i\,\partial \mu_j}, $$
$$ A_{ij} = w_i\tau_{ii} + w_j\tau_{ji}, \quad B_{ij} = (w_i\tau_{ii} + w_j\tau_{ij})\tau_{ii}, \quad C_{ij} = 3w_i\tau_{ii}\tau_{ij} + w_j(\tau_{ii}\tau_{jj} + 2\tau_{ij}^2). $$
Here $\tau_{ij}$ is the $(i, j)$th element of the inverse matrix $\big[-\partial^2 l(\mu_1, \mu_2 \mid X)/\partial\mu_i\,\partial\mu_j\big]^{-1}$, and all quantities are evaluated at the MLE $(\hat{\mu}_1, \hat{\mu}_2)$.
For our estimation problem, $\mu = (\alpha, \beta)$; let us now deduce the required quantities in detail (a code sketch of the resulting formulas is given at the end of this subsection). For convenience, we denote
$$ \tau_{11} = -\frac{H}{M}, \quad \tau_{22} = -\frac{G}{M}, \quad \tau_{12} = \tau_{21} = \frac{I}{M}, \quad M = GH - I^{2}, $$
where
$$ G = \frac{\partial^2 l}{\partial \alpha^2} = -\frac{D}{\alpha^2} - \beta\Big[\sum_{j=1}^{D}\frac{(1+R_j)\,x_j^{\alpha}(\log x_j)^2}{(1-x_j^{\alpha})^2} + E^{(2)}(\alpha)\Big] + \sum_{j=1}^{D}\frac{x_j^{\alpha}(\log x_j)^2}{(1-x_j^{\alpha})^2}, $$
$$ H = \frac{\partial^2 l}{\partial \beta^2} = -\frac{D}{\beta^2}, \qquad I = \frac{\partial^2 l}{\partial \alpha\,\partial \beta} = -\Big[\sum_{j=1}^{D}\frac{(1+R_j)\,x_j^{\alpha}\log x_j}{1-x_j^{\alpha}} + E^{(1)}(\alpha)\Big]. $$
Also, for our problem we have
$$ l_{30} = \frac{\partial^3 l}{\partial \alpha^3} = \frac{2D}{\alpha^3} - \beta\Big[\sum_{j=1}^{D}\frac{(1+R_j)(\log x_j)^3\,x_j^{\alpha}(1+x_j^{\alpha})}{(1-x_j^{\alpha})^3} + E^{(3)}(\alpha)\Big] + \sum_{j=1}^{D}\frac{(\log x_j)^3\,x_j^{\alpha}(1+x_j^{\alpha})}{(1-x_j^{\alpha})^3}, $$
$$ l_{21} = \frac{\partial^3 l}{\partial \alpha^2\,\partial \beta} = -\Big[\sum_{j=1}^{D}\frac{(1+R_j)\,x_j^{\alpha}(\log x_j)^2}{(1-x_j^{\alpha})^2} + E^{(2)}(\alpha)\Big], \quad l_{12} = \frac{\partial^3 l}{\partial \alpha\,\partial \beta^2} = 0, \quad l_{03} = \frac{\partial^3 l}{\partial \beta^3} = \frac{2D}{\beta^3}, $$
$$ \kappa_1 = \frac{a-1}{\alpha} - b, \qquad \kappa_2 = \frac{c-1}{\beta} - d, $$
where
$$ E^{(2)}(\alpha) = \begin{cases} 0 & \text{for Cases I and III}, \\ R_{J+1}^{*}\,T^{\alpha}(\log T)^2/(1-T^{\alpha})^2 & \text{for Case II}, \end{cases} $$
$$ E^{(3)}(\alpha) = \begin{cases} 0 & \text{for Cases I and III}, \\ R_{J+1}^{*}\,T^{\alpha}(\log T)^3(1+T^{\alpha})/(1-T^{\alpha})^3 & \text{for Case II}. \end{cases} $$
  • Under the SEL function,
    -
    when g ( α , β ) = α , we observe that
    w 1 = 1 , w 2 = w 11 = w 12 = w 21 = w 22 = 0 .
    Using the above Equation (14), the Bayes estimator of α can be obtained as
$$ \hat{\alpha}_S = \hat{\alpha} + \frac{1}{M}\big(-H\kappa_1 + I\kappa_2\big) + \frac{0.5}{M^2}\big[H^{2}l_{30} - GI\,l_{03} - 3HI\,l_{21}\big]. $$
    -
    When g ( α , β ) = β , we can derive that
    w 2 = 1 , w 1 = w 11 = w 12 = w 21 = w 22 = 0 .
    Likewise, the Bayes estimator of β can be gained as
$$ \hat{\beta}_S = \hat{\beta} + \frac{1}{M}\big(I\kappa_1 - G\kappa_2\big) + \frac{0.5}{M^2}\big[-HI\,l_{30} + G^{2}l_{03} + (HG + 2I^{2})\,l_{21}\big]. $$
    -
Letting $g(\alpha, \beta) = R(t)$, we have
$$ w_1 = -\beta t^{\alpha}\log t\,(1-t^{\alpha})^{\beta-1}, \qquad w_2 = (1-t^{\alpha})^{\beta}\log(1-t^{\alpha}), $$
$$ w_{11} = \beta t^{\alpha}(\log t)^2(1-t^{\alpha})^{\beta-2}(\beta t^{\alpha} - 1), \qquad w_{22} = (1-t^{\alpha})^{\beta}\big[\log(1-t^{\alpha})\big]^2, $$
$$ w_{12} = w_{21} = -t^{\alpha}\log t\,(1-t^{\alpha})^{\beta-1}\big[1 + \beta\log(1-t^{\alpha})\big]. $$
    The Bayes estimator of R ( t ) can be computed by
$$ \hat{R}_S = \hat{R}(t) + \frac{0.5}{M^2}\big[M(-Hw_{11} + 2Iw_{12} - Gw_{22}) + H(Hw_1 - Iw_2)l_{30} + G(Gw_2 - Iw_1)l_{03} + \big(-3HIw_1 + (HG + 2I^{2})w_2\big)l_{21}\big] + \frac{1}{M}\big[(-Hw_1 + Iw_2)\kappa_1 + (-Gw_2 + Iw_1)\kappa_2\big]. $$
  • Under the LL function,
    -
when $g(\alpha, \beta) = e^{-p\alpha}$, we observe that
$$ w_1 = -p\,e^{-p\alpha}, \quad w_{11} = p^{2}e^{-p\alpha}, \quad w_2 = w_{22} = w_{12} = w_{21} = 0. $$
    The Bayes estimator of α can be obtained as
$$ \hat{\alpha}_L = -\frac{1}{p}\log\Big\{e^{-p\hat{\alpha}} + \frac{0.5}{M^2}\big[-MHw_{11} + H^{2}w_1 l_{30} - GIw_1 l_{03} - 3HIw_1 l_{21}\big] + \frac{1}{M}\big(-Hw_1\kappa_1 + Iw_1\kappa_2\big)\Big\}. $$
    -
When $g(\alpha, \beta) = e^{-p\beta}$, we can derive that
$$ w_2 = -p\,e^{-p\beta}, \quad w_{22} = p^{2}e^{-p\beta}, \quad w_1 = w_{11} = w_{12} = w_{21} = 0. $$
    Similarly, the Bayes estimator of β is in the form of
$$ \hat{\beta}_L = -\frac{1}{p}\log\Big\{e^{-p\hat{\beta}} + \frac{0.5}{M^2}\big[-MGw_{22} - HIw_2 l_{30} + G^{2}w_2 l_{03} + (HG + 2I^{2})w_2 l_{21}\big] + \frac{1}{M}\big(Iw_2\kappa_1 - Gw_2\kappa_2\big)\Big\}. $$
    -
Letting $g(\alpha, \beta) = e^{-p(1-t^{\alpha})^{\beta}}$, we have
$$ w_1 = p\beta t^{\alpha}\log t\,(1-t^{\alpha})^{\beta-1}e^{-p(1-t^{\alpha})^{\beta}}, \qquad w_2 = -p(1-t^{\alpha})^{\beta}\log(1-t^{\alpha})\,e^{-p(1-t^{\alpha})^{\beta}}, $$
$$ w_{11} = p\beta t^{\alpha}(\log t)^2(1-t^{\alpha})^{\beta-2}e^{-p(1-t^{\alpha})^{\beta}}\big[1 - \beta t^{\alpha} + p\beta t^{\alpha}(1-t^{\alpha})^{\beta}\big], $$
$$ w_{22} = -p\big[\log(1-t^{\alpha})\big]^2(1-t^{\alpha})^{\beta}e^{-p(1-t^{\alpha})^{\beta}}\big[1 - p(1-t^{\alpha})^{\beta}\big], $$
$$ w_{12} = w_{21} = p\,t^{\alpha}\log t\,(1-t^{\alpha})^{\beta-1}e^{-p(1-t^{\alpha})^{\beta}}\big[1 + \beta\log(1-t^{\alpha}) - p\beta(1-t^{\alpha})^{\beta}\log(1-t^{\alpha})\big]. $$
The Bayes estimator of $R(t)$ is acquired as
$$ \hat{R}_L = -\frac{1}{p}\log\Big\{e^{-p\hat{R}(t)} + \frac{0.5}{M^2}\big[M(-Hw_{11} + 2Iw_{12} - Gw_{22}) + H(Hw_1 - Iw_2)l_{30} + G(Gw_2 - Iw_1)l_{03} + \big(-3HIw_1 + (HG + 2I^{2})w_2\big)l_{21}\big] + \frac{1}{M}\big[(-Hw_1 + Iw_2)\kappa_1 + (-Gw_2 + Iw_1)\kappa_2\big]\Big\}. $$
  • Under the EL function,
    -
when $g(\alpha, \beta) = \alpha^{-q}$, we observe that
$$ w_1 = -q\alpha^{-q-1}, \quad w_{11} = q(q+1)\alpha^{-q-2}, \quad w_2 = w_{22} = w_{12} = w_{21} = 0. $$
    The Bayes estimator of α can be obtained as
$$ \hat{\alpha}_E = \Big\{\hat{\alpha}^{-q} + \frac{0.5}{M^2}\big[-MHw_{11} + H^{2}w_1 l_{30} - GIw_1 l_{03} - 3HIw_1 l_{21}\big] + \frac{1}{M}\big(-Hw_1\kappa_1 + Iw_1\kappa_2\big)\Big\}^{-1/q}. $$
    -
When $g(\alpha, \beta) = \beta^{-q}$, we can derive that
$$ w_2 = -q\beta^{-q-1}, \quad w_{22} = q(q+1)\beta^{-q-2}, \quad w_1 = w_{11} = w_{12} = w_{21} = 0. $$
    Analogously, the Bayes estimator of β is in the form of
$$ \hat{\beta}_E = \Big\{\hat{\beta}^{-q} + \frac{0.5}{M^2}\big[-MGw_{22} - HIw_2 l_{30} + G^{2}w_2 l_{03} + (HG + 2I^{2})w_2 l_{21}\big] + \frac{1}{M}\big(Iw_2\kappa_1 - Gw_2\kappa_2\big)\Big\}^{-1/q}. $$
    -
Letting $g(\alpha, \beta) = (1-t^{\alpha})^{-q\beta}$, we have
$$ w_1 = q\beta t^{\alpha}\log t\,(1-t^{\alpha})^{-q\beta-1}, \qquad w_2 = -q(1-t^{\alpha})^{-q\beta}\log(1-t^{\alpha}), $$
$$ w_{11} = q\beta t^{\alpha}(\log t)^2(1-t^{\alpha})^{-q\beta-2}(1 + q\beta t^{\alpha}), \qquad w_{22} = q^{2}\big[\log(1-t^{\alpha})\big]^2(1-t^{\alpha})^{-q\beta}, $$
$$ w_{12} = w_{21} = q\,t^{\alpha}\log t\,(1-t^{\alpha})^{-q\beta-1}\big[1 - q\beta\log(1-t^{\alpha})\big]. $$
    The Bayes estimate of R ( t ) can be achieved by
$$ \hat{R}_E = \Big\{\hat{R}(t)^{-q} + \frac{0.5}{M^2}\big[M(-Hw_{11} + 2Iw_{12} - Gw_{22}) + H(Hw_1 - Iw_2)l_{30} + G(Gw_2 - Iw_1)l_{03} + \big(-3HIw_1 + (HG + 2I^{2})w_2\big)l_{21}\big] + \frac{1}{M}\big[(-Hw_1 + Iw_2)\kappa_1 + (-Gw_2 + Iw_1)\kappa_2\big]\Big\}^{-1/q}. $$
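As a numerical illustration of the formulas above, the following Python sketch (our own naming; Case III, SEL and the estimate of $\alpha$ only, so that all $E^{(i)}(\alpha) = 0$) assembles $G$, $H$, $I$, $M$, the third derivatives and the prior terms at the MLEs and returns $\hat{\alpha}_S$.

```python
import numpy as np

def lindley_alpha_sel(alpha, beta, x, R, a, b, c, d):
    """Lindley estimate of alpha under SEL, Case III; (alpha, beta) are the MLEs."""
    x = np.asarray(x, float)
    R = np.asarray(R, float)
    D = len(x)
    u = x ** alpha
    lx = np.log(x)
    G = -D / alpha ** 2 - beta * np.sum((1 + R) * u * lx ** 2 / (1 - u) ** 2) \
        + np.sum(u * lx ** 2 / (1 - u) ** 2)
    H = -D / beta ** 2
    I = -np.sum((1 + R) * u * lx / (1 - u))
    M = G * H - I ** 2
    l30 = 2 * D / alpha ** 3 \
        - beta * np.sum((1 + R) * lx ** 3 * u * (1 + u) / (1 - u) ** 3) \
        + np.sum(lx ** 3 * u * (1 + u) / (1 - u) ** 3)
    l21 = -np.sum((1 + R) * u * lx ** 2 / (1 - u) ** 2)
    l03 = 2 * D / beta ** 3
    k1 = (a - 1) / alpha - b
    k2 = (c - 1) / beta - d
    return (alpha + (-H * k1 + I * k2) / M
            + 0.5 * (H ** 2 * l30 - G * I * l03 - 3 * H * I * l21) / M ** 2)
```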

3.4. Tierney and Kadane Method

Besides Lindley's approximation, Tierney and Kadane [20] proposed another approach to approximating such integrals, and Howlader and Hossain [21] compared the two methods for the Pareto distribution under several diverse censored samples. Recall that the posterior expectation of $g(\alpha, \beta)$ is
$$ \hat{g} = E\big(g(\alpha, \beta)\big) = \frac{\int_{0}^{\infty}\int_{0}^{\infty} g(\alpha, \beta)\,e^{l(\alpha, \beta \mid X) + \kappa(\alpha, \beta)}\,d\alpha\,d\beta}{\int_{0}^{\infty}\int_{0}^{\infty} e^{l(\alpha, \beta \mid X) + \kappa(\alpha, \beta)}\,d\alpha\,d\beta} = \frac{\int_{0}^{\infty}\int_{0}^{\infty} e^{n\delta_g^{*}(\alpha, \beta)}\,d\alpha\,d\beta}{\int_{0}^{\infty}\int_{0}^{\infty} e^{n\delta(\alpha, \beta)}\,d\alpha\,d\beta}, \tag{15} $$
where
$$ \delta(\alpha, \beta) = \frac{l(\alpha, \beta \mid X) + \kappa(\alpha, \beta)}{n}, \qquad \delta_g^{*}(\alpha, \beta) = \delta(\alpha, \beta) + \frac{\log g(\alpha, \beta)}{n}, $$
with $\kappa(\alpha, \beta) = \log\pi(\alpha, \beta)$ and $l(\alpha, \beta \mid X)$ denoting the log-likelihood function of $\alpha$ and $\beta$. Suppose that $(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})$ are the values of $(\alpha, \beta)$ that maximize $\delta(\alpha, \beta)$ and $(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*})$ are the values that maximize $\delta_g^{*}(\alpha, \beta)$. Then Equation (15) is approximated by
$$ \hat{g} = \sqrt{\frac{|\Omega_g^{*}|}{|\Omega|}}\;e^{\,n\left[\delta_g^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*}) - \delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})\right]}. \tag{16} $$
Here, $|\Omega|$ and $|\Omega_g^{*}|$ are the determinants of the negatives of the inverse Hessians of $\delta$ and $\delta_g^{*}$, evaluated at $(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})$ and $(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*})$, respectively. It is worth noting that $|\Omega|$ and $\delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})$ in Equation (16) do not depend on $g$, whereas $|\Omega_g^{*}|$ and $\delta_g^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*})$ do. Below, the Bayes estimators of $\alpha$, $\beta$ and $R(t)$ are derived with this method.
For the two-parameter Kumaraswamy distribution, we have
$$ \delta(\alpha, \beta) = \frac{1}{n}\Big\{D\log(\alpha\beta) + (a-1)\log\alpha + \alpha\Big(\sum_{j=1}^{D}\log x_j - b\Big) + (c-1)\log\beta + \beta\Big[\sum_{j=1}^{D}(1+R_j)\log(1-x_j^{\alpha}) + E(\alpha) - d\Big] - \sum_{j=1}^{D}\log(1-x_j^{\alpha}) - \sum_{j=1}^{D}\log x_j\Big\}. $$
Subsequently, the following set of equations is solved to obtain $(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})$:
$$ \frac{\partial \delta}{\partial \alpha} = \frac{1}{n}\Big\{\frac{D+a-1}{\alpha} + \sum_{j=1}^{D}\log x_j - b - \beta\Big[\sum_{j=1}^{D}\frac{(1+R_j)\,x_j^{\alpha}\log x_j}{1-x_j^{\alpha}} + E^{(1)}(\alpha)\Big] + \sum_{j=1}^{D}\frac{x_j^{\alpha}\log x_j}{1-x_j^{\alpha}}\Big\} = 0, $$
$$ \frac{\partial \delta}{\partial \beta} = \frac{1}{n}\Big\{\frac{D+c-1}{\beta} + \sum_{j=1}^{D}(1+R_j)\log(1-x_j^{\alpha}) + E(\alpha) - d\Big\} = 0. $$
Then, we obtain | Ω | , which is given by
$$ |\Omega| = \Big[\frac{\partial^2 \delta}{\partial \alpha^2}\cdot\frac{\partial^2 \delta}{\partial \beta^2} - \Big(\frac{\partial^2 \delta}{\partial \alpha\,\partial \beta}\Big)^2\Big]^{-1}, $$
where
$$ \frac{\partial^2 \delta}{\partial \alpha^2} = \frac{1}{n}\Big\{-\frac{D+a-1}{\alpha^2} - \beta\Big[\sum_{j=1}^{D}\frac{(1+R_j)(\log x_j)^2 x_j^{\alpha}}{(1-x_j^{\alpha})^2} + E^{(2)}(\alpha)\Big] + \sum_{j=1}^{D}\frac{x_j^{\alpha}(\log x_j)^2}{(1-x_j^{\alpha})^2}\Big\}, $$
$$ \frac{\partial^2 \delta}{\partial \beta^2} = -\frac{D+c-1}{n\beta^2}, \qquad \frac{\partial^2 \delta}{\partial \alpha\,\partial \beta} = -\frac{1}{n}\Big[\sum_{j=1}^{D}\frac{(1+R_j)\,x_j^{\alpha}\log x_j}{1-x_j^{\alpha}} + E^{(1)}(\alpha)\Big]. $$
Recall that $|\Omega_g^{*}|$ and $\delta_g^{*}$ in Equation (16) depend on the function $g$; the quantities required for each loss function are derived below (a code sketch follows the list).
  • Under the SEL function,
    -
For $\alpha$, take $g(\alpha, \beta) = \alpha$, so that the corresponding function $\delta^{*}$ becomes
$$ \delta_{\alpha}^{*} = \delta + \frac{\log\alpha}{n}. $$
    Later, we solve the set of following equations
$$ \frac{\partial \delta_{\alpha}^{*}}{\partial \alpha} = \frac{\partial \delta}{\partial \alpha} + \frac{1}{n\alpha} = 0, \qquad \frac{\partial \delta_{\alpha}^{*}}{\partial \beta} = \frac{\partial \delta}{\partial \beta} = 0, $$
    and gain ( α ^ δ * , β ^ δ * ) . Then, | Ω α * | is computed as
$$ |\Omega_{\alpha}^{*}| = \Big[\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha^2}\cdot\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \beta^2} - \Big(\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha\,\partial \beta}\Big)^2\Big]^{-1}, $$
    where
$$ \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha^2} = \frac{\partial^2 \delta}{\partial \alpha^2} - \frac{1}{n\alpha^2}, \qquad \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \beta^2} = \frac{\partial^2 \delta}{\partial \beta^2}, \qquad \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha\,\partial \beta} = \frac{\partial^2 \delta}{\partial \alpha\,\partial \beta}. $$
    Using the above expression in Equation (16), the desired Bayes estimator of α can be written as
$$ \hat{\alpha}_S = \sqrt{\frac{|\Omega_{\alpha}^{*}|}{|\Omega|}}\;e^{\,n\left[\delta_{\alpha}^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*}) - \delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})\right]}. $$
The Bayes estimator of $\beta$ under the SEL function can be attained in a similar way.
    -
Now we consider the reliability function $R(t)$. Let $g(\alpha, \beta) = (1-t^{\alpha})^{\beta}$; then we have
$$ \delta_{R_t}^{*} = \delta + \frac{\beta\log(1-t^{\alpha})}{n}. $$
    Hence, we calculate ( α ^ δ * , β ^ δ * ) by figuring out the following set of equations:
$$ \frac{\partial \delta_{R_t}^{*}}{\partial \alpha} = \frac{\partial \delta}{\partial \alpha} - \frac{\beta t^{\alpha}\log t}{n(1-t^{\alpha})} = 0, \qquad \frac{\partial \delta_{R_t}^{*}}{\partial \beta} = \frac{\partial \delta}{\partial \beta} + \frac{\log(1-t^{\alpha})}{n} = 0. $$
    Subsequently, we deduce | Ω R t * | as follows:
$$ |\Omega_{R_t}^{*}| = \Big[\frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha^2}\cdot\frac{\partial^2 \delta_{R_t}^{*}}{\partial \beta^2} - \Big(\frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha\,\partial \beta}\Big)^2\Big]^{-1}, $$
    where
$$ \frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha^2} = \frac{\partial^2 \delta}{\partial \alpha^2} - \frac{\beta t^{\alpha}(\log t)^2}{n(1-t^{\alpha})^2}, \qquad \frac{\partial^2 \delta_{R_t}^{*}}{\partial \beta^2} = \frac{\partial^2 \delta}{\partial \beta^2}, \qquad \frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha\,\partial \beta} = \frac{\partial^2 \delta}{\partial \alpha\,\partial \beta} - \frac{t^{\alpha}\log t}{n(1-t^{\alpha})}. $$
    After that, the Bayes estimator of reliability function turns out to be
$$ \hat{R}_S = \sqrt{\frac{|\Omega_{R_t}^{*}|}{|\Omega|}}\;e^{\,n\left[\delta_{R_t}^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*}) - \delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})\right]}. $$
  • Under the LL function,
    -
For $\alpha$, take $g(\alpha, \beta) = e^{-p\alpha}$, so that the corresponding function $\delta^{*}$ becomes
$$ \delta_{\alpha}^{*}(\alpha, \beta) = \delta(\alpha, \beta) - \frac{p\alpha}{n}. $$
    Later, we solve the set of following equations
$$ \frac{\partial \delta_{\alpha}^{*}}{\partial \alpha} = \frac{\partial \delta}{\partial \alpha} - \frac{p}{n} = 0, \qquad \frac{\partial \delta_{\alpha}^{*}}{\partial \beta} = \frac{\partial \delta}{\partial \beta} = 0, $$
    and gain ( α ^ δ * , β ^ δ * ) . Then, | Ω α * | is computed as
$$ |\Omega_{\alpha}^{*}| = \Big[\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha^2}\cdot\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \beta^2} - \Big(\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha\,\partial \beta}\Big)^2\Big]^{-1}, $$
    where
$$ \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha^2} = \frac{\partial^2 \delta}{\partial \alpha^2}, \qquad \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \beta^2} = \frac{\partial^2 \delta}{\partial \beta^2}, \qquad \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha\,\partial \beta} = \frac{\partial^2 \delta}{\partial \alpha\,\partial \beta}. $$
    The Bayes estimator of α can be written as
$$ \hat{\alpha}_L = -\frac{1}{p}\log\Bigg\{\sqrt{\frac{|\Omega_{\alpha}^{*}|}{|\Omega|}}\;e^{\,n\left[\delta_{\alpha}^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*}) - \delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})\right]}\Bigg\}. $$
Likewise, the Bayes estimator of $\beta$ under the linex loss function can be obtained.
    -
Now we consider the reliability function $R(t)$. Let $g(\alpha, \beta) = e^{-p(1-t^{\alpha})^{\beta}}$; then we have
$$ \delta_{R_t}^{*} = \delta - \frac{p(1-t^{\alpha})^{\beta}}{n}. $$
Hence, we calculate $(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*})$ by solving the following set of equations:
$$ \frac{\partial \delta_{R_t}^{*}}{\partial \alpha} = \frac{\partial \delta}{\partial \alpha} + \frac{p\beta t^{\alpha}(1-t^{\alpha})^{\beta-1}\log t}{n} = 0, \qquad \frac{\partial \delta_{R_t}^{*}}{\partial \beta} = \frac{\partial \delta}{\partial \beta} - \frac{p(1-t^{\alpha})^{\beta}\log(1-t^{\alpha})}{n} = 0. $$
    Subsequently, we deduce | Ω R t * | as follows:
$$ |\Omega_{R_t}^{*}| = \Big[\frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha^2}\cdot\frac{\partial^2 \delta_{R_t}^{*}}{\partial \beta^2} - \Big(\frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha\,\partial \beta}\Big)^2\Big]^{-1}, $$
    where
$$ \frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha^2} = \frac{\partial^2 \delta}{\partial \alpha^2} + \frac{p\beta t^{\alpha}(\log t)^2(1-t^{\alpha})^{\beta-2}(1-\beta t^{\alpha})}{n}, \qquad \frac{\partial^2 \delta_{R_t}^{*}}{\partial \beta^2} = \frac{\partial^2 \delta}{\partial \beta^2} - \frac{p\big[\log(1-t^{\alpha})\big]^2(1-t^{\alpha})^{\beta}}{n}, $$
$$ \frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha\,\partial \beta} = \frac{\partial^2 \delta}{\partial \alpha\,\partial \beta} + \frac{p\,t^{\alpha}\log t\,(1-t^{\alpha})^{\beta-1}\big[1 + \beta\log(1-t^{\alpha})\big]}{n}. $$
    After that, the Bayes estimator of R ( t ) is
$$ \hat{R}_L = -\frac{1}{p}\log\Bigg\{\sqrt{\frac{|\Omega_{R_t}^{*}|}{|\Omega|}}\;e^{\,n\left[\delta_{R_t}^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*}) - \delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})\right]}\Bigg\}. $$
  • Under the EL function,
    -
For $\alpha$, take $g(\alpha, \beta) = \alpha^{-q}$, so that the corresponding function $\delta^{*}$ becomes
$$ \delta_{\alpha}^{*} = \delta - \frac{q\log\alpha}{n}. $$
    Later, we solve the set of following equations
$$ \frac{\partial \delta_{\alpha}^{*}}{\partial \alpha} = \frac{\partial \delta}{\partial \alpha} - \frac{q}{n\alpha} = 0, \qquad \frac{\partial \delta_{\alpha}^{*}}{\partial \beta} = \frac{\partial \delta}{\partial \beta} = 0, $$
    and gain ( α ^ δ * , β ^ δ * ) . Then, | Ω α * | is computed as
$$ |\Omega_{\alpha}^{*}| = \Big[\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha^2}\cdot\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \beta^2} - \Big(\frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha\,\partial \beta}\Big)^2\Big]^{-1}, $$
    where
$$ \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha^2} = \frac{\partial^2 \delta}{\partial \alpha^2} + \frac{q}{n\alpha^2}, \qquad \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \beta^2} = \frac{\partial^2 \delta}{\partial \beta^2}, \qquad \frac{\partial^2 \delta_{\alpha}^{*}}{\partial \alpha\,\partial \beta} = \frac{\partial^2 \delta}{\partial \alpha\,\partial \beta}. $$
    The desired Bayes estimator of α can be derived as
$$ \hat{\alpha}_E = \Bigg\{\sqrt{\frac{|\Omega_{\alpha}^{*}|}{|\Omega|}}\;e^{\,n\left[\delta_{\alpha}^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*}) - \delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})\right]}\Bigg\}^{-1/q}. $$
Obviously, the Bayes estimator of $\beta$ under the EL function can be obtained in the same way.
    -
Now we consider the reliability function $R(t)$. Let $g(\alpha, \beta) = (1-t^{\alpha})^{-q\beta}$; then we have
$$ \delta_{R_t}^{*} = \delta - \frac{q\beta\log(1-t^{\alpha})}{n}. $$
    Hence, we calculate ( α ^ δ * , β ^ δ * ) by figuring out the following set of equations:
$$ \frac{\partial \delta_{R_t}^{*}}{\partial \alpha} = \frac{\partial \delta}{\partial \alpha} + \frac{q\beta t^{\alpha}\log t}{n(1-t^{\alpha})} = 0, \qquad \frac{\partial \delta_{R_t}^{*}}{\partial \beta} = \frac{\partial \delta}{\partial \beta} - \frac{q\log(1-t^{\alpha})}{n} = 0. $$
    Subsequently, we deduce | Ω R t * | as follows:
$$ |\Omega_{R_t}^{*}| = \Big[\frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha^2}\cdot\frac{\partial^2 \delta_{R_t}^{*}}{\partial \beta^2} - \Big(\frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha\,\partial \beta}\Big)^2\Big]^{-1}, $$
    where
$$ \frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha^2} = \frac{\partial^2 \delta}{\partial \alpha^2} + \frac{q\beta t^{\alpha}(\log t)^2}{n(1-t^{\alpha})^2}, \qquad \frac{\partial^2 \delta_{R_t}^{*}}{\partial \beta^2} = \frac{\partial^2 \delta}{\partial \beta^2}, \qquad \frac{\partial^2 \delta_{R_t}^{*}}{\partial \alpha\,\partial \beta} = \frac{\partial^2 \delta}{\partial \alpha\,\partial \beta} + \frac{q\,t^{\alpha}\log t}{n(1-t^{\alpha})}. $$
After that, the Bayes estimator of $R(t)$ is
$$ \hat{R}_E = \Bigg\{\sqrt{\frac{|\Omega_{R_t}^{*}|}{|\Omega|}}\;e^{\,n\left[\delta_{R_t}^{*}(\hat{\alpha}_{\delta^*}, \hat{\beta}_{\delta^*}) - \delta(\hat{\alpha}_{\delta}, \hat{\beta}_{\delta})\right]}\Bigg\}^{-1/q}. $$
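The following Python sketch (our own naming and starting values; Case III with $E(\alpha) = 0$, squared error loss, estimate of $\alpha$) illustrates the method: $\delta$ and $\delta_{\alpha}^{*}$ are maximized numerically, the second derivatives in $|\Omega|$ and $|\Omega_{\alpha}^{*}|$ are approximated by central finite differences, and Equation (16) is then evaluated.

```python
import numpy as np
from scipy.optimize import minimize

def tk_alpha_sel(x, R, n, a, b, c, d, start=(1.0, 1.0)):
    """Tierney-Kadane estimate of alpha under SEL, Case III (E(alpha) = 0)."""
    x = np.asarray(x, float)
    R = np.asarray(R, float)
    D = len(x)

    def delta(t):
        al, be = t
        if al <= 0 or be <= 0:
            return -np.inf
        return (D * np.log(al * be) + (al - 1) * np.sum(np.log(x))
                + be * np.sum((1 + R) * np.log(1 - x ** al))
                - np.sum(np.log(1 - x ** al))
                + (a - 1) * np.log(al) - b * al
                + (c - 1) * np.log(be) - d * be) / n

    def delta_star(t):
        if t[0] <= 0:
            return -np.inf
        return delta(t) + np.log(t[0]) / n

    def omega(f, t, h=1e-5):
        # |Omega| = [f_aa * f_bb - f_ab^2]^(-1), by central finite differences
        def d2(i, j):
            ei, ej = np.eye(2)[i] * h, np.eye(2)[j] * h
            return (f(t + ei + ej) - f(t + ei - ej)
                    - f(t - ei + ej) + f(t - ei - ej)) / (4 * h * h)
        return 1.0 / (d2(0, 0) * d2(1, 1) - d2(0, 1) ** 2)

    fit = minimize(lambda t: -delta(t), start, method='Nelder-Mead')
    fit_s = minimize(lambda t: -delta_star(t), start, method='Nelder-Mead')
    ratio = omega(delta_star, fit_s.x) / omega(delta, fit.x)
    return np.sqrt(ratio) * np.exp(n * (delta_star(fit_s.x) - delta(fit.x)))
```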

4. Simulation Study

In the simulation experiment, the first step is to generate a generalized progressive hybrid censored sample. Before continuing, we recall how to generate a progressive Type-II censored sample following Balakrishnan and Sandhu [22], who presented a simple but effective simulation algorithm for collecting a progressive Type-II censored sample from any continuous distribution. Adapting this method, the algorithm for generating a generalized progressive hybrid censored sample is given below (a code sketch follows the steps).
Step 1: Generate $m$ independent observations $W_1, W_2, \ldots, W_m$ from the standard uniform distribution.
Step 2: Set $Z_i = W_i^{1/(i + R_m + R_{m-1} + \cdots + R_{m-i+1})}$ for $i = 1, \ldots, m$.
Step 3: Set $Y_i = 1 - Z_m Z_{m-1}\cdots Z_{m-i+1}$ for $i = 1, \ldots, m$. Then $Y_1, \ldots, Y_m$ is the desired progressive Type-II censored sample from the standard uniform distribution.
Step 4: Finally, set $X_i = F^{-1}(Y_i)$ for $i = 1, \ldots, m$, where $F^{-1}(\cdot)$ denotes the inverse cumulative distribution function of the distribution considered. Then $X_1, X_2, \ldots, X_m$ is the desired progressive Type-II censored sample from $F(\cdot)$; in this article, $F(\cdot)$ is the Kumaraswamy distribution.
Step 4.1: When $T < X_k < X_m$, the generalized progressive hybrid censored sample is $(X_1, X_2, \ldots, X_k)$ (Case I);
Step 4.2: When $X_k < T < X_m$, determine $J$ such that $X_J < T < X_{J+1}$; the generalized progressive hybrid censored sample is $(X_1, X_2, \ldots, X_J)$ (Case II);
Step 4.3: When $X_k < X_m < T$, the generalized progressive hybrid censored sample is $(X_1, X_2, \ldots, X_m)$ (Case III).
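A Python sketch of this generation algorithm follows; the function name and defaults are ours, chosen for illustration.

```python
import numpy as np

def gen_gphc(m, k, R, alpha, beta, T, seed=None):
    """One generalized progressive hybrid censored sample from K(alpha, beta);
    R = (R_1, ..., R_m) with sum(R) + m = n."""
    R = np.asarray(R, float)
    rng = np.random.default_rng(seed)
    W = rng.uniform(size=m)                                   # Step 1
    tail = np.cumsum(R[::-1])                                 # R_m, R_m + R_{m-1}, ...
    Z = W ** (1.0 / (np.arange(1, m + 1) + tail))             # Step 2
    Y = 1.0 - np.cumprod(Z[::-1])                             # Step 3
    X = (1.0 - (1.0 - Y) ** (1.0 / beta)) ** (1.0 / alpha)    # Step 4: F^(-1) of K(alpha, beta)
    if T < X[k - 1]:                                          # Step 4.1 (Case I)
        return X[:k], 'I'
    if T < X[m - 1]:                                          # Step 4.2 (Case II)
        J = int(np.searchsorted(X, T))                        # X_J < T < X_{J+1}
        return X[:J], 'II'
    return X, 'III'                                           # Step 4.3 (Case III)
```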
Without loss of generality, the Kumaraswamy distribution with $\alpha = 3$, $\beta = 2$ and $T = 0.9$ is used. The process is replicated 1000 times in each case, and the results are obtained under diverse $(n, m, k)$. The censoring schemes employed in this simulation are
$$ \text{Scheme I: } R_1 = n - m, \ R_2 = \cdots = R_m = 0; $$
$$ \text{Scheme II: } R_1 = \cdots = R_{m-1} = 0, \ R_m = n - m; $$
$$ \text{Scheme III: } R_1 = (n-m)/2, \ R_2 = \cdots = R_{m-1} = 0, \ R_m = (n-m)/2. $$
We compute both the MLEs and the Bayes estimates. Bayes estimates are obtained under a noninformative prior, in which the four hyperparameters $a$, $b$, $c$, $d$ all take the value 0, and under an informative prior with $a = 1.5$, $c = 1$ and $b = d = 0.5$. Bayes estimates are computed under the SEL, LL and EL functions; for the linex loss function the estimates are obtained with $p = 0.2, 0.8$, and the general entropy loss function is considered with $q = 0.2, 0.8$. Finally, we use mean squared errors (MSEs) and absolute biases (ABs) to evaluate the accuracy of the estimates:
$$ MSE = \frac{1}{S}\sum_{i=1}^{S}(\hat{\sigma}_i - \sigma)^2, \qquad AB = \frac{1}{S}\sum_{i=1}^{S}|\hat{\sigma}_i - \sigma|, $$
where $\sigma$ is the true value, $\hat{\sigma}_i$ is the corresponding estimate in the $i$th replication, and $S$ is the number of replications (a small code helper follows). The simulation results, Table A1–Table A3, are shown in Appendix A. In every two rows of Table A1–Table A3, the upper one reports MSEs and the lower one ABs. The corresponding MSEs and ABs of the MLE are placed in the fifth column. All other columns consist of eight values: the first two are the MSEs and ABs from the Lindley approximation with the noninformative prior; the second two from the Lindley approximation with the informative prior; the third two from the TK method with the noninformative prior; and the fourth two from the TK method with the informative prior.
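A small helper (ours, for illustration) computes both measures from the $S$ replicated estimates:

```python
import numpy as np

def mse_ab(estimates, true_value):
    """MSE and AB over the S replications, as defined above."""
    e = np.asarray(estimates, float)
    return np.mean((e - true_value) ** 2), np.mean(np.abs(e - true_value))
```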
In Table A1, it can be seen that the MSEs decrease as the sample size $n$ increases, since more information is gathered. For a given sample size, the MSEs also decline as the removal numbers $R_i$ decrease, namely as $m$ increases. Further, with $n$ and $m$ kept constant, the MSEs decline as the accepted bare minimum of failures $k$ increases. In general, in terms of MSE, the Bayes estimates are superior to the MLEs, and there is no significant difference among the three schemes. In particular, the Bayes estimates of $\alpha$ under SEL perform better than the corresponding LL and EL estimates. Besides, the Bayes estimates with the informative prior outperform the respective estimates with the noninformative prior. Under LL, $p = 0.2$ is a better option than $p = 0.8$, while $q = 0.2$ tends to produce lower MSEs than $q = 0.8$ under EL. Overall, with proper prior information, the Bayes estimates of $\alpha$ based on the SEL function and the TK method do better than the other estimates.
In Table A2, the same patterns with respect to $n$, $m$ and $k$ are observed. Again, in terms of MSE, the Bayes estimates are superior to the MLEs, and there is no significant difference among the three schemes. In particular, the Bayes estimates of $\beta$ under LL perform better than the corresponding SEL and EL estimates; the informative prior outperforms the noninformative prior; $p = 0.2$ is a better option than $p = 0.8$ under LL; and $q = 0.2$ tends to produce lower MSEs than $q = 0.8$ under EL. For $\beta$, the Bayes estimates with a proper prior based on the linex loss function and the Lindley method perform better than the other estimates.
In Table A3, the MSEs again decrease as $n$, $m$ or $k$ increases. For $R(t)$, however, the maximum likelihood estimates outperform the noninformative Bayes estimates, and there is no significant difference among the three schemes. In particular, the Bayes estimates of $R(t)$ under LL perform worse than the corresponding SEL and EL estimates, while the Bayes estimates with the informative prior still outperform those with the noninformative prior. In the case of LL, the choice $p = 0.8$ is preferred, while in the case of EL, the option $q = 0.2$ produces better results.

5. Data Analysis

This section illustrates the presented estimation methods on a real-life dataset: the monthly water capacity of the Shasta reservoir for August and December from 1975 to 2016. These data have been used before by some statisticians, such as Kohansal [23]. Because $K(\alpha, \beta)$ is defined for $0 < x < 1$, all data are divided by the total capacity of the Shasta reservoir, 4,552,000 acre-feet. The transformed data are tabulated in Table 1 and presented as a histogram in Figure 4.
One may doubt whether the considered dataset comes from the Kumaraswamy distribution. To verify its reasonableness, we fit the Kumaraswamy distribution to the dataset, in competition with two other distributions, the exponentiated exponential distribution (EED) and the Lomax distribution (LD). The corresponding cdfs and pdfs are given below.
$$ \text{(EED)} \quad F(x) = (1 - e^{-\beta x})^{\alpha}, \qquad f(x) = \alpha\beta(1 - e^{-\beta x})^{\alpha-1}e^{-\beta x}, \qquad \alpha, \beta, x > 0; $$
$$ \text{(LD)} \quad F(x) = 1 - \Big(1 + \frac{x}{\beta}\Big)^{-\alpha}, \qquad f(x) = \frac{\alpha}{\beta}\Big(1 + \frac{x}{\beta}\Big)^{-(\alpha+1)}, \qquad \alpha, \beta, x > 0. $$
We employ Akaike's information criterion (AIC), defined by $-2\ln(L) + 2k$; the associated second-order criterion (AICc), defined by $AIC + 2k(k+1)/(n-k-1)$; and the Bayesian information criterion (BIC), defined by $-2\ln(L) + k\ln(n)$, where $L$ is the maximized value of the likelihood function, $k$ is the number of parameters and $n$ is the number of observations; we also use the Kolmogorov–Smirnov (K–S) statistic with its p-value. The results are shown in Table 2, and a code sketch of the comparison is given below. Besides, the empirical cumulative distribution functions and the quantile–quantile plots are given in Figure 5 and Figure 6, respectively.
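For the Kumaraswamy model, the comparison can be sketched as follows; scipy.optimize.minimize and scipy.stats.kstest are standard scipy routines, while the fitting function and its starting values are our own illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import kstest

def fit_kumaraswamy(data):
    """MLE fit of K(alpha, beta), then AIC, AICc, BIC and the K-S test."""
    data = np.asarray(data, float)

    def nll(t):
        a, b = t
        if a <= 0 or b <= 0:
            return np.inf
        return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(data)
                       + (b - 1) * np.log(1 - data ** a))

    res = minimize(nll, x0=(1.0, 1.0), method='Nelder-Mead')
    a, b = res.x
    logL, k, n = -res.fun, 2, len(data)
    aic = -2 * logL + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = -2 * logL + k * np.log(n)
    ks = kstest(data, lambda x: 1 - (1 - x ** a) ** b)  # fitted Kumaraswamy cdf
    return {'params': (a, b), 'AIC': aic, 'AICc': aicc, 'BIC': bic,
            'KS': ks.statistic, 'p': ks.pvalue}
```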
Considering that the Kumaraswamy distribution has lower AIC, AICc, BIC and K–S statistics and a higher p-value, it is reasonable to say that we fail to reject the hypothesis that the data come from the Kumaraswamy distribution.
A generalized progressive hybrid censored sample is then generated artificially using $m = 21$ and $R_1 = R_2 = \cdots = R_{21} = 1$. The ordered progressive Type-II censored dataset is given in Table 3.
In this illustration, we take $T = 0.75$ and $k = 19$ for Case I, $T = 0.75$ and $k = 14$ for Case II, and $T = 0.9$ and $k = 14$ for Case III. Table 4 presents the estimates of $\alpha$, $\beta$ and the reliability function for the generalized progressive hybrid censored sample.

6. Conclusions

In this paper, we estimate the two parameters and the reliability of the Kumaraswamy distribution when the data are sampled under a generalized progressive hybrid censoring scheme. The maximum likelihood estimators are derived. In addition, Bayes estimators are obtained by the Lindley method as well as the TK method under the general entropy, squared error and linex loss functions, using both noninformative and informative priors. The performance of the estimates is compared in terms of absolute bias and mean squared error. The Bayes estimates under a proper informative prior are revealed to work better than the corresponding noninformative-prior estimates. Besides, the Bayes estimates produce lower MSEs for the two parameters, while for reliability estimation the MLEs have lower values instead. A real-life example is also investigated in depth. In the future, further topics are worth studying, such as Bayesian estimation for the Kumaraswamy distribution in non-linear regression models (Contreras-Reyes et al. [1]).

Author Contributions

Investigation, J.T.; Supervision, W.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by Project 202110004001 of the National Training Program of Innovation and Entrepreneurship for Undergraduates.

Acknowledgments

We are grateful to the two referees and the editor for their careful reading and constructive comments, which led to a greatly improved paper.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Simulation Results

Table A1. Mean squared errors (MSEs) and absolute biases (ABs) of different estimates of α when T = 0.9.
n | m | k | Sch | MLE | SEL | LL (p = 0.2) | LL (p = 0.8) | EL (q = 0.2) | EL (q = 0.8) | Method
807220I0.1957540.1732740.1771780.1839810.1810160.183588Lindley
0.3429060.3307040.3451710.3555940.3517180.355226
0.1560180.1728960.1737370.1711600.175990
0.3127900.3433100.3497110.3403080.347543
0.1797240.1671200.1728720.1779110.180735TK
0.3375210.3269010.3298480.3522450.354449
0.1456040.1503270.1525850.1566600.163717
0.3029970.3144280.3144870.3126540.320628
II0.1844180.1647410.1705170.1899430.1765380.184214Lindley
0.3346400.3238620.3360090.3591540.3421290.356780
0.1542960.1656110.1733920.1680590.175020
0.3103590.3292020.3441550.3398850.342755
0.1737300.1738360.1745740.1798250.181395TK
0.3320190.3275410.3387720.3509350.352446
0.1485410.1522960.1535980.1548830.155824
0.3076140.3119190.3138860.3162950.319657
III0.1868390.1677150.1789620.1828960.1729830.178113Lindley
0.3317940.3302350.3508820.3583510.3440870.349435
0.1503820.1669240.1722020.1700600.171719
0.3063410.3370700.3455530.3408780.341281
0.1667490.1683510.1689380.1742070.180959TK
0.3265040.3307830.3293930.3410920.351508
0.1478510.1481880.1525160.1502730.153908
0.3040590.3094180.3158510.3091590.314561
30I0.1753590.1670310.1729490.1780080.1661540.173000Lindley
0.3319460.3233130.3390120.3435640.3381220.344992
0.1521940.1649580.1696520.1664300.168725
0.3103470.3384820.3436210.3354880.339530
0.1664000.1666380.1723910.1714380.177878TK
0.3208470.3244530.3336540.3407590.344098
0.1439900.1492070.1500380.1531380.155757
0.3059190.3129880.3201390.3172260.318324
II0.1809620.1613110.1639800.1811750.1748750.181670Lindley
0.3302920.3223500.3299870.3522160.3443000.349224
0.1523420.1619240.1672770.1623670.165248
0.3106900.3339650.3368070.3321940.337640
0.1683750.1602200.1621630.1723000.176658TK
0.3265500.3211930.3265770.3433920.344013
0.1483640.1503610.1516630.1511770.153493
0.3087730.3123010.3132430.3141560.316487
III0.1850890.1651990.1685580.1774340.1707700.175226Lindley
0.3387520.3194090.3391890.3500660.3413160.342927
0.1445930.1650950.1705560.1649920.170073
0.3031430.3365060.3430160.3362260.342143
0.1659390.1573910.1582620.1689730.171244TK
0.3248600.3196670.3255120.3383780.339089
0.1419800.1424200.1515840.1480920.153502
0.3001930.3037450.3141580.3098720.312044
7620I0.1663000.1625600.1674330.1760580.1736860.177993Lindley
0.3174720.3261340.3369840.3488430.3458150.348391
0.1479730.1584800.1614290.1633950.166470
0.3038590.3290010.3286220.3306090.334530
0.1650710.1589740.1618260.1708860.177120TK
0.3244150.3217720.3254160.3366100.352839
0.1427960.1434060.1470270.1494050.151933
0.3018570.3068380.3111060.3081660.316401
II0.1715470.1611240.1633680.1734010.1717810.174262Lindley
0.3277830.3225330.3294790.3424090.3418830.348672
0.1504410.1564680.1630990.1565840.160757
0.3154800.3283700.3336250.3233300.334570
0.1626110.1501120.1570150.1690740.172704TK
0.3195560.3128560.3226240.3383470.341468
0.1425400.1464550.1495060.1469970.152814
0.2993160.3087490.3149810.3069330.309555
III0.1706920.1592870.1685630.1738560.1699450.172634Lindley
0.3236450.3179320.3396180.3463880.3345630.341489
0.1423410.1542560.1651340.1586540.162581
0.2992010.3270050.3334470.3267160.333624
0.1615540.1512120.1579690.1663360.169676TK
0.3135440.3134820.3190930.3352680.336643
0.1362500.1407440.1480010.1395210.143761
0.2929270.2983670.3176340.2958630.304371
30I0.1650080.1559580.1662950.1738850.1663570.167108Lindley
0.3179730.3173330.3353550.3475270.3333990.337807
0.1430720.1540500.1610610.1625490.164274
0.3017360.3172480.3336380.3341900.337059
0.1594160.1576800.1596250.1665770.174166TK
0.3159040.3185980.3222300.3360440.351277
0.1331930.1362370.1404670.1462100.148110
0.2897550.2975560.3040690.3062960.309866
II0.1633850.1604610.1614300.1731940.1676630.171512Lindley
0.3234270.3151280.3219180.3443710.3304630.343644
0.1457630.1505870.1610330.1560150.158450
0.3066580.3226620.3334480.3282420.325840
0.1576940.1462250.1567180.1677530.171009TK
0.3111930.3036770.3192650.3371010.341809
0.1419540.1419540.1462350.1441390.149823
0.3024600.3022180.3066500.3013720.308069
III0.1698810.1557220.1625020.1685950.1585100.166633Lindley
0.3280020.3158690.3324900.3369670.3280020.332975
0.1400300.1511510.1639560.1574480.158148
0.3045850.3208720.3352530.3287410.329517
0.1521950.1468310.1538710.1590820.164530TK
0.3107050.3029800.3156900.3329100.333448
0.1331410.1369990.1453940.1375700.141221
0.2929110.2964660.3059830.2958330.304130
1008230I0.1584960.1479050.1512990.1570980.1560280.158770Lindley
0.3167080.3023700.3156200.3280980.3187890.327788
0.1334930.1469500.1565030.1483820.149195
0.2848530.3169060.3255880.3163390.317492
0.1431070.1360200.1404880.1488480.154655TK
0.3033050.2878680.3043980.3151950.324310
0.1275380.1285230.1307930.1334310.136239
0.2869520.2905970.2969080.2882470.295697
II0.1605410.1368890.1480140.1509840.1420430.150085Lindley
0.3118130.2931700.3145100.3208540.3112510.321959
0.1206950.1332820.1438250.1388590.148270
0.2800280.3019340.3098660.3094630.320493
0.1356060.1355650.1399010.1428890.149675TK
0.2917780.2946040.3010280.3126680.321747
0.1204010.1213880.1294630.1283320.137610
0.2765030.2789550.2902720.2834030.295037
III0.1619750.1394710.1487700.1526760.1419630.154038Lindley
0.3165430.3002360.3192340.3245890.3094420.320746
0.1324320.1483930.1544120.1470610.153666
0.2877580.3201920.3271410.3181510.324176
0.1473900.1361990.1416530.1547290.163258TK
0.3041590.2928400.2997600.3283070.336531
0.1215360.1235860.1293190.1227340.128961
0.2775210.2812090.2899050.2804880.289221
45I0.1560020.1391350.1456390.1525800.1531240.156099Lindley
0.3104010.2961420.3103950.3225770.3184290.325840
0.1303480.1424670.1484270.1419100.143547
0.2870300.3120380.3150990.3097750.311625
0.1366990.1315940.1395900.1482930.149381TK
0.2964390.2897840.3066260.3174000.315475
0.1210020.1249900.1292550.1270680.130692
0.2810130.2820240.2917310.2916060.290489
II0.1478470.1346920.1426680.1455240.1350650.147970Lindley
0.3015140.2914270.3072420.3172480.3001490.318908
0.1191970.1257360.1350940.1354430.142335
0.2743290.2921370.3021340.3038150.310954
0.1349480.1289390.1310730.1401550.141756TK
0.2950570.2833830.2932550.3103080.310028
0.1182390.1195400.1219860.1253530.126930
0.2712420.2794810.2806130.2847350.285185
III0.1473740.1319650.1468770.1574590.1407130.145988Lindley
0.3038090.2906470.3170340.3297170.3090250.316835
0.1252440.1447640.1534040.1438730.148504
0.2824340.3160830.3239140.3106760.318113
0.1395220.1334690.1401160.1537830.156486TK
0.2959720.2904250.3016870.3262630.330464
0.1196130.1215150.1226950.1228910.124598
0.2778870.2805730.2849170.2836180.284538
9430I0.1325030.1237970.1435830.1449920.1445890.136430Lindley
0.2850010.2849990.3095130.3124950.3091850.302944
0.1200960.1245110.1321110.1237630.130444
0.2738840.2888970.2988150.2838690.298420
0.1267250.1306600.1334710.1392680.148050TK
0.2805120.2956380.2926240.3040140.313696
0.1141780.1151880.1211560.1174580.118981
0.2663580.2756870.2773390.2761020.272325
II0.1358510.1247720.1356980.1422220.1293420.132362Lindley
0.2923910.2854860.3013560.3104670.2943030.293817
0.1150230.1208290.1289500.1321060.139177
0.2614260.2839540.2922670.2995270.307270
0.1328540.1269230.1280810.1329560.137500TK
0.2913740.2844870.2914590.2940280.303618
0.1130340.1168620.1204600.1185380.121378
0.2678370.2727530.2818790.2771290.280367
III0.1300140.1297530.1334930.1399570.1319390.132621Lindley
0.2806000.2846240.2974180.3072680.2952530.298255
0.1236720.1236140.1374550.1246100.130591
0.2817590.2878290.3047450.2914290.294189
0.1234870.1274270.1325970.1318220.133141TK
0.2784010.2853100.2895930.2983480.299885
0.1149750.1159720.1202930.1197450.121723
0.2675770.2725090.2814830.2768800.281057
45I0.1296080.1219160.1273240.1377030.1357860.139453Lindley
0.2854120.2752370.2913900.3051260.3006800.306916
0.1144740.1206660.1286960.1158190.123577
0.2715500.2888970.2951480.2766650.285389
0.1216480.1225350.1242500.1354000.142578TK
0.2743300.2784880.2849250.3035170.311050
0.1088820.1118580.1147310.1166790.117557
0.2651010.2697150.2718300.2755390.274517
II0.1304960.1234060.1304760.1382310.1248090.130446Lindley
0.2833340.2816100.2961440.3048850.2899330.294102
0.1113790.1191550.1255780.1296030.133119
0.2667710.2813770.2882200.2986220.303524
0.1265750.1221760.1279580.1292040.132621TK
0.2785670.2801390.2873260.2930370.296470
0.1091820.1122050.1159130.1172740.120175
0.2634430.2706900.2712200.2732810.274126
III0.1318540.1203130.1266450.1360010.1273020.127935Lindley
0.2910570.2754730.2937140.3070070.2923450.292044
0.1115210.1209940.1256530.1234630.126743
0.2651860.2841670.2912240.2859980.291452
0.1218730.1218600.1248480.1263310.129101TK
0.2808690.2823870.2846580.2864540.293051
0.1135720.1139620.1149600.1133530.118752
0.2694840.2676840.2719990.2658570.278690
Table A2. MSEs and ABs of different estimates of β when T = 0.9.
n | m | k | Sch | MLE | SEL | LL (p = 0.2) | LL (p = 0.8) | EL (q = 0.2) | EL (q = 0.8) | Method
807220I0.1959610.1674740.1259220.1293320.1703240.179334Lindley
0.3366240.3168460.2951140.2955210.3452110.356905
0.1515570.1129420.1256580.1629860.168371
0.2998410.2809490.2956380.3359100.341832
0.1611330.1439600.1471420.1541700.155740TK
0.3123850.3012010.3061550.3105560.315511
0.1460080.1300930.1317610.1347780.147166
0.2953750.2856220.2906240.2935040.302999
II0.1986160.1693680.1180200.1229830.1635280.169168Lindley
0.3326290.3136570.2861550.2882210.3364060.344660
0.1442500.1103260.1211250.1581150.166199
0.2962630.2716210.2912580.3318320.340948
0.1607010.1455190.1477510.1466220.151650TK
0.3077950.2968780.3021540.2959880.312282
0.1436570.1258740.1276910.1310210.136678
0.2954700.2794710.2862670.2897940.294870
III0.1997140.1639900.1141820.1257460.1604210.165573Lindley
0.3342810.3119470.2804640.2997060.3304280.342646
0.1558750.1095290.1166440.1555760.161524
0.3033720.2764850.2852010.3310010.339624
0.1698710.1503670.1584230.1687510.171984TK
0.3159430.3039360.3078540.3109150.323535
0.1612580.1283930.1294830.1384330.140673
0.3080450.2832210.2844670.2934840.290108
30I0.1800810.1647700.1275940.1279180.1644460.171331Lindley
0.3265010.3211660.2950310.2968430.3405400.339008
0.1482620.1113360.1171810.1616420.162909
0.2968600.2761690.2843300.3355750.336046
0.1602610.1423070.1482500.1466280.149070TK
0.3125410.3046250.3054310.3030110.308049
0.1459590.1227220.1270270.1267770.131688
0.2964220.2828620.2869210.2806700.289439
II0.1946450.1574880.1153800.1196280.1585450.168903Lindley
0.3301640.3089370.2769240.2893720.3294070.343435
0.1365080.1086720.1146230.1529090.160779
0.2879500.2693030.2859360.3235210.333769
0.1593160.1368430.1373290.1416160.150826TK
0.3077740.2941940.2945850.2978990.299668
0.1375080.1214490.1245890.1284330.132614
0.2902930.2788600.2850390.2872890.286509
III0.1853160.1599780.1107080.1198490.1589340.161440Lindley
Table A2. Cont.
Method   Metric  MLE       SEL       LL (p = 0.2)  LL (p = 0.8)  EL (q = 0.2)  EL (q = 0.8)
n = 80, m = 72, k = 30, Scheme III (continued):
Lindley  AB   0.326110  0.304847  0.277643  0.290884  0.330500  0.336606
Lindley  MSE  -         0.145412  0.105190  0.112186  0.154588  0.157105
Lindley  AB   -         0.293478  0.268608  0.276084  0.328198  0.333355
TK       MSE  -         0.156214  0.141367  0.151079  0.149774  0.155430
TK       AB   -         0.305781  0.294312  0.303219  0.303343  0.306630
TK       MSE  -         0.146418  0.127305  0.128522  0.136888  0.139216
TK       AB   -         0.290897  0.286155  0.285302  0.290073  0.299831
n = 80, m = 76, k = 20, Scheme I:
Lindley  MSE  0.166291  0.156298  0.118440  0.124711  0.163635  0.165813
Lindley  AB   0.313605  0.309953  0.284341  0.293997  0.337621  0.339916
Lindley  MSE  -         0.141413  0.106678  0.112972  0.156046  0.159913
Lindley  AB   -         0.290861  0.270763  0.277584  0.329064  0.331369
TK       MSE  -         0.159837  0.137589  0.144562  0.144789  0.145483
TK       AB   -         0.307406  0.290110  0.304950  0.297334  0.303843
TK       MSE  -         0.134403  0.120996  0.124788  0.125574  0.127552
TK       AB   -         0.295813  0.280510  0.277648  0.286978  0.282143
n = 80, m = 76, k = 20, Scheme II:
Lindley  MSE  0.171110  0.157852  0.113267  0.115085  0.159968  0.163557
Lindley  AB   0.312218  0.302511  0.276817  0.278177  0.333592  0.335084
Lindley  MSE  -         0.133413  0.103128  0.106524  0.150006  0.153817
Lindley  AB   -         0.283053  0.264280  0.269033  0.325811  0.326850
TK       MSE  -         0.158836  0.128150  0.135009  0.133405  0.145423
TK       AB   -         0.303372  0.282541  0.295419  0.285841  0.296925
TK       MSE  -         0.129618  0.119702  0.122264  0.123452  0.128127
TK       AB   -         0.280295  0.277571  0.279210  0.280092  0.285718
n = 80, m = 76, k = 20, Scheme III:
Lindley  MSE  0.173008  0.151509  0.111720  0.113288  0.154259  0.156918
Lindley  AB   0.315390  0.298115  0.271731  0.277697  0.326029  0.330195
Lindley  MSE  -         0.134863  0.104234  0.111153  0.148866  0.153811
Lindley  AB   -         0.279596  0.268118  0.277566  0.319942  0.321904
TK       MSE  -         0.147908  0.126286  0.131559  0.143254  0.153825
TK       AB   -         0.298540  0.278969  0.290751  0.292373  0.310053
TK       MSE  -         0.137544  0.120727  0.123537  0.123710  0.126293
TK       AB   -         0.284469  0.273705  0.279740  0.280318  0.285400
n = 80, m = 76, k = 30, Scheme I:
Lindley  MSE  0.161044  0.153009  0.116763  0.119038  0.159944  0.162030
Lindley  AB   0.311460  0.301180  0.281434  0.284181  0.330128  0.333800
Lindley  MSE  -         0.135199  0.102043  0.111867  0.154140  0.159295
Lindley  AB   -         0.293265  0.265156  0.275608  0.328299  0.332644
TK       MSE  -         0.158301  0.136716  0.141035  0.142355  0.145560
TK       AB   -         0.303930  0.292881  0.302139  0.297320  0.304800
TK       MSE  -         0.133585  0.115898  0.123867  0.123895  0.124198
TK       AB   -         0.287951  0.272650  0.285000  0.282525  0.286325
n = 80, m = 76, k = 30, Scheme II:
Lindley  MSE  0.161795  0.153771  0.111200  0.114567  0.158889  0.161932
Lindley  AB   0.312143  0.299332  0.272651  0.279835  0.330389  0.333344
Lindley  MSE  -         0.130638  0.101022  0.105920  0.144542  0.151853
Lindley  AB   -         0.281231  0.262691  0.268349  0.310455  0.321634
TK       MSE  -         0.153926  0.127973  0.129349  0.129577  0.137293
TK       AB   -         0.303576  0.287272  0.290564  0.280717  0.292569
TK       MSE  -         0.128546  0.115852  0.119988  0.122094  0.127185
TK       AB   -         0.276583  0.271327  0.278617  0.279357  0.287922
n = 80, m = 76, k = 30, Scheme III:
Lindley  MSE  0.161182  0.151934  0.107491  0.113610  0.152346  0.155006
Lindley  AB   0.301764  0.299076  0.269735  0.278473  0.322370  0.326885
Lindley  MSE  -         0.129530  0.103948  0.106640  0.140239  0.147796
Lindley  AB   -         0.281968  0.265223  0.272102  0.309017  0.320355
TK       MSE  -         0.146521  0.125052  0.133339  0.136219  0.151772
TK       AB   -         0.301641  0.285029  0.291321  0.291023  0.301340
TK       MSE  -         0.134788  0.118289  0.123377  0.120359  0.124732
TK       AB   -         0.282284  0.277166  0.285947  0.269854  0.281490
n = 100, m = 82, k = 30, Scheme I:
Lindley  MSE  0.153310  0.145301  0.109663  0.116788  0.149552  0.156306
Lindley  AB   0.299369  0.296725  0.274491  0.281803  0.321317  0.328219
Lindley  MSE  -         0.129155  0.099731  0.108994  0.146049  0.147860
Lindley  AB   -         0.279922  0.262859  0.275537  0.314855  0.315498
TK       MSE  -         0.147623  0.126485  0.130951  0.128531  0.131542
TK       AB   -         0.292822  0.283461  0.288492  0.288371  0.290200
TK       MSE  -         0.125685  0.115185  0.118393  0.117438  0.121412
TK       AB   -         0.278130  0.270371  0.277522  0.271197  0.281743
n = 100, m = 82, k = 30, Scheme II:
Lindley  MSE  0.154574  0.140626  0.100325  0.108500  0.144246  0.145797
Lindley  AB   0.299583  0.283391  0.263969  0.273716  0.313823  0.315593
Lindley  MSE  -         0.127260  0.096169  0.101876  0.135626  0.140571
Lindley  AB   -         0.275750  0.258034  0.263246  0.308420  0.314007
TK       MSE  -         0.138403  0.122640  0.125227  0.127564  0.128041
TK       AB   -         0.287158  0.276862  0.279997  0.281649  0.283290
TK       MSE  -         0.124515  0.114230  0.116840  0.121398  0.126292
TK       AB   -         0.277029  0.266471  0.277677  0.277725  0.278306
n = 100, m = 82, k = 30, Scheme III:
Lindley  MSE  0.167441  0.142744  0.104775  0.108659  0.149072  0.154847
Lindley  AB   0.307912  0.289845  0.271904  0.276128  0.326706  0.333213
Lindley  MSE  -         0.125183  0.096211  0.104054  0.138582  0.142786
Lindley  AB   -         0.273730  0.254977  0.268693  0.312299  0.321215
TK       MSE  -         0.145117  0.116617  0.122610  0.129743  0.136671
TK       AB   -         0.288590  0.267895  0.280975  0.284886  0.297167
TK       MSE  -         0.132132  0.117712  0.122337  0.118227  0.125430
TK       AB   -         0.282436  0.261177  0.278052  0.268608  0.283806
n = 100, m = 82, k = 45, Scheme I:
Lindley  MSE  0.143500  0.136130  0.102379  0.112711  0.145897  0.151619
Lindley  AB   0.294335  0.295757  0.262240  0.277892  0.314799  0.322754
Lindley  MSE  -         0.120029  0.096214  0.106596  0.141639  0.147179
Lindley  AB   -         0.270996  0.258052  0.272952  0.315186  0.316920
TK       MSE  -         0.137385  0.125236  0.128988  0.123161  0.125961
TK       AB   -         0.292469  0.279982  0.283197  0.273377  0.280845
TK       MSE  -         0.121384  0.112881  0.114619  0.112884  0.117658
TK       AB   -         0.272890  0.265296  0.270659  0.265413  0.273428
n = 100, m = 82, k = 45, Scheme II:
Lindley  MSE  0.149713  0.137068  0.100365  0.104634  0.135397  0.139903
Lindley  AB   0.296191  0.289884  0.258184  0.267517  0.300554  0.313164
Lindley  MSE  -         0.125390  0.088489  0.100022  0.129176  0.137732
Lindley  AB   -         0.273556  0.245052  0.263698  0.296608  0.308541
TK       MSE  -         0.134871  0.116786  0.123637  0.124978  0.127771
TK       AB   -         0.283711  0.270975  0.282927  0.277271  0.283818
TK       MSE  -         0.123139  0.111609  0.113592  0.117852  0.123638
TK       AB   -         0.270458  0.265015  0.271183  0.266431  0.277810
n = 100, m = 82, k = 45, Scheme III:
Lindley  MSE  0.162169  0.140610  0.103915  0.109819  0.143020  0.151550
Lindley  AB   0.306268  0.288011  0.265563  0.279020  0.322923  0.326686
Lindley  MSE  -         0.123127  0.094063  0.097745  0.137214  0.140252
Lindley  AB   -         0.274663  0.253412  0.260985  0.306914  0.318428
TK       MSE  -         0.127541  0.116658  0.119159  0.128263  0.135881
TK       AB   -         0.278944  0.265494  0.278574  0.288084  0.290577
TK       MSE  -         0.124970  0.111977  0.119678  0.118157  0.123543
TK       AB   -         0.278475  0.269041  0.276553  0.269595  0.277491
n = 100, m = 94, k = 30, Scheme I:
Lindley  MSE  0.145292  0.128537  0.101165  0.103913  0.130634  0.144583
Lindley  AB   0.295310  0.278937  0.260241  0.267477  0.301540  0.315142
Lindley  MSE  -         0.116543  0.087605  0.093752  0.121408  0.125169
Lindley  AB   -         0.265539  0.241199  0.253960  0.288968  0.292778
TK       MSE  -         0.126814  0.111004  0.119686  0.115347  0.117204
TK       AB   -         0.273766  0.262243  0.272892  0.266599  0.270314
TK       MSE  -         0.115467  0.100148  0.103070  0.106509  0.107580
TK       AB   -         0.268378  0.252260  0.258183  0.260890  0.262149
n = 100, m = 94, k = 30, Scheme II:
Lindley  MSE  0.135628  0.121336  0.094265  0.098833  0.124486  0.130945
Lindley  AB   0.278024  0.271159  0.254310  0.256069  0.290725  0.301405
Lindley  MSE  -         0.116163  0.082077  0.094949  0.118768  0.123681
Lindley  AB   -         0.265434  0.231296  0.255256  0.285416  0.291467
TK       MSE  -         0.122309  0.113803  0.122946  0.109296  0.111378
TK       AB   -         0.273799  0.266610  0.276905  0.261515  0.267054
TK       MSE  -         0.110301  0.103668  0.105035  0.103877  0.109838
TK       AB   -         0.258339  0.254095  0.258928  0.258267  0.262114
n = 100, m = 94, k = 30, Scheme III:
Lindley  MSE  0.137106  0.126933  0.088083  0.091224  0.130196  0.133326
Lindley  AB   0.279503  0.273917  0.244921  0.245288  0.297581  0.306710
Lindley  MSE  -         0.106781  0.085160  0.089138  0.116780  0.123435
Lindley  AB   -         0.254896  0.239704  0.247836  0.279106  0.293215
TK       MSE  -         0.121616  0.112660  0.113223  0.109145  0.115564
TK       AB   -         0.270790  0.266725  0.266141  0.262436  0.265160
TK       MSE  -         0.116904  0.106376  0.107909  0.112116  0.113828
TK       AB   -         0.267235  0.256490  0.257001  0.258203  0.260865
n = 100, m = 94, k = 45, Scheme I:
Lindley  MSE  0.138448  0.107788  0.097626  0.101991  0.126968  0.134152
Lindley  AB   0.277945  0.250081  0.256038  0.263544  0.293315  0.298622
Lindley  MSE  -         0.109824  0.084894  0.091513  0.120331  0.121207
Lindley  AB   -         0.258606  0.240529  0.247569  0.287617  0.291623
TK       MSE  -         0.118166  0.108389  0.111368  0.109802  0.113726
TK       AB   -         0.268184  0.263430  0.269003  0.258699  0.274274
TK       MSE  -         0.092468  0.096069  0.101331  0.099836  0.105263
TK       AB   -         0.242037  0.247773  0.254265  0.256141  0.256763
n = 100, m = 94, k = 45, Scheme II:
Lindley  MSE  0.132378  0.109593  0.092566  0.099696  0.114651  0.127986
Lindley  AB   0.280257  0.266184  0.246536  0.259982  0.277877  0.297383
Lindley  MSE  -         0.102115  0.080137  0.087635  0.116177  0.117395
Lindley  AB   -         0.251239  0.232838  0.242800  0.277206  0.280823
TK       MSE  -         0.116214  0.108690  0.111429  0.106700  0.108722
TK       AB   -         0.264074  0.259276  0.266296  0.258925  0.260910
TK       MSE  -         0.098265  0.092258  0.098959  0.102296  0.107582
TK       AB   -         0.248720  0.241530  0.254006  0.248100  0.258353
n = 100, m = 94, k = 45, Scheme III:
Lindley  MSE  0.128764  0.106079  0.087320  0.088729  0.118343  0.121471
Lindley  AB   0.275581  0.253820  0.241452  0.240844  0.280548  0.288014
Lindley  MSE  -         0.104581  0.083902  0.088412  0.110973  0.119573
Lindley  AB   -         0.251543  0.236843  0.246940  0.278984  0.285724
TK       MSE  -         0.109023  0.109590  0.112305  0.114869  0.112612
TK       AB   -         0.255856  0.257675  0.270349  0.268982  0.268540
TK       MSE  -         0.105243  0.104414  0.105186  0.107633  0.109181
TK       AB   -         0.252713  0.254990  0.259515  0.257580  0.262889
Table A3. MSEs and ABs of different estimates of R ( t ) when T = 0.9.
Method   Metric  MLE       SEL       LL (p = 0.2)  LL (p = 0.8)  EL (q = 0.2)  EL (q = 0.8)
n = 80, m = 72, k = 20, Scheme I:
Lindley  MSE  0.000680  0.000732  0.000948  0.000929  0.000805  0.000747
Lindley  AB   0.021221  0.021300  0.023400  0.023693  0.021764  0.021364
Lindley  MSE  -         0.000661  0.000910  0.000870  0.000760  0.000709
Lindley  AB   -         0.020059  0.023549  0.022733  0.021204  0.020385
TK       MSE  -         0.000936  0.000786  0.000758  0.000765  0.000728
TK       AB   -         0.023562  0.021801  0.021501  0.021273  0.020447
TK       MSE  -         0.000872  0.000722  0.000705  0.000690  0.000678
TK       AB   -         0.023000  0.021268  0.020911  0.020531  0.020289
n = 80, m = 72, k = 20, Scheme II:
Lindley  MSE  0.000675  0.000715  0.000956  0.000924  0.000766  0.000712
Lindley  AB   0.020644  0.020700  0.023929  0.023428  0.021223  0.020658
Lindley  MSE  -         0.000689  0.000831  0.000819  0.000683  0.000621
Lindley  AB   -         0.020402  0.022467  0.021821  0.020360  0.019234
TK       MSE  -         0.000870  0.000728  0.000724  0.000742  0.000701
TK       AB   -         0.022784  0.021290  0.020705  0.020560  0.020320
TK       MSE  -         0.000775  0.000693  0.000689  0.000680  0.000673
TK       AB   -         0.021775  0.020597  0.020647  0.020372  0.020930
n = 80, m = 72, k = 20, Scheme III:
Lindley  MSE  0.000665  0.000702  0.000869  0.000783  0.000737  0.000711
Lindley  AB   0.020451  0.020831  0.022992  0.022209  0.020881  0.020408
Lindley  MSE  -         0.000666  0.000815  0.000777  0.000736  0.000668
Lindley  AB   -         0.020126  0.022329  0.021622  0.020929  0.019818
TK       MSE  -         0.000801  0.000714  0.000683  0.000780  0.000737
TK       AB   -         0.022231  0.020929  0.020802  0.021767  0.021373
TK       MSE  -         0.000781  0.000662  0.000654  0.000731  0.000671
TK       AB   -         0.021906  0.020212  0.020332  0.021428  0.020791
n = 80, m = 72, k = 30, Scheme I:
Lindley  MSE  0.000677  0.000725  0.000924  0.000902  0.000771  0.000696
Lindley  AB   0.020729  0.020828  0.023448  0.023393  0.021321  0.020477
Lindley  MSE  -         0.000652  0.000904  0.000828  0.000745  0.000638
Lindley  AB   -         0.019674  0.023443  0.022439  0.020989  0.018967
TK       MSE  -         0.000860  0.000776  0.000753  0.000717  0.000681
TK       AB   -         0.022688  0.021543  0.021578  0.020461  0.019850
TK       MSE  -         0.000822  0.000705  0.000683  0.000688  0.000635
TK       AB   -         0.022004  0.020932  0.020610  0.020710  0.019971
n = 80, m = 72, k = 30, Scheme II:
Lindley  MSE  0.000673  0.000709  0.000921  0.000902  0.000762  0.000701
Lindley  AB   0.020854  0.020803  0.023353  0.023191  0.021567  0.020409
Lindley  MSE  -         0.000675  0.000814  0.000803  0.000648  0.000613
Lindley  AB   -         0.020321  0.022145  0.022206  0.019319  0.019369
TK       MSE  -         0.000789  0.000709  0.000696  0.000733  0.000673
TK       AB   -         0.021696  0.020700  0.020576  0.020667  0.020036
TK       MSE  -         0.000747  0.000683  0.000665  0.000666  0.000658
TK       AB   -         0.021355  0.020473  0.020097  0.020487  0.020202
n = 80, m = 72, k = 30, Scheme III:
Lindley  MSE  0.000660  0.000701  0.000863  0.000786  0.000705  0.000694
Lindley  AB   0.020289  0.020702  0.022739  0.021965  0.020427  0.020374
Lindley  MSE  -         0.000645  0.000800  0.000776  0.000667  0.000649
Lindley  AB   -         0.019935  0.022084  0.021785  0.019299  0.018283
TK       MSE  -         0.000761  0.000695  0.000651  0.000729  0.000698
TK       AB   -         0.021680  0.020592  0.019512  0.021466  0.021139
TK       MSE  -         0.000713  0.000658  0.000649  0.000682  0.000650
TK       AB   -         0.021065  0.020173  0.019905  0.020716  0.020204
n = 80, m = 76, k = 20, Scheme I:
Lindley  MSE  0.000653  0.000694  0.000904  0.000832  0.000768  0.000655
Lindley  AB   0.020315  0.021077  0.023445  0.022338  0.021549  0.020125
Lindley  MSE  -         0.000634  0.000840  0.000824  0.000709  0.000611
Lindley  AB   -         0.019584  0.022508  0.022294  0.020615  0.018883
TK       MSE  -         0.000782  0.000727  0.000697  0.000702  0.000671
TK       AB   -         0.022318  0.021386  0.020711  0.020748  0.019608
TK       MSE  -         0.000801  0.000666  0.000639  0.000667  0.000626
TK       AB   -         0.022149  0.020431  0.019431  0.020438  0.019762
n = 80, m = 76, k = 20, Scheme II:
Lindley  MSE  0.000644  0.000705  0.000855  0.000802  0.000725  0.000682
Lindley  AB   0.019746  0.021068  0.022487  0.022045  0.020959  0.020161
Lindley  MSE  -         0.000641  0.000768  0.000757  0.000636  0.000610
Lindley  AB   -         0.019607  0.021413  0.021487  0.019237  0.019280
TK       MSE  -         0.000770  0.000681  0.000678  0.000724  0.000653
TK       AB   -         0.021766  0.020400  0.020485  0.020478  0.019413
TK       MSE  -         0.000738  0.000667  0.000641  0.000652  0.000623
TK       AB   -         0.021486  0.020214  0.020009  0.020327  0.019468
n = 80, m = 76, k = 20, Scheme III:
Lindley  MSE  0.000621  0.000671  0.000847  0.000752  0.000693  0.000680
Lindley  AB   0.019875  0.019882  0.022530  0.021327  0.020294  0.020042
Lindley  MSE  -         0.000635  0.000776  0.000740  0.000665  0.000594
Lindley  AB   -         0.019621  0.021799  0.021147  0.020100  0.018962
TK       MSE  -         0.000749  0.000699  0.000678  0.000687  0.000653
TK       AB   -         0.021223  0.020906  0.020528  0.020929  0.020301
TK       MSE  -         0.000708  0.000646  0.000641  0.000672  0.000638
TK       AB   -         0.020335  0.020315  0.019949  0.020411  0.019931
n = 80, m = 76, k = 30, Scheme I:
Lindley  MSE  0.000615  0.000636  0.000880  0.000843  0.000732  0.000634
Lindley  AB   0.019902  0.019840  0.023224  0.022258  0.020581  0.019276
Lindley  MSE  -         0.000631  0.000806  0.000761  0.000676  0.000605
Lindley  AB   -         0.019871  0.022081  0.021549  0.019964  0.019266
TK       MSE  -         0.000757  0.000723  0.000667  0.000681  0.000648
TK       AB   -         0.021585  0.021283  0.019895  0.020203  0.019780
TK       MSE  -         0.000786  0.000644  0.000620  0.000643  0.000605
TK       AB   -         0.021498  0.019961  0.019965  0.019987  0.019543
n = 80, m = 76, k = 30, Scheme II:
Lindley  MSE  0.000615  0.000692  0.000841  0.000815  0.000675  0.000647
Lindley  AB   0.019846  0.020606  0.022456  0.022072  0.020072  0.019722
Lindley  MSE  -         0.000627  0.000755  0.000746  0.000612  0.000591
Lindley  AB   -         0.019496  0.021489  0.021290  0.019024  0.018930
TK       MSE  -         0.000766  0.000674  0.000671  0.000708  0.000636
TK       AB   -         0.021311  0.020548  0.020687  0.020480  0.019429
TK       MSE  -         0.000694  0.000660  0.000617  0.000632  0.000615
TK       AB   -         0.020344  0.020437  0.019428  0.019844  0.019475
n = 80, m = 76, k = 30, Scheme III:
Lindley  MSE  0.000611  0.000635  0.000801  0.000791  0.000684  0.000634
Lindley  AB   0.019298  0.019770  0.021973  0.021822  0.020058  0.019568
Lindley  MSE  -         0.000616  0.000746  0.000738  0.000617  0.000584
Lindley  AB   -         0.019645  0.021032  0.020939  0.019341  0.018644
TK       MSE  -         0.000729  0.000661  0.000656  0.000682  0.000646
TK       AB   -         0.021096  0.020132  0.019980  0.020641  0.020678
TK       MSE  -         0.000665  0.000623  0.000614  0.000643  0.000621
TK       AB   -         0.020242  0.019498  0.019681  0.019857  0.019696
n = 100, m = 82, k = 30, Scheme I:
Lindley  MSE  0.000570  0.000626  0.000816  0.000795  0.000663  0.000606
Lindley  AB   0.019244  0.019619  0.022522  0.022213  0.019943  0.019300
Lindley  MSE  -         0.000613  0.000787  0.000748  0.000637  0.000554
Lindley  AB   -         0.019475  0.021835  0.021419  0.019586  0.018017
TK       MSE  -         0.000702  0.000659  0.000640  0.000659  0.000622
TK       AB   -         0.020832  0.020162  0.019371  0.019672  0.019578
TK       MSE  -         0.000688  0.000630  0.000609  0.000635  0.000581
TK       AB   -         0.020424  0.019536  0.019674  0.019614  0.019052
n = 100, m = 82, k = 30, Scheme II:
Lindley  MSE  0.000575  0.000673  0.000751  0.000716  0.000639  0.000604
Lindley  AB   0.019300  0.020583  0.021177  0.020981  0.019689  0.019076
Lindley  MSE  -         0.000612  0.000718  0.000715  0.000589  0.000545
Lindley  AB   -         0.019590  0.020796  0.020832  0.018479  0.018247
TK       MSE  -         0.000624  0.000646  0.000594  0.000602  0.000548
TK       AB   -         0.019774  0.020069  0.018964  0.019290  0.018183
TK       MSE  -         0.000622  0.000597  0.000562  0.000581  0.000521
TK       AB   -         0.019444  0.019335  0.018876  0.018947  0.018310
n = 100, m = 82, k = 30, Scheme III:
Lindley  MSE  0.000566  0.000621  0.000801  0.000782  0.000676  0.000631
Lindley  AB   0.018845  0.020090  0.022612  0.022132  0.019984  0.019142
Lindley  MSE  -         0.000610  0.000744  0.000736  0.000637  0.000557
Lindley  AB   -         0.019361  0.021549  0.021670  0.019870  0.017859
TK       MSE  -         0.000658  0.000655  0.000620  0.000659  0.000634
TK       AB   -         0.020421  0.020155  0.020033  0.020108  0.019788
TK       MSE  -         0.000639  0.000602  0.000601  0.000627  0.000608
TK       AB   -         0.019989  0.019290  0.019432  0.019942  0.019825
n = 100, m = 82, k = 45, Scheme I:
Lindley  MSE  0.000550  0.000592  0.000795  0.000786  0.000631  0.000599
Lindley  AB   0.018817  0.019087  0.022107  0.021457  0.019449  0.018992
Lindley  MSE  -         0.000595  0.000770  0.000768  0.000616  0.000530
Lindley  AB   -         0.019444  0.021324  0.021347  0.019236  0.017887
TK       MSE  -         0.000661  0.000651  0.000622  0.000644  0.000613
TK       AB   -         0.019928  0.020014  0.019453  0.019574  0.019078
TK       MSE  -         0.000658  0.000612  0.000604  0.000582  0.000569
TK       AB   -         0.019910  0.019707  0.019247  0.019064  0.019261
n = 100, m = 82, k = 45, Scheme II:
Lindley  MSE  0.000523  0.000671  0.000737  0.000714  0.000631  0.000595
Lindley  AB   0.018444  0.019915  0.020792  0.020860  0.019666  0.019158
Lindley  MSE  -         0.000581  0.000715  0.000709  0.000561  0.000528
Lindley  AB   -         0.019226  0.020832  0.020940  0.018287  0.017826
TK       MSE  -         0.000589  0.000598  0.000591  0.000598  0.000585
TK       AB   -         0.019083  0.019322  0.019312  0.019011  0.018811
TK       MSE  -         0.000574  0.000578  0.000511  0.000578  0.000511
TK       AB   -         0.018596  0.019114  0.017982  0.019206  0.018079
n = 100, m = 82, k = 45, Scheme III:
Lindley  MSE  0.000539  0.000618  0.000783  0.000757  0.000663  0.000615
Lindley  AB   0.018597  0.019525  0.022204  0.021531  0.019797  0.019171
Lindley  MSE  -         0.000597  0.000736  0.000731  0.000635  0.000542
Lindley  AB   -         0.019321  0.021223  0.021196  0.019090  0.017538
TK       MSE  -         0.000650  0.000645  0.000617  0.000642  0.000610
TK       AB   -         0.020041  0.020339  0.019763  0.019904  0.019756
TK       MSE  -         0.000631  0.000579  0.000570  0.000609  0.000602
TK       AB   -         0.019749  0.019151  0.018960  0.019855  0.019514
n = 100, m = 94, k = 30, Scheme I:
Lindley  MSE  0.000522  0.000545  0.000692  0.000668  0.000562  0.000547
Lindley  AB   0.018529  0.018453  0.020509  0.020163  0.018294  0.018571
Lindley  MSE  -         0.000531  0.000650  0.000626  0.000548  0.000527
Lindley  AB   -         0.017891  0.020160  0.019099  0.018346  0.017668
TK       MSE  -         0.000640  0.000618  0.000602  0.000623  0.000490
TK       AB   -         0.019697  0.019483  0.019470  0.019186  0.017301
TK       MSE  -         0.000609  0.000567  0.000554  0.000562  0.000506
TK       AB   -         0.019095  0.018696  0.018325  0.018657  0.017774
n = 100, m = 94, k = 30, Scheme II:
Lindley  MSE  0.000482  0.000577  0.000673  0.000661  0.000608  0.000586
Lindley  AB   0.017448  0.018630  0.019999  0.019995  0.019440  0.018809
Lindley  MSE  -         0.000528  0.000645  0.000638  0.000552  0.000505
Lindley  AB   -         0.018033  0.019869  0.019285  0.018197  0.017324
TK       MSE  -         0.000580  0.000558  0.000518  0.000569  0.000545
TK       AB   -         0.018751  0.018459  0.017972  0.018244  0.017945
TK       MSE  -         0.000555  0.000518  0.000493  0.000553  0.000491
TK       AB   -         0.018425  0.018058  0.017274  0.018342  0.017762
n = 100, m = 94, k = 30, Scheme III:
Lindley  MSE  0.000507  0.000539  0.000636  0.000619  0.000583  0.000561
Lindley  AB   0.017852  0.018343  0.019598  0.019043  0.018788  0.018717
Lindley  MSE  -         0.000524  0.000610  0.000592  0.000549  0.000487
Lindley  AB   -         0.018098  0.019172  0.018937  0.017739  0.017465
TK       MSE  -         0.000567  0.000546  0.000521  0.000571  0.000563
TK       AB   -         0.018663  0.018532  0.018198  0.018791  0.018845
TK       MSE  -         0.000547  0.000530  0.000508  0.000533  0.000521
TK       AB   -         0.018295  0.018022  0.017656  0.018278  0.017970
n = 100, m = 94, k = 45, Scheme I:
Lindley  MSE  0.000489  0.000518  0.000665  0.000643  0.000532  0.000469
Lindley  AB   0.017770  0.017943  0.020333  0.019843  0.017780  0.016934
Lindley  MSE  -         0.000494  0.000641  0.000608  0.000517  0.000485
Lindley  AB   -         0.017468  0.019771  0.019158  0.017413  0.017298
TK       MSE  -         0.000580  0.000571  0.000561  0.000564  0.000479
TK       AB   -         0.018467  0.018827  0.018295  0.018545  0.017103
TK       MSE  -         0.000555  0.000528  0.000499  0.000542  0.000482
TK       AB   -         0.018141  0.018012  0.017684  0.018459  0.017565
n = 100, m = 94, k = 45, Scheme II:
Lindley  MSE  0.000473  0.000513  0.000626  0.000582  0.000576  0.000501
Lindley  AB   0.017216  0.017991  0.019267  0.018671  0.018518  0.017220
Lindley  MSE  -         0.000510  0.000612  0.000552  0.000523  0.000460
Lindley  AB   -         0.017570  0.019417  0.018322  0.018026  0.016693
TK       MSE  -         0.000541  0.000539  0.000525  0.000540  0.000516
TK       AB   -         0.018165  0.018285  0.018130  0.018007  0.017769
TK       MSE  -         0.000532  0.000489  0.000479  0.000517  0.000483
TK       AB   -         0.018014  0.017519  0.017345  0.017955  0.017515
n = 100, m = 94, k = 45, Scheme III:
Lindley  MSE  0.000495  0.000524  0.000614  0.000610  0.000575  0.000478
Lindley  AB   0.017874  0.018073  0.018882  0.019370  0.018901  0.017097
Lindley  MSE  -         0.000518  0.000601  0.000585  0.000542  0.000467
Lindley  AB   -         0.018046  0.018848  0.018760  0.018152  0.017012
TK       MSE  -         0.000548  0.000536  0.000497  0.000550  0.000518
TK       AB   -         0.018363  0.018156  0.017500  0.018459  0.018336
TK       MSE  -         0.000540  0.000504  0.000477  0.000521  0.000509
TK       AB   -         0.018249  0.017757  0.017168  0.017743  0.017770
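The MSE and AB columns in the appendix tables are standard Monte Carlo summaries over repeated censored samples. A minimal sketch of how such a pair of columns can be computed from replicated estimates (the replication array below is simulated filler, not the authors' output):

```python
import numpy as np

def mse_and_ab(estimates, true_value):
    """Monte Carlo mean squared error and absolute bias of replicated
    estimates of a quantity (here R(t)) against its true value."""
    est = np.asarray(estimates)
    mse = np.mean((est - true_value) ** 2)
    ab = np.mean(np.abs(est - true_value))
    return mse, ab

# hypothetical usage: 1000 replications of some estimator of R(t)
rng = np.random.default_rng(2020)
replicated = 0.5 + 0.025 * rng.standard_normal(1000)  # stand-in estimates
print(mse_and_ab(replicated, true_value=0.5))
```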

References

  1. Contreras-Reyes, J.E.; Quintero, F.O.L.; Wiff, R. Bayesian modeling of individual growth variability using back-calculation: Application to pink cusk-eel (Genypterus blacodes) off Chile. Ecol. Model. 2018, 385, 145–153.
  2. Kumaraswamy, P. A generalized probability density function for double-bounded random processes. J. Hydrol. 1980, 46, 79–88.
  3. Mitnik, P. New properties of the Kumaraswamy distribution. Commun. Stat. 2013, 42, 741–755.
  4. Fletcher, S.G.; Ponnambalam, K. Estimation of reservoir yield and storage distribution using moments analysis. J. Hydrol. 1996, 182, 259–275.
  5. Sundar, V.; Subbiah, K. Application of double bounded probability density function for analysis of ocean waves. Ocean Eng. 1989, 16, 193–200.
  6. Ponnambalam, K.; Seifi, A. Probabilistic design of systems with general distributions of parameters. Int. J. Circuit Theory Appl. 2001, 29, 527–536.
  7. Seifi, A.; Ponnambalam, K. Maximization of manufacturing yield of systems with arbitrary distributions of component values. Ann. Oper. Res. 2000, 99, 373–383.
  8. Lemonte, A.J. Improved point estimation for the Kumaraswamy distribution. J. Stat. Comput. Simul. 2011, 81, 1971–1982.
  9. Ghosh, I.; Nadarajah, S. On the Bayesian inference of Kumaraswamy distributions based on censored samples. Commun. Stat. 2017, 46, 8760–8777.
  10. Sultana, F.; Tripathi, Y.M.; Rastogi, M.K.; Wu, S.-J. Parameter estimation for the Kumaraswamy distribution based on hybrid censoring. Am. J. Math. Manag. Sci. 2018, 37, 243–261.
  11. Sultana, F.; Tripathi, Y.M.; Wu, S.-J.; Sen, T. Inference for Kumaraswamy distribution based on Type-I progressive hybrid censoring. Ann. Data Sci. 2020, 1–25.
  12. Meeker, W.Q.; Escobar, L.A. Statistical Methods for Reliability Data; Wiley: New York, NY, USA, 1998.
  13. Cohen, A.C. Progressively censored samples in life testing. Technometrics 1963, 5, 327–339.
  14. Kundu, D.; Joarder, A. Analysis of Type-II progressively hybrid censored data. Comput. Stat. Data Anal. 2006, 50, 2509–2528.
  15. Cho, Y.; Sun, H.; Lee, K. An estimation of the entropy for a Rayleigh distribution based on doubly-generalized Type-II hybrid censored samples. Entropy 2014, 16, 3655–3669.
  16. Kundu, D. On hybrid censored Weibull distribution. J. Stat. Plann. Inference 2007, 137, 2127–2142.
  17. Zellner, A. Bayesian estimation and prediction using asymmetric loss functions. J. Am. Stat. Assoc. 1986, 81, 446–451.
  18. Kundu, D.; Pradhan, B. Bayesian inference and life testing plans for generalized exponential distribution. Sci. China 2009, 52, 1373–1388.
  19. Lindley, D.V. Approximate Bayesian methods. Trabajos de Estadistica y de Investigacion Operativa 1980, 31, 223–245.
  20. Tierney, L.; Kadane, J.B. Accurate approximations for posterior moments and marginal densities. J. Am. Stat. Assoc. 1986, 81, 82–86.
  21. Howlader, H.A.; Hossain, A.M. Bayesian survival estimation of Pareto distribution of the second kind based on failure-censored data. Comput. Stat. Data Anal. 2002, 38, 301–314.
  22. Balakrishnan, N.; Sandhu, R.A. A simple simulational algorithm for generating progressive Type-II censored samples. Am. Stat. 1995, 49, 229–230.
  23. Kohansal, A. On estimation of reliability in a multicomponent stress-strength model for a Kumaraswamy distribution based on progressively censored sample. Stat. Pap. 2017, 60, 2185–2224.
Figure 1. pdf of K( α , β ).
Figure 2. cdf of K( α , β ).
Figure 3. Generalized progressive hybrid censoring scheme.
Figure 4. Histogram of the real dataset.
Figure 5. Empirical Cumulative Distributions. Left panel: K( α , β ); middle panel: EED; right panel: LD.
Figure 6. Quantile-Quantile Plots. Left panel: K( α , β ); middle panel: EED; right panel: LD.
Table 1. The dataset of monthly water capacity of the Shasta reservoir.
0.667157  0.287785  0.126977  0.768563  0.703119  0.729986  0.767135
0.811159  0.829569  0.726164  0.423813  0.715158  0.640395  0.363359
0.463726  0.371904  0.291172  0.414087  0.650691  0.538082  0.744881
0.722613  0.561238  0.813964  0.709025  0.668612  0.524947  0.605979
0.715850  0.529518  0.824860  0.742025  0.468782  0.345075  0.425334
0.767070  0.679829  0.613911  0.461618  0.294834  0.392917  0.688100
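For readers who want to reproduce the fit reported below, the K( α , β ) log-likelihood implied by the pdf given in the Introduction can be maximized numerically over these 42 observations. A minimal sketch (the optimizer, bounds and starting values are our choices, not prescribed by the paper):

```python
import numpy as np
from scipy.optimize import minimize

# the 42 monthly capacity proportions of Table 1
x = np.array([
    0.667157, 0.287785, 0.126977, 0.768563, 0.703119, 0.729986, 0.767135,
    0.811159, 0.829569, 0.726164, 0.423813, 0.715158, 0.640395, 0.363359,
    0.463726, 0.371904, 0.291172, 0.414087, 0.650691, 0.538082, 0.744881,
    0.722613, 0.561238, 0.813964, 0.709025, 0.668612, 0.524947, 0.605979,
    0.715850, 0.529518, 0.824860, 0.742025, 0.468782, 0.345075, 0.425334,
    0.767070, 0.679829, 0.613911, 0.461618, 0.294834, 0.392917, 0.688100])

def neg_log_lik(theta):
    a, b = theta
    # log f(x) = log a + log b + (a-1) log x + (b-1) log(1 - x^a)
    return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-x ** a))

fit = minimize(neg_log_lik, x0=[1.0, 1.0], method="L-BFGS-B",
               bounds=[(1e-6, None), (1e-6, None)])
a_hat, b_hat = fit.x
print(a_hat, b_hat, -fit.fun)  # MLEs and the maximized log-likelihood
```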
Table 2. Goodness of different fitted distributions for real data.
Distribution  −ln(L)    AIC       AICc      BIC       K-S     p-Value
K( α , β )    −15.6310  −27.2619  −26.9543  −23.7866  0.1905  0.4355
EED           −6.1639   −8.3278   −8.0201   −4.8524   0.2381  0.1859
LD            19.5178   43.0357   43.3434   46.5110   0.3571  0.0089
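The criteria in Table 2 follow the standard definitions AIC = 2k − 2 ln L, AICc = AIC + 2k(k + 1)/(n − k − 1) and BIC = k ln n − 2 ln L, with k parameters and n = 42 observations; plugging in ln L = 15.6310 and k = 2 for K( α , β ) reproduces the tabulated −27.2619, −26.9543 and −23.7866. A sketch of the computation:

```python
import numpy as np

def information_criteria(loglik, k, n):
    """AIC, AICc and BIC from a maximized log-likelihood."""
    aic = 2 * k - 2 * loglik
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = k * np.log(n) - 2 * loglik
    return aic, aicc, bic

# K(alpha, beta) row of Table 2: ln L = 15.6310, k = 2, n = 42
print(information_criteria(15.6310, 2, 42))  # ~ (-27.262, -26.954, -23.787)

# The K-S column can be checked against the fitted cdf, e.g.
#   from scipy.stats import kstest
#   kstest(x, lambda t: 1.0 - (1.0 - t ** a_hat) ** b_hat)
# with x, a_hat, b_hat taken from the maximum likelihood sketch above.
```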
Table 3. The ordered progressive Type-II censored dataset.
0.126977  0.291172  0.345075  0.371904  0.414087  0.425334  0.463726
0.524947  0.538082  0.605979  0.640395  0.667157  0.679829  0.703119
0.715158  0.722613  0.729986  0.744881  0.767135  0.811159  0.824860
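Table 3 is a progressive Type-II censored subset of the n = 42 observations of Table 1, retaining m = 21 failures. For simulation work, such samples can be generated directly with the algorithm of Balakrishnan and Sandhu [22]; a sketch for the K( α , β ) case (the parameter values and the one-removal-per-failure scheme below are illustrative placeholders, not the scheme behind Table 3):

```python
import numpy as np

def progressive_type2_sample(alpha, beta, R, rng):
    """Balakrishnan-Sandhu [22]: progressive Type-II censored sample of
    size m = len(R) from K(alpha, beta) with removal scheme R = (R_1, ..., R_m)."""
    m = len(R)
    W = rng.uniform(size=m)
    # V_i = W_i ** (1 / (i + R_m + R_{m-1} + ... + R_{m-i+1}))
    V = np.array([W[i] ** (1.0 / (i + 1 + sum(R[m - i - 1:])))
                  for i in range(m)])
    # U_i = 1 - V_m V_{m-1} ... V_{m-i+1}: uniform progressive order statistics
    U = np.array([1.0 - np.prod(V[m - i - 1:]) for i in range(m)])
    # invert the Kumaraswamy cdf: F^{-1}(u) = (1 - (1 - u)^{1/beta})^{1/alpha}
    return (1.0 - (1.0 - U) ** (1.0 / beta)) ** (1.0 / alpha)

rng = np.random.default_rng(1)
R = [1] * 21  # hypothetical scheme: remove one surviving unit at each failure
print(progressive_type2_sample(2.0, 5.0, R, rng))
```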
Table 4. Estimates of α , β and R for the real dataset.
Case  Parameter  Method   MLE       SEL       LL (p = 0.2)  LL (p = 0.8)  EL (q = 0.2)  EL (q = 0.8)
I     α          Lindley  3.258854  2.844667  1.527920      1.972349      1.768485      1.909330
                 TK       -         2.887395  2.739886      2.591068      2.773492      2.714300
      β          Lindley  2.163197  1.846337  0.194450      0.732673      0.865246      1.012581
                 TK       -         1.796699  1.583837      1.465362      1.601074      1.510759
      R          Lindley  0.341358  0.369478  0.493668      0.498473      0.518536      0.537528
                 TK       -         0.390156  0.378532      0.376020      0.377140      0.370281
II    α          Lindley  2.898993  2.556724  1.737647      1.926444      1.848018      1.916395
                 TK       -         2.545969  2.400696      2.266208      2.428631      2.367205
      β          Lindley  1.425335  1.254272  0.521980      0.635828      0.553893      0.640222
                 TK       -         1.181996  1.055188      1.000271      1.056085      0.997650
      R          Lindley  0.443953  0.465454  0.536116      0.536250      0.535292      0.534027
                 TK       -         0.488082  0.479212      0.476830      0.478343      0.473250
III   α          Lindley  3.145385  2.781607  2.022479      2.197471      2.110048      2.170816
                 TK       -         2.831447  2.696737      2.561594      2.726814      2.672566
      β          Lindley  1.706230  1.493470  0.751148      0.874202      0.801078      0.881050
                 TK       -         1.465403  1.329726      1.258389      1.336265      1.275308
      R          Lindley  0.115402  0.154236  0.226858      0.229274      0.260501      0.287326
                 TK       -         0.173029  0.152674      0.150803      0.149099      0.135818
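Each R entry in Table 4 is the plug-in value of the reliability function R(t) = (1 − t^α)^β given in the Introduction, evaluated at the corresponding parameter estimates. A one-line check (the mission time t = 0.75 below is our reverse-engineered guess: it reproduces the Case I maximum likelihood entry up to rounding, but the paper's own cut-off should be taken from the text):

```python
# reliability of K(alpha, beta): R(t) = (1 - t**alpha)**beta
reliability = lambda t, alpha, beta: (1.0 - t ** alpha) ** beta

# Case I MLEs from Table 4; t = 0.75 is an assumed mission time
print(reliability(0.75, 3.258854, 2.163197))  # ~ 0.3413, cf. 0.341358
```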
