BBest(y,m,method="MM")
BBest returns an object of class "BBest". The function summary (i.e., summary.BBest) can be used to obtain or print a summary of the results. The returned object includes, among other components, the parameter estimates p, phi and psi.

The BBest function estimates the parameters of a beta-binomial distribution for the given data. The estimation can be performed using two different approaches: the method of moments and maximum likelihood. The density function of a given observation \(y\) that follows a beta-binomial distribution with parameters \(m\), \(p\) and \(\phi\) is defined as
$$f(y)=\binom{m}{y}\frac{\Gamma(1/\phi)\,\Gamma(p/\phi+y)\,\Gamma((1-p)/\phi+m-y)}{\Gamma(1/\phi+m)\,\Gamma(p/\phi)\,\Gamma((1-p)/\phi)}.$$ The first and second order moments are defined as
$$E[y]=mp,$$
$$Var[y]=mp(1-p)\left[1+(m-1)\frac{\phi}{1+\phi}\right].$$ Hence, if \(y=(y_1,\ldots,y_n)\) is the given data, the method of moments estimators follow from the previous expressions as
$$p=\frac{E}{m},$$
$$\phi=\frac{V-mp(1-p)}{m^2p(1-p)-V},$$
where \(E\) is the sample mean and \(V\) is the sample variance of the data. On the other hand, the maximum likelihood estimation of both parameters consists of setting the derivatives of the log-likelihood, defined by the density function, with respect to each parameter equal to zero. An iterative algorithm is needed, as the score equations of the parameters depend on each other. The variance of the estimated probability parameter of the beta-binomial distribution is computed as the inverse of the Fisher information, i.e., the inverse of the negative second derivative of the log-likelihood, replacing \(p\) by its estimate.
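As a small illustration of the density above (a hand-written sketch, not the package's internal code; the helper name logdBB and the use of base R's lgamma and lchoose are choices made here), the log-density can be evaluated directly from the formula:

logdBB <- function(y, m, p, phi) {
  # log of the beta-binomial density given above, term by term
  lchoose(m, y) +
    lgamma(1/phi) + lgamma(p/phi + y) + lgamma((1 - p)/phi + m - y) -
    lgamma(1/phi + m) - lgamma(p/phi) - lgamma((1 - p)/phi)
}
sum(exp(logdBB(0:10, m = 10, p = 0.7, phi = 1.6)))  # sums to 1 over y = 0,...,m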
# We simulate 1000 observations of a beta-binomial distribution
# for the fixed parameters.
m <- 10
k <- 1000
p <- 0.7
phi <- 1.6
set.seed(5)
y <- rBB(k,m,p,phi)
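# (Illustrative check, not part of BBest: the empirical mean and variance
# should be close to the theoretical moments given in the details,
# E[y] = m*p and Var[y] = m*p*(1-p)*(1 + (m-1)*phi/(1+phi)).)
c(empirical = mean(y), theoretical = m*p)
c(empirical = var(y), theoretical = m*p*(1-p)*(1 + (m-1)*phi/(1+phi)))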
# Performing the estimation of the parameters
# Method of moments:
MM <- BBest(y,m)
MM
# Maximum likelihood approach
MLE <- BBest(y,m,method="MLE")
MLE
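The estimators described in the details can also be reproduced by hand and compared with the BBest output. The following sketch assumes the simulated y and m and the logdBB helper defined above, and uses optim only as an independent check of the maximum likelihood fit, not as the package's own algorithm.

# Method of moments, directly from the formulas in the details
E <- mean(y)
V <- var(y)
p.mm <- E/m
phi.mm <- (V - m*p.mm*(1 - p.mm))/(m^2*p.mm*(1 - p.mm) - V)
c(p.mm, phi.mm)  # compare with the MM fit above

# Maximum likelihood by numerical optimisation of the log-likelihood
nll <- function(par) -sum(logdBB(y, m, par[1], par[2]))
fit <- optim(c(p.mm, phi.mm), nll, method = "L-BFGS-B",
             lower = c(1e-6, 1e-6), upper = c(1 - 1e-6, Inf),
             hessian = TRUE)
fit$par  # compare with the MLE fit above
sqrt(diag(solve(fit$hessian)))  # standard errors from the observed Fisher information

# Summaries of the fitted objects (summary.BBest)
summary(MM)
summary(MLE)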