AIC.moc generates a table of \(\log(Likelihood)\), AIC, BIC, ICL-BIC and entropy values, along with the degrees of freedom, of multiple moc objects.

logLik returns an object of class logLik containing the \(\log(Likelihood)\), degrees of freedom and number of observations.

loglike.moc computes the \(\log(Likelihood)\) of a moc object evaluated at the supplied parameter values, contrary to logLik above which uses the estimated parameter values. It gives the option to re-evaluate the model, in which case the supplied parameter values are used as new starting values.

entropy is a generic method to compute the entropy of sets of probabilities.
The entropy of a set of \(k\) probabilities \((\pi_1,\ldots,\pi_k)\) is computed as \(entropy = - \sum_{i=1}^k \pi_i \log(\pi_i)\). It reaches its minimum of \(0\) when one of the \(\pi_i = 1\) (minimum uncertainty) and its maximum of \(\log(k)\) when all probabilities are equal, \(\pi_i = 1/k\) (maximum uncertainty). Standardized entropy is simply \(entropy/\log(k)\), which lies in the interval \([0,1]\). The total and mean mixture entropy are the weighted sum and mean of the mixture probability entropies over all subjects. These are computed for both the prior mixture probabilities (without knowledge of the response patterns) and the posterior mixture probabilities (with knowledge of the responses).
The default method entropy.default computes the entropy and standardized entropy of a set of probabilities. entropy.moc generates a table containing the weighted total and mean standardized entropy of the prior and posterior mixture probabilities of moc models.
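As a minimal illustration of the entropy arithmetic above (a sketch only, not the package's entropy.default, whose exact output may differ), the entropy and standardized entropy of a single probability vector can be computed as:

entropy_of <- function(p) {
  k <- length(p)
  p <- p[p > 0]                        # terms with p == 0 contribute 0
  ent <- -sum(p * log(p))              # entropy = -sum(pi * log(pi))
  c(entropy = ent, std.entropy = ent / log(k))
}
entropy_of(c(1, 0, 0))                 # minimum uncertainty: 0, 0
entropy_of(rep(1/3, 3))                # maximum uncertainty: log(3), 1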
# S3 method for moc
AIC(object, ..., k = 2)

# S3 method for moc
logLik(object, ...)

loglike.moc(object, parm = object$coef, evaluate = FALSE)

# S3 method for moc
entropy(object, ...)
object: Objects of class moc.

k: Can be any real number or the string "BIC".

parm: Parameter values at which the \(\log(Likelihood)\) is evaluated.

evaluate: Boolean indicating whether re-evaluation of the model is desired. If TRUE, parm will be used as new starting values.
AIC.moc returns a data frame with the relevant information for one or more moc objects.

The likelihood methods work on a single moc object: logLik.moc returns an object of class logLik with attributes df, nobs and moc.name, while loglike.moc returns a matrix containing the \(\log(Likelihood)\) and the corresponding estimated parameters, with attributes moc.name and parameters.

entropy.moc returns a data.frame with the number of groups and the total and mean standardized prior and posterior entropy of multiple moc objects. The percentage of reduction from prior to posterior entropy within a model is also supplied.
The value computed by AIC.moc is \(-2\cdot \log(Likelihood) + k\cdot npar\). Specific treatment is carried out for BIC (\(k = \log(nsubject\cdot nvar)\)), AIC (\(k = 2\)) and \(\log(Likelihood)\) (\(k = 0\)). Setting k = "BIC" will produce a table with BIC, the mixture posterior \(entropy = - \sum_{i,k} wt_i\, \hat{\tau}_{i,k}\, \log(\hat{\tau}_{i,k})\), which is an indicator of mixture separation, df and \(ICL-BIC = BIC + 2\cdot entropy\), which is an entropy-corrected BIC; see McLachlan, G. and Peel, D. (2000) and Biernacki, C. et al. (2000).
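The following sketch reproduces that arithmetic for a single model; loglik, npar, nsubject, nvar, post (the matrix of posterior mixture probabilities, one row per subject) and wt are placeholder names used for illustration, not components of a moc object:

icl_sketch <- function(loglik, npar, nsubject, nvar, post, wt = rep(1, nsubject)) {
  aic <- -2 * loglik + 2 * npar                      # k = 2
  bic <- -2 * loglik + log(nsubject * nvar) * npar   # k = log(nsubject * nvar)
  # posterior mixture entropy: -sum_{i,k} wt_i * tau_{i,k} * log(tau_{i,k})
  ent <- -sum(wt * ifelse(post > 0, post * log(post), 0))
  data.frame(AIC = aic, BIC = bic, entropy = ent, df = npar,
             ICL.BIC = bic + 2 * ent)
}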
McLachlan, G. and Peel, D. (2000) Finite Mixture Models, Wiley-Interscience, New York.

Biernacki, C., Celeux, G. and Govaert, G. (2000) Assessing a Mixture Model for Clustering with the Integrated Completed Likelihood, IEEE Transactions on Pattern Analysis and Machine Intelligence, 22, pp. 719--725.
moc, confint.moc, profiles.postCI, entropyplot.moc, npmle.gradient
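Hypothetical usage, assuming fit1 and fit2 are previously fitted moc objects (shown as comments since no fitted objects are created here):

# AIC(fit1, fit2)              # log(Likelihood), AIC and df of both models
# AIC(fit1, fit2, k = "BIC")   # BIC, posterior entropy, df and ICL-BIC
# logLik(fit1)                 # logLik object with df, nobs and moc.name attributes
# loglike.moc(fit1, parm = fit1$coef * 0.9, evaluate = TRUE)  # refit from perturbed values
# entropy(fit1, fit2)          # prior/posterior standardized entropy table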