Regular Akaike's information criterion
(https://en.wikipedia.org/wiki/Akaike_information_criterion) (\(AIC\)) is
$$AIC = LL + 2p,$$
where \(LL\) is the minimized value of the negative log likelihood
(i.e., the negative of the maximized log likelihood) and \(p\) is the
number of coefficients estimated in the detection function. For
dfunc objects, \(AIC\) = obj$loglik + 2*length(coef(obj)).
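As a minimal sketch, this arithmetic can be written out directly. The sketch below is in Python rather than R, and the numeric values are hypothetical (not from a fitted dfunc object); it follows the document's sign convention, in which \(LL\) is the minimized negative log likelihood:

```python
def aic(nll, p):
    """AIC under the document's convention: nll is the minimized
    negative log likelihood, p the number of estimated coefficients."""
    return nll + 2 * p

# Hypothetical fit: negative log likelihood 250.0, 2 coefficients
print(aic(250.0, 2))  # → 254.0
```

In R, the same quantity for a fitted dfunc object is obj$loglik + 2*length(coef(obj)), as noted above.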
A correction
for small sample size, \(AIC_c\), is
$$AIC_c = LL + 2p + \frac{2p(p+1)}{n-p-1},$$
where \(n\) is the sample size (the number of detected groups in
distance analyses). By default, this function computes \(AIC_c\).
\(AIC_c\) converges quickly to \(AIC\) as \(n\) increases.
The Bayesian Information Criterion (BIC) is
$$BIC = LL + \log(n)p,$$
where \(LL\), \(p\), and \(n\) are as defined above.
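The three criteria differ only in their penalty terms, and the small-sample correction in \(AIC_c\) vanishes as \(n\) grows. A short Python sketch (hypothetical values, same sign convention as above) makes the convergence visible:

```python
from math import log

def aic(nll, p):
    # Document's convention: nll is the minimized negative log likelihood
    return nll + 2 * p

def aicc(nll, p, n):
    # AIC with the small-sample correction term 2p(p+1)/(n-p-1)
    return nll + 2 * p + (2 * p * (p + 1)) / (n - p - 1)

def bic(nll, p, n):
    # BIC replaces the constant penalty 2p with log(n)*p
    return nll + log(n) * p

nll, p = 250.0, 2
for n in (10, 100, 10_000):
    # Gap between AICc and AIC shrinks as n increases
    print(n, round(aicc(nll, p, n) - aic(nll, p), 4))
# n=10 → ≈1.7143, n=100 → ≈0.1237, n=10000 → ≈0.0012
```

Note that for \(n \le p + 1\) the correction term's denominator is zero or negative, so \(AIC_c\) is only meaningful when \(n > p + 1\).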