Estimates the Kullback-Leibler divergence, which measures how one probability distribution
diverges from a reference distribution (equal means are assumed).
Both inputs must be positive definite inverse covariance (precision) matrices for an accurate measurement.
This is a relative metric.
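Under the equal-means assumption, the divergence between two multivariate Gaussians reduces to a closed form in their precision matrices. A plausible parameterization (symbols here are illustrative: Theta_p for the base model's precision matrix, Theta_q for the test model's, k for the dimension; the package's exact convention may differ) is:

D_{KL}(p \,\|\, q) = \frac{1}{2}\left[\operatorname{tr}\!\left(\Theta_q \Theta_p^{-1}\right) - k + \ln\frac{\det \Theta_p}{\det \Theta_q}\right]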
Usage
kld(base, test)
Arguments
base
Full or base model
test
Reduced or testing model
Value
A value greater than or equal to 0.
Smaller values suggest that the probability distribution of the reduced model is close to that of the full model.
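As a minimal sketch of the computation described above (not the package's actual implementation; the function name and argument names are hypothetical), the equal-means Gaussian KL divergence can be computed from two positive definite precision matrices like so:

```python
import numpy as np

def kld_gaussian_equal_means(theta_base, theta_test):
    """Hypothetical sketch: KL divergence between two equal-mean Gaussians
    given their positive definite precision (inverse covariance) matrices.

    theta_base -- precision matrix of the full/base model
    theta_test -- precision matrix of the reduced/test model
    """
    theta_base = np.asarray(theta_base, dtype=float)
    theta_test = np.asarray(theta_test, dtype=float)
    k = theta_base.shape[0]
    # tr(Sigma_test^{-1} Sigma_base) = tr(theta_test @ theta_base^{-1})
    trace_term = np.trace(theta_test @ np.linalg.inv(theta_base))
    # ln(det Sigma_test / det Sigma_base) = ln(det theta_base) - ln(det theta_test);
    # slogdet is used for numerical stability
    _, logdet_base = np.linalg.slogdet(theta_base)
    _, logdet_test = np.linalg.slogdet(theta_test)
    return 0.5 * (trace_term - k + logdet_base - logdet_test)
```

Identical inputs give 0, and the result grows as the reduced model's distribution moves away from the full model's, matching the interpretation in the Value section.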
References
Kullback, S., & Leibler, R. A. (1951).
On information and sufficiency.
The Annals of Mathematical Statistics, 22, 79-86.