Pairwise Kullback-Leibler divergence index (matrix)
Arguments
object
A matrix or data.frame object with >= 2 columns
eps
Probabilities below this threshold are replaced by this
threshold for numerical stability.
overlap
Logical; if TRUE, the KL divergence is not computed for pairs
where, at every point, at least one of the densities has a value
smaller than eps (see the sketch following this argument list).
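For illustration only, the roles of eps and overlap for a single pair of
densities evaluated on a common grid could be sketched as below; the helper
kl_pair and the exact clamping behavior are assumptions, not the package's
internal code.
kl_pair <- function(f, g, dx, eps = 1e-4, overlap = TRUE) {
  if (overlap && all(f < eps | g < eps))
    return(NA_real_)                # no point where both densities exceed eps; skip pair
  f <- pmax(f, eps)                 # clamp small densities for numerical stability
  g <- pmax(g, eps)
  sum(f * (log(f) - log(g))) * dx   # Riemann-sum approximation of the KL integral
}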
Author
Jeffrey S. Evans <jeffrey_evans@tnc.org>
Details
Calculates the Kullback-Leibler divergence (relative entropy) between
unweighted theoretical component distributions. For two distributions with
densities f() and g(), the divergence is calculated as:
  KL(f || g) = int f(x) (log f(x) - log g(x)) dx
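Using the hypothetical kl_pair sketch above, the integral can be approximated
numerically for two known densities (an illustration under the stated
assumptions, not the function's implementation):
x <- seq(-6, 6, length.out = 2000)                   # fine grid over the common support
kl_pair(dnorm(x), dt(x, df = 10), dx = x[2] - x[1])  # approximate KL(normal || t, df = 10)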
References
Kullback, S., and R.A. Leibler (1951) On information and sufficiency.
The Annals of Mathematical Statistics 22(1):79-86.
Examples
x <- seq(-3, 3, length=200)                 # evaluation grid
y <- cbind(n=dnorm(x), t=dt(x, df=10))      # normal and t (df = 10) densities
matplot(x, y, type='l')                     # plot the two densities
kl.divergence(y)

# extract value for last column
kl.divergence(y[,1:2])[3:3]