FNN (version 1.1.4.1)

KL.divergence: Kullback-Leibler Divergence

Description

Compute the Kullback-Leibler divergence between two samples, based on nearest neighbor distances.

Usage

KL.divergence(X, Y, k = 10, algorithm = c("kd_tree", "cover_tree", "brute"))
KLx.divergence(X, Y, k = 10, algorithm = "kd_tree")

Value

Returns the Kullback-Leibler divergence from X to Y: a vector of length k whose i-th element is the estimate based on i nearest neighbors.

Arguments

X

An input data matrix; its rows are treated as a sample from the first distribution, p.

Y

An input data matrix; its rows are treated as a sample from the second distribution, q.

k

The maximum number of nearest neighbors to search. The default is 10.

algorithm

Nearest neighbor search algorithm: one of "kd_tree", "cover_tree", or "brute".

Author

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com

Details

If p(x) and q(x) are two continuous probability density functions, then the Kullback-Leibler divergence of q from p is defined as \(E_p[\log \frac{p(x)}{q(x)}]\).

The KL.* versions return divergences from C code to R but the KLx.* versions do not.
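
As a quick check of the definition above (not part of FNN), the expectation \(E_p[\log \frac{p(x)}{q(x)}]\) can be evaluated by numerical integration in base R for two known densities; the exponential rates here match the Examples section.

    p <- function(x) dexp(x, rate = 0.2)   # density p(x)
    q <- function(x) dexp(x, rate = 0.4)   # density q(x)
    # E_p[log(p(x)/q(x))] by numerical integration over the support (0, Inf)
    integrate(function(x) p(x) * log(p(x) / q(x)), lower = 0, upper = Inf)
    # about 1 - log(2) = 0.307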

References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. In: Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS '07).

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. IEEE Transactions on Image Processing, 18(6), 1266–1283.

See Also

KL.dist

Examples

    library(FNN)

    set.seed(1000)
    X <- rexp(10000, rate = 0.2)
    Y <- rexp(10000, rate = 0.4)

    KL.divergence(X, Y, k = 5)
    # theoretical divergence: log(0.2/0.4) + (0.4/0.2) - 1 = 1 - log(2) = 0.307
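
The same kind of comparison can be sketched for matrix input, which the X and Y arguments describe. The sketch below assumes two bivariate normal samples with identity covariance; the quoted closed-form value is the standard Gaussian formula 0.5 * ||mu||^2, computed by hand rather than by FNN.

    # Sketch: multivariate (matrix) input, N(0, I) vs N((1, 1), I)
    set.seed(1000)
    X2 <- matrix(rnorm(20000), ncol = 2)            # 10000 x 2 sample from N(0, I)
    Y2 <- matrix(rnorm(20000, mean = 1), ncol = 2)  # 10000 x 2 sample from N((1, 1), I)
    KL.divergence(X2, Y2, k = 5)
    # closed-form divergence: 0.5 * ((1 - 0)^2 + (1 - 0)^2) = 1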
