NetworkToolbox (version 1.4.2)

kld: Kullback-Leibler Divergence

Description

Estimates the Kullback-Leibler divergence, which measures how one probability distribution diverges from a reference distribution (equivalent means are assumed). For an accurate measurement, both matrices must be positive definite inverse covariance matrices. This is a relative metric.
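
For two equal-mean multivariate Gaussian models described by precision (inverse covariance) matrices, the divergence has a closed form. The sketch below is a minimal illustration of that formula, assuming the first argument plays the role of the reference distribution; the package's internal computation may differ.

# Minimal sketch of the closed-form KL divergence between two
# equal-mean Gaussian models given their precision matrices.
# Assumes 'base' is the reference distribution; this is an
# illustration, not the package's implementation.
kld_sketch <- function(base, test) {
  k <- ncol(base)                               # number of variables
  trace_term <- sum(diag(test %*% solve(base))) # tr(test %*% base^-1)
  logdet_term <- log(det(base)) - log(det(test))
  0.5 * (trace_term - k + logdet_term)
}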

Usage

kld(base, test)

Arguments

base

Full or base model (a positive definite inverse covariance matrix)

test

Reduced or testing model (a positive definite inverse covariance matrix)

Value

A non-negative value. Smaller values suggest that the probability distribution of the reduced model is closer to that of the full model

References

Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22, 79-86.

Examples

library(NetworkToolbox)

# Precision (inverse covariance) matrix of the full model
A1 <- solve(cov(neoOpen))

# Sparse inverse covariance matrix estimated with LoGo
A2 <- LoGo(neoOpen)

# Kullback-Leibler divergence between the two models
kld_value <- kld(A1, A2)
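
As a sanity check on the Value described above, the divergence of a model from itself is zero, so passing the same matrix twice should return a value at (or numerically near) 0:

# The divergence of the full model from itself should be ~0
kld(A1, A1)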
