bayesmeta (version 3.4)

kldiv: Kullback-Leibler divergence of two multivariate normal distributions.

Description

Compute the Kullback-Leibler divergence or the symmetrized KL-divergence of two multivariate normal distributions, based on their mean vectors and covariance matrices.

Usage

kldiv(mu1, mu2, sigma1, sigma2, symmetrized=FALSE)

Value

The divergence (\(D_{\mathrm{KL}} \geq 0 \) or \(D_{\mathrm{s}} \geq 0 \)).

Arguments

mu1, mu2

the two mean vectors.

sigma1, sigma2

the two covariance matrices.

symmetrized

logical; if TRUE, the symmetrized divergence will be returned.

Details

The Kullback-Leibler divergence (or relative entropy) of two probability distributions \(p\) and \(q\) is defined as the integral $$D_{\mathrm{KL}}(p\,||\,q) = \int_\Theta \log\Bigl(\frac{p(\theta)}{q(\theta)}\Bigr)\, p(\theta)\, \mathrm{d}\theta.$$

In the case of two normal distributions with mean and covariance parameters given by (\(\mu_1\), \(\Sigma_1\)) and (\(\mu_2\), \(\Sigma_2\)), respectively, this results in $$D_{\mathrm{KL}}\bigl(p(\theta|\mu_1,\Sigma_1)\,||\,p(\theta|\mu_2,\Sigma_2)\bigr) = \frac{1}{2}\biggl(\mathrm{tr}(\Sigma_2^{-1} \Sigma_1) + (\mu_1-\mu_2)^\prime \Sigma_2^{-1} (\mu_1-\mu_2) - d + \log\Bigl(\frac{\det(\Sigma_2)}{\det(\Sigma_1)}\Bigr)\biggr)$$ where \(d\) is the dimension.
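
For illustration, the closed-form expression may be evaluated directly in base R. The sketch below is a hypothetical helper written for this help page, not the package's internal implementation; its result should agree with kldiv() up to numerical accuracy.

# illustrative re-implementation of the closed-form normal KL divergence
# (assumption: for demonstration only, not the package's internal code)
kl_normal <- function(mu1, mu2, sigma1, sigma2) {
  d     <- length(mu1)                       # dimension
  diff  <- mu1 - mu2
  s2inv <- solve(sigma2)                     # Sigma_2^{-1}
  0.5 * (sum(diag(s2inv %*% sigma1)) +       # tr(Sigma_2^{-1} Sigma_1)
         drop(t(diff) %*% s2inv %*% diff) -  # quadratic (Mahalanobis-type) term
         d +
         log(det(sigma2) / det(sigma1)))     # log determinant ratio
}

# should match kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3))):
kl_normal(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)))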

The symmetrized divergence is then simply the sum of the two directed divergences, $$D_{\mathrm{s}}(p\,||\,q)=D_{\mathrm{KL}}(p\,||\,q)+D_{\mathrm{KL}}(q\,||\,p).$$
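
In terms of the function interface, this means that a call with symmetrized=TRUE should give the same result as summing the two directed divergences computed with the arguments swapped; for example:

kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)), symmetrized=TRUE)
# should equal the sum of the two directed divergences:
kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3))) +
  kldiv(mu1=c(1,1), mu2=c(0,0), sigma1=diag(c(3,3)), sigma2=diag(c(2,2)))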

References

S. Kullback. Information theory and statistics. John Wiley and Sons, New York, 1959.

C. Roever, T. Friede. Discrete approximation of a mixture distribution via restricted divergence. Journal of Computational and Graphical Statistics, 26(1):217-222, 2017. doi:10.1080/10618600.2016.1276840.

See Also

bmr.

Examples

# KL divergence of two bivariate normal distributions:
kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)))
