entropy (version 1.3.1)

mi.plugin: Plug-In Estimator of Mutual Information and of the Chi-Squared Statistic of Independence

Description

mi.plugin computes the mutual information of two discrete random variables from the specified joint probability mass function.

chi2indep.plugin computes the chi-squared divergence of independence.

Usage

mi.plugin(freqs2d, unit=c("log", "log2", "log10"))
chi2indep.plugin(freqs2d, unit=c("log", "log2", "log10"))

Arguments

freqs2d

matrix of joint bin frequencies, interpreted (after normalization) as the joint probability mass function.

unit

the unit in which entropy is measured. The default "log" gives natural units (nats). For computing entropy in bits set unit="log2".
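
For example, a minimal sketch of the effect of the unit argument (using the joint pmf from the Examples below):

library("entropy")
freqs2d = rbind( c(0.2, 0.1, 0.15), c(0.1, 0.2, 0.25) )
mi.plugin(freqs2d)                # mutual information in nats (default)
mi.plugin(freqs2d, unit="log2")   # the same quantity expressed in bits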

Value

mi.plugin returns the mutual information.

chi2indep.plugin returns the chi-squared divergence of independence.

Details

The mutual information of two random variables \(X\) and \(Y\) is the Kullback-Leibler divergence between the joint density/probability mass function and the product of the marginal densities, i.e. the independence model.
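
Explicitly, for a joint probability mass function \(p(x, y)\) with marginals \(p(x)\) and \(p(y)\), \(MI(X, Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x) p(y)}\).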

It can also be defined in terms of entropy as \(MI = H(X) + H(Y) - H(X, Y)\).

Similarly, the chi-squared divergence of independence is the chi-squared divergence between the joint density and the product of the marginal densities. It is a second-order approximation of twice the mutual information.
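
In the same notation, \(\chi^2(X, Y) = \sum_{x, y} \frac{(p(x, y) - p(x) p(y))^2}{p(x) p(y)}\), which is approximately \(2 \, MI\) when the joint distribution is close to independence.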

See Also

mi.Dirichlet, mi.shrink, mi.empirical, KL.plugin, discretize2d.

Examples

# load entropy library 
library("entropy")

# joint distribution of two discrete variables
freqs2d = rbind( c(0.2, 0.1, 0.15), c(0.1, 0.2, 0.25) )  

# corresponding mutual information
mi.plugin(freqs2d)

# MI computed via entropy
H1 = entropy.plugin(rowSums(freqs2d))
H2 = entropy.plugin(colSums(freqs2d))
H12 = entropy.plugin(freqs2d)
H1+H2-H12  # same value as mi.plugin(freqs2d) above

# and corresponding (half) chi-squared divergence of independence,
# which approximates the mutual information computed above
0.5*chi2indep.plugin(freqs2d)
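
# A further sketch (not part of the original example): estimating MI
# from continuous data by binning it first with discretize2d (see
# "See Also"), assuming its documented signature
# discretize2d(x1, x2, numBins1, numBins2)
set.seed(42)
x1 = runif(1000)
x2 = x1 + rnorm(1000, sd=0.2)      # x2 is correlated with x1
y2d = discretize2d(x1, x2, numBins1=10, numBins2=10)
mi.empirical(y2d)                  # plug-in MI estimate from bin counts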
