philentropy (version 0.8.0)

MI: Shannon's Mutual Information \(I(X,Y)\)

Description

Compute Shannon's Mutual Information based on the identity \(I(X,Y) = H(X) + H(Y) - H(X,Y)\) from a given joint-probability vector \(P(X,Y)\) and marginal probability vectors \(P(X)\) and \(P(Y)\).

Usage

MI(x, y, xy, unit = "log2")

Value

Shannon's Mutual Information in bit (when computed with the default unit = "log2").

Arguments

x

a numeric probability vector \(P(X)\).

y

a numeric probability vector \(P(Y)\).

xy

a numeric joint-probability vector \(P(X,Y)\).

unit

a character string specifying the logarithm unit (base) used in the computation. The default, unit = "log2", returns values in bit.

Author

Hajk-Georg Drost

Details

This function is useful for quickly computing Shannon's Mutual Information for any given joint-probability vector and corresponding marginal probability vectors.
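
For reference, the identity can be checked directly in base R. The helper shannon_H below is a hypothetical illustration for this sketch, not a philentropy function:

# Minimal base-R sketch of the identity I(X,Y) = H(X) + H(Y) - H(X,Y).
# shannon_H is a hypothetical helper, not part of philentropy.
shannon_H <- function(p) -sum(p[p > 0] * log2(p[p > 0]))  # entropy in bit

px  <- 1:10 / sum(1:10)          # marginal P(X)
py  <- 20:29 / sum(20:29)        # marginal P(Y)
pxy <- as.vector(outer(px, py))  # joint P(X,Y) under independence

# For independent X and Y, H(X,Y) = H(X) + H(Y), so I(X,Y) is 0
# (up to floating-point error).
shannon_H(px) + shannon_H(py) - shannon_H(pxy)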

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See Also

H, JE, CE

Examples

library(philentropy)

# MI from marginal vectors P(X), P(Y) and a joint-probability vector P(X,Y)
MI(x = 1:10 / sum(1:10), y = 20:29 / sum(20:29), xy = 1:10 / sum(1:10))
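
As a further sketch (assuming xy may also be the flattened full joint distribution of length length(x) * length(y)), marginals combined under independence via outer() should yield a mutual information of numerically zero:

px <- 1:10 / sum(1:10)
py <- 20:29 / sum(20:29)
# Joint P(X,Y) under independence; the result should be ~0.
MI(x = px, y = py, xy = as.vector(outer(px, py)))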
