Compute Shannon's Mutual Information via the identity \(I(X,Y) =
H(X) + H(Y) - H(X,Y)\) from a given joint-probability vector \(P(X,Y)\)
and the marginal probability vectors \(P(X)\) and \(P(Y)\).
Usage
MI(x, y, xy, unit = "log2")
Value
Shannon's Mutual Information in the unit specified by the unit argument (bits for the default unit = "log2").
Arguments
x
a numeric probability vector \(P(X)\).
y
a numeric probability vector \(P(Y)\).
xy
a numeric joint-probability vector \(P(X,Y)\).
unit
a character string specifying the logarithm unit that shall be used to compute the entropy terms \(H(X)\), \(H(Y)\), and \(H(X,Y)\).
Author
Hajk-Georg Drost
Details
This function is useful for quickly computing Shannon's Mutual Information
for any given joint-probability vector and corresponding marginal probability vectors.
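The identity itself can be verified in a few lines of base R. The following sketch (plain base R for illustration, not this function's implementation) computes each entropy term with log2 for two dependent binary variables:

# joint distribution P(X,Y) of two binary variables, flattened to a vector
xy <- c(0.4, 0.1, 0.1, 0.4)
x  <- c(0.5, 0.5)  # marginal P(X)
y  <- c(0.5, 0.5)  # marginal P(Y)

# Shannon entropy in bits; zero probabilities contribute 0 by convention
H <- function(p) { p <- p[p > 0]; -sum(p * log2(p)) }

# I(X,Y) = H(X) + H(Y) - H(X,Y)
H(x) + H(y) - H(xy)
# [1] 0.2780719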
References
Shannon, Claude E. 1948. "A Mathematical Theory of
Communication". Bell System Technical Journal 27 (3): 379-423.