This function computes Shannon's Joint-Entropy \(H(X,Y) = -\sum_{x \in X}\sum_{y \in Y} P(x,y)\,\log_2 P(x,y)\) based on a given joint-probability vector \(P(X,Y)\).
JE(x, unit = "log2")
x: a numeric joint-probability vector \(P(X,Y)\) for which Shannon's Joint-Entropy \(H(X,Y)\) shall be computed.
unit: a character string specifying the logarithm unit that shall be used in the underlying log computations.
JE returns a numeric value representing Shannon's Joint-Entropy \(H(X,Y)\); with the default unit = "log2", the result is given in bits.
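As a quick sanity check of the formula above, the joint entropy can also be computed by hand in base R. This is a minimal sketch, assuming JE comes from the philentropy package (which matches the signature shown on this page) and using a toy joint-probability vector:

library(philentropy)

# toy joint-probability vector P(X,Y); entries are non-negative and sum to 1
P_xy <- 1:10 / sum(1:10)

# Shannon's Joint-Entropy by hand: H(X,Y) = -sum P(x,y) * log2(P(x,y))
-sum(P_xy * log2(P_xy))

# should agree with JE() under the default unit "log2"
JE(P_xy)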
Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379–423.
See Also: H, CE, KL, JSD, gJSD, distance
# compute Shannon's Joint-Entropy of a toy joint-probability vector
JE(1:100/sum(1:100))
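The numeric value scales with the logarithm base. As a further sketch (assuming "log" selects the natural logarithm, as elsewhere in philentropy), the unit conversion can be verified by hand:

P <- 1:100 / sum(1:100)
JE(P, unit = "log2")           # joint entropy in bits
JE(P, unit = "log")            # joint entropy in nats (assumed unit spelling)
JE(P, unit = "log2") * log(2)  # bits converted to nats by hand; should match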