
philentropy (version 0.5.0)

H: Shannon's Entropy \(H(X)\)

Description

Computes Shannon's entropy \(H(X) = - \sum P(X) * log2(P(X))\) for a given probability vector \(P(X)\).

Usage

H(x, unit = "log2")

Arguments

x

a numeric probability vector \(P(X)\) for which Shannon's Entropy \(H(X)\) shall be computed.

unit

a character string specifying the logarithm base used in the entropy computation: "log2" (binary logarithm, the default), "log" (natural logarithm), or "log10" (decimal logarithm).

Value

a numeric value representing Shannon's entropy, measured in bits for the default unit = "log2".
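As a sanity check on the returned value: the entropy of a uniform distribution over \(n\) outcomes is \(log2(n)\) bits. A minimal sketch (assuming the philentropy package is installed):

```r
library(philentropy)

# Uniform distribution over 8 outcomes: entropy = log2(8) = 3 bits
p <- rep(1/8, 8)
H(p)  # 3
```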

Details

This function is useful for quickly computing Shannon's entropy for any given probability vector.
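The result can be verified directly against the formula in the Description. A minimal sketch (assuming the philentropy package is installed):

```r
library(philentropy)

# A valid probability vector (non-negative entries summing to 1)
p <- 1:10 / sum(1:10)

# Entropy via H() and via the definition; the two values agree
H(p)
-sum(p * log2(p))
```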

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See Also

JE, CE, KL, JSD, gJSD

Examples

# Compute Shannon's entropy of a probability vector
H(1:10 / sum(1:10))
