entropy (version 1.3.2)

entropy.plugin: Plug-In Entropy Estimator

Description

entropy.plugin computes the Shannon entropy H of a discrete random variable from its probability mass function, given as a vector of frequencies.

Usage

entropy.plugin(freqs, unit=c("log", "log2", "log10"))

Value

entropy.plugin returns the Shannon entropy in the specified unit.

Arguments

freqs

frequencies (probability mass function).

unit

the unit in which entropy is measured. The default unit="log" returns entropy in nats (natural units); set unit="log2" to compute entropy in bits, or unit="log10" for base-10 units (dits).

Author

Korbinian Strimmer (https://strimmerlab.github.io).

Details

The Shannon entropy of a discrete random variable is defined as \(H = -\sum_k p(k) \log( p(k) )\), where \(p\) is its probability mass function. The sum runs over all outcomes k with \(p(k) > 0\), following the usual convention that \(0 \log 0 = 0\).
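As a sketch, this formula can be evaluated directly in base R (the `ifelse` guard applies the \(0 \log 0 = 0\) convention, so zero-frequency bins contribute nothing):

```r
# probability mass function (sums to 1); one zero bin included
p <- c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)

# evaluate H = -sum_k p(k) log p(k), treating 0 * log(0) as 0
H <- -sum(ifelse(p > 0, p * log(p), 0))
H  # about 1.6696 nats
```

This direct computation should agree with entropy.plugin(p) using the default unit="log".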

See Also

entropy, entropy.empirical, entropy.shrink, mi.plugin, KL.plugin, discretize.

Examples

# load entropy library 
library("entropy")

# some frequencies
freqs = c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)  

# and corresponding entropy
entropy.plugin(freqs)
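To obtain entropy in bits rather than nats, the same call can be made with unit="log2"; as a consistency sketch, this should equal the nats value divided by log(2):

```r
# assumes the entropy package is installed
library("entropy")

freqs <- c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)

# entropy in bits (base-2 logarithm)
entropy.plugin(freqs, unit="log2")

# conversion check: nats / log(2) gives bits
entropy.plugin(freqs) / log(2)
```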
