entropy.plugin computes the Shannon entropy H of a discrete random variable with the specified frequencies (probability mass function).
entropy.plugin(freqs, unit=c("log", "log2", "log10"))
freqs: frequencies (probability mass function).
unit: the unit in which entropy is measured. The default unit="log" reports entropy in nats (natural units); set unit="log2" for bits and unit="log10" for dits.
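As a sketch of the unit argument, the entropy of a fair coin is log(2) ≈ 0.693 nats but exactly 1 bit; the two readings differ only by a constant factor of log(2):

```r
library("entropy")

# fair coin: two outcomes with equal probability
freqs <- c(0.5, 0.5)

entropy.plugin(freqs)                # nats (default unit="log"): log(2) ~ 0.693
entropy.plugin(freqs, unit="log2")   # bits: exactly 1
```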
entropy.plugin returns the Shannon entropy.
The Shannon entropy of a discrete random variable is defined as \(H = -\sum_k p(k) \log( p(k) )\), where \(p\) is its probability mass function.
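The formula can be translated directly into a minimal hand-rolled sketch of the plug-in estimate (zero probabilities are dropped, following the convention that x*log(x) tends to 0 as x goes to 0); it should agree with entropy.plugin in nats:

```r
# probabilities from the example below; one entry is zero
p <- c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)

# drop zeros: they contribute nothing to the sum by convention
p <- p[p > 0]

# plug-in Shannon entropy in nats: H = -sum_k p(k) log(p(k))
H <- -sum(p * log(p))
H
```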
entropy, entropy.empirical, entropy.shrink, mi.plugin, KL.plugin, discretize.
# load entropy library
library("entropy")

# some frequencies
freqs = c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)

# and corresponding entropy
entropy.plugin(freqs)