(Information) Entropy of a distribution
entropy(object, base = 2)
object: Distribution.
base: Base of the entropy logarithm; default = 2 (Shannon entropy).
Entropy with the given base, as a numeric.
$entropy(base = 2)
The entropy of a (discrete) distribution is defined by $$-\sum_x f_X(x) \log f_X(x)$$ where \(f_X\) is the pdf of distribution \(X\), with an integration analogue for continuous distributions. The base of the logarithm determines the type of entropy computed. By default base 2 is used, giving entropy in 'Shannons' or 'bits'.
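A minimal sketch of the discrete formula above, assuming a plain probability vector that sums to 1 (the helper name shannon_entropy is illustrative only, not part of the distr6 API):

shannon_entropy <- function(p, base = 2) {
  p <- p[p > 0]                       # treat 0 * log(0) as 0
  -sum(p * log(p, base = base))
}

shannon_entropy(c(0.5, 0.5))          # fair coin: 1 bit
shannon_entropy(rep(1/6, 6))          # fair die: log2(6), about 2.585 bits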
If an analytic expression isn't available, an error is returned. To impute a numerical expression, use the CoreStatistics decorator.
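A hedged usage sketch with distr6; the choice of Exponential and the string form of decorate() are assumptions, not prescribed by this page:

library(distr6)
d <- Exponential$new(rate = 2)
d$entropy(base = 2)                   # entropy in bits (Shannons), if available

# If $entropy() errors because no analytic form exists,
# attach the CoreStatistics decorator to impute it numerically:
d <- decorate(d, "CoreStatistics")
d$entropy(base = 2)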