RelValAnalysis (version 1.0)

RelativeEntropy: Relative Entropy

Description

The function RelativeEntropy computes the relative entropy (Kullback-Leibler divergence) between two probability distributions.

Usage

RelativeEntropy(p, q, group.index = NULL)

Arguments

p
a probability vector (non-negative entries summing to one).

q
a probability vector of the same length as p.

group.index
an optional specification of groups of indices of p and q. If supplied, the relative entropy is decomposed over the groups as described in Details. Defaults to NULL (no decomposition).

Value

If group.index is not given, a single non-negative number (Inf if p is not absolutely continuous with respect to q). If group.index is given, a numeric vector with m + 2 components, where m is the number of groups (see Details).

Details

Relative entropy, also known as the Kullback-Leibler divergence and usually denoted by H(p|q), can be thought of as a measure of distance between two probability distributions. It is defined by

H(p|q) = sum_i p[i] * log(p[i] / q[i]),

with the convention that terms with p[i] == 0 contribute zero. It is not a metric: it is not symmetric and it does not satisfy the triangle inequality. If there is an index i where q[i] == 0 but p[i] > 0, the relative entropy is Inf; mathematically, this happens when p is not absolutely continuous with respect to q.

If group.index is provided, the relative entropy is decomposed using the chain rule stated in Lemma 3.1(i) of Pal and Wong (2013); see equation (23) there. In this case the output has m + 2 components, where m is the number of groups defined by group.index. The first component is the left-hand side of (23), the second is the first term on the right-hand side, and the remaining m components are the terms in the sum on the right-hand side.
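
As an illustration, both the divergence and its chain-rule decomposition can be computed directly from the definition. This is a minimal sketch, not the package's implementation; the helper kl and the two-group split below are chosen here for demonstration only.

kl <- function(p, q) {
  # relative entropy from the definition; terms with p[i] == 0 contribute 0
  sum(ifelse(p > 0, p * log(p / q), 0))
}

p <- c(0.3, 0.3, 0.4)
q <- c(0.5, 0.3, 0.2)
kl(p, q)                                      # total relative entropy H(p|q)

# chain rule over the two groups {1, 2} and {3}, mirroring equation (23)
groups <- list(c(1, 2), 3)
pG <- sapply(groups, function(g) sum(p[g]))   # aggregated group probabilities
qG <- sapply(groups, function(g) sum(q[g]))
between <- kl(pG, qG)                         # between-group term
within <- sapply(seq_along(groups), function(i) {
  g <- groups[[i]]
  pG[i] * kl(p[g] / pG[i], q[g] / qG[i])      # weighted within-group terms
})
between + sum(within)                         # equals kl(p, q)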

References

Pal, S. and T.-K. L. Wong (2013). Energy, entropy, and arbitrage. arXiv preprint arXiv:1308.5376.

See Also

ShannonEntropy

Examples

p <- c(0.3, 0.3, 0.4)
q <- c(0.5, 0.3, 0.2)

RelativeEntropy(p, q)
RelativeEntropy(q, p)  # relative entropy is not symmetric
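
# if q assigns zero mass where p does not, the divergence is infinite
p2 <- c(0.5, 0.5, 0)
q2 <- c(0.7, 0, 0.3)
RelativeEntropy(p2, q2)  # Inf

# grouped decomposition; the list-of-index-vectors format for group.index
# is an assumption here -- consult the package documentation
RelativeEntropy(p, q, group.index = list(c(1, 2), 3))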
