
parmigene (version 1.1.1)

knnmi: Parallel Mutual Information Estimation

Description

A function to perform a parallel estimation of the mutual information of vectors x and y using entropy estimates from K-nearest neighbor distances.

Usage

knnmi(x, y, k=3, noise=1e-12)

Arguments

x

a numeric vector.

y

a numeric vector of the same length as x.

k

the number of nearest neighbors considered when estimating the mutual information. It must be less than the number of elements of x.

noise

the magnitude of the random noise added to break ties.

Details

The function adds a small amount of random noise to the data in order to break ties due to limited numerical precision.

By default, the function uses all available cores. You can set the actual number of threads used to N by exporting the environment variable OMP_NUM_THREADS=N.
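
For example, a minimal sketch of limiting the computation to four threads. Exporting the variable in the shell before launching R is the route documented above; setting it from inside R with Sys.setenv is an assumption that only works if done before the OpenMP runtime is first initialized.

# Documented route: in the shell, before launching R
#   export OMP_NUM_THREADS=4
# Assumed alternative from within R (must happen before the first knnmi() call)
Sys.setenv(OMP_NUM_THREADS = "4")
library(parmigene)
set.seed(1)
knnmi(rnorm(200), rnorm(200), k = 3)  # should now use at most 4 threads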

References

Kraskov, A., Stögbauer, H., and Grassberger, P. (2004). Estimating mutual information. Phys. Rev. E, 69, 066138.

See Also

knnmi.cross

knnmi.all

Examples

# two independent standard normal samples
set.seed(42)
x <- rnorm(100)
y <- rnorm(100)
# estimate their mutual information using k = 5 nearest neighbors
knnmi(x, y, 5)
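
As a further illustration not in the original example, contrasting a dependent pair with the independent pair above shows the estimate rising with dependence; the exact values vary with the random draw.

z <- 2 * x + rnorm(100, sd = 0.1)  # z is a noisy function of x
knnmi(x, z, 5)                     # should be clearly larger than the estimate for x and y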
