multiinformation takes a dataset as input and computes the multiinformation (also called total correlation) among the random variables in the dataset. The value is returned in nats, using the entropy estimator specified by method.
multiinformation(X, method = "emp")
X: data.frame containing a set of random variables, where columns contain variables/features and rows contain outcomes/samples.
method: The name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink", "sg" (default: "emp") - see details. These estimators require discrete data values - see discretize.
multiinformation returns the multiinformation (also called total correlation) among the variables in the dataset (in nats).
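As a sanity check (a minimal sketch, not part of the package documentation), the total correlation can also be obtained as the sum of the marginal entropies minus the joint entropy, using the package's entropy function on the same discretized data as in the example below; the two values should agree, at least for the empirical estimator.

library(infotheo)
data(USArrests)
dat <- discretize(USArrests)
marginals <- sum(sapply(dat, entropy))   # sum of per-variable entropies (nats)
joint <- entropy(dat)                    # joint entropy of the whole dataset (nats)
marginals - joint                        # should match multiinformation(dat)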
"emp" : This estimator computes the entropy of the empirical probability distribution.
"mm" : This is the Miller-Madow asymptotic bias corrected empirical estimator.
"shrink" : This is a shrinkage estimate of the entropy of a Dirichlet probability distribution.
"sg" : This is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.
Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis, Université Libre de Bruxelles.
Studeny, M. and Vejnarova, J. (1998). The multiinformation function as a tool for measuring stochastic dependence. In Proceedings of the NATO Advanced Study Institute on Learning in graphical models.
condinformation, mutinformation, interinformation, natstobits
library(infotheo)
data(USArrests)
dat <- discretize(USArrests)   # discretize the continuous variables
M <- multiinformation(dat)     # multiinformation of the discretized data, in nats
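As a possible follow-up (not part of the original example), the four estimators described in the details can be compared on the same discretized data, and natstobits (listed among the related functions above) converts the result from nats to bits:

sapply(c("emp", "mm", "shrink", "sg"),
       function(m) multiinformation(dat, method = m))   # compare the estimators
natstobits(M)                                           # the same multiinformation, in bits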