
synRNASeqNet (version 1.0)

entropyML: Maximum Likelihood Entropy Estimate

Description

Computes the maximum likelihood entropy estimate from cellCounts.
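
The maximum likelihood (plug-in) estimator works directly from the empirical cell frequencies: with p_i estimated as n_i / N, the entropy estimate is H = -sum_i p_i * log(p_i). As a rough illustrative sketch only (this is not the package's internal code, and mlEntropySketch is a made-up name), it could be written as:

# Illustrative plug-in (maximum likelihood) entropy estimate;
# not the actual implementation used by entropyML.
mlEntropySketch <- function(cellCounts, unit = "bit") {
  pHat <- cellCounts / sum(cellCounts)   # empirical cell probabilities
  pHat <- pHat[pHat > 0]                 # treat 0 * log(0) as 0
  H <- -sum(pHat * log(pHat))            # entropy in natural units
  switch(unit,
         bit = H / log(2),               # log2 units
         ban = H / log(10),              # log10 units
         nat = H)                        # natural units
}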

Usage

entropyML(cellCounts, unit = unit)

Arguments

cellCounts
an integer vector (or matrix) representing the number of times each particular count is obtained.
unit
the unit in which entropy is measured. One of "bit" (log2, default), "ban" (log10) or "nat" (natural units).

Value

The entropyML function returns the estimated entropy H(X) of the gene whose counts are tabulated in cellCounts (or the joint entropy H(X,Y) when cellCounts is a contingency table for a pair of genes).

References

Paninski L. (2003). Estimation of Entropy and Mutual Information. Neural Computation, vol. 15 no. 6 pp. 1191-1253.

Meyer P.E., Lafitte F., Bontempi G. (2008). minet: A R/Bioconductor Package for Inferring Large Transcriptional Networks Using Mutual Information. BMC Bioinformatics 9:461.

Antos A., Kontoyiannis I. (2001). Convergence properties of functional estimates for discrete distributions. Random Structures and Algorithms, vol. 19 pp. 163-193.

Strong S., Koberle R., de Ruyter van Steveninck R.R., Bialek W. (1998). Entropy and Information in Neural Spike Trains. Physical Review Letters, vol. 80 pp. 197-202.

See Also

entropyMM, entropyBayes, entropyCS, entropyShrink

Examples

# simulate count data
simData <- simulatedData(p = 50, n = 100, mu = 100, sigma = 0.25,
                         ppower = 0.73, noise = FALSE)
# tabulate how often each count value occurs for the first gene
cellCounts <- table(simData$counts[1, ])
# maximum likelihood entropy estimate, in natural units
eML <- entropyML(cellCounts, unit = "nat")
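
For the joint entropy of a pair of genes, cellCounts can be a two-way contingency table. Continuing the example above (illustrative only; the choice of the second gene is arbitrary):

# two-way table of counts for a pair of genes (joint distribution)
cellCountsXY <- table(simData$counts[1, ], simData$counts[2, ])
# maximum likelihood estimate of the joint entropy H(X,Y), in natural units
eMLxy <- entropyML(cellCountsXY, unit = "nat")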
