numeric(1) The KL divergence is undefined if any state of a discrete variable
has probability 0. In that case, a small positive number epsilon is assigned as the probability
of each such state for computing the divergence, and the probabilities of the remaining states
are shrunk proportionally so that they still sum to 1.
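As a minimal illustration of this smoothing and of a symmetric KL divergence between two
discrete distributions, here is a sketch in plain base R (the helper name sym_kl is
hypothetical and not part of the package API; the package's sign convention, which marks
the direction of change relative to the baseline, is omitted for brevity):

    sym_kl <- function(p, q, epsilon = 1e-6) {
      smooth <- function(x) {
        zero <- x == 0
        x[zero] <- epsilon
        # shrink the non-zero states proportionally so the vector sums to 1
        x[!zero] <- x[!zero] * (1 - sum(x[zero])) / sum(x[!zero])
        x
      }
      p <- smooth(p)
      q <- smooth(q)
      # symmetric KL divergence: KL(p || q) + KL(q || p)
      sum(p * log(p / q)) + sum(q * log(q / p))
    }

    sym_kl(c(0.5, 0.5, 0), c(0.2, 0.3, 0.5))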
Value
a plot of the Bayesian network
a vector of the signed and symmetric Kullback-Leibler divergence for each node
Details
Network visualization of the node-specific differences between Bayesian networks that
share the same topology but differ in the evidence that has been absorbed and propagated.
The change in the marginal distribution of each node is measured by a signed and symmetric
Kullback-Leibler divergence. The sign indicates the direction of the change, with tree.1
taken as the baseline, and the magnitude of the change is reflected by the value. Nodes
that are d-separated from the evidence are colored white. This function requires the
Rgraphviz package.
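For context, a hedged usage sketch, assuming this page documents PlotCGBN from BayesNetBP
and following the liver QTL example in Yu, Moharil, and Blair (2020); the dataset, variable
names, and argument names are taken from that paper and may differ across package versions:

    library(BayesNetBP)
    data("liver")
    # build and propagate the baseline clique tree
    tree.init <- Initializer(dag = liver$dag, data = liver$data,
                             node.class = liver$node.class, propagate = TRUE)
    # absorb evidence to obtain a second tree with the same topology
    tree.post <- AbsorbEvidence(tree.init, vars = c("Nr1i3"), values = list(1))
    # tree.1 is the baseline; the returned vector holds the signed symmetric
    # KL divergence of each node's marginal
    klds <- PlotCGBN(tree.1 = tree.init, tree.2 = tree.post)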
References
Cowell, R. G. (2005). Local propagation in conditional Gaussian Bayesian networks.
Journal of Machine Learning Research, 6(Sep), 1517-1550.
Yu, H., Moharil, J., and Blair, R. H. (2020). BayesNetBP: An R package for probabilistic
reasoning in Bayesian networks. Journal of Statistical Software, 94(3), 1-31.
<doi:10.18637/jss.v094.i03>.