bnlearn implements key algorithms covering all stages of Bayesian
network modelling: data preprocessing, structure learning combining data and
expert/prior knowledge, parameter learning, and inference (including causal
inference via do-calculus). bnlearn aims to be a one-stop shop for
Bayesian networks in R, providing the tools needed for learning and working
with discrete Bayesian networks, Gaussian Bayesian networks and conditional
linear Gaussian Bayesian networks on real-world data. Incomplete data with
missing values are also supported. Furthermore, the modular nature of
bnlearn makes it easy to use in simulation studies.
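
As a brief sketch of this workflow, the following example uses the
learning.test data set shipped with bnlearn to learn a discrete network,
fit its parameters and answer an approximate query; the specific algorithm,
query, evidence and intervention are arbitrary choices made for illustration.

library(bnlearn)
data(learning.test)

## structure learning with a score-based algorithm (hill-climbing).
dag <- hc(learning.test)

## parameter learning: estimate the conditional probability tables.
fitted <- bn.fit(dag, learning.test, method = "mle")

## approximate inference: a conditional probability query answered by
## likelihood weighting.
cpquery(fitted, event = (A == "a"), evidence = list(B = "b"), method = "lw")

## causal inference: mutilated() performs the graph surgery underlying
## do-calculus, fixing B to "b" and removing the arcs pointing into it.
fitted.do <- mutilated(fitted, evidence = list(B = "b"))
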
Implemented structure learning algorithms include:

* Constraint-based algorithms, which use conditional independence tests to
  learn conditional independence constraints from data. The constraints are
  in turn used to learn the structure of the Bayesian network under the
  assumption that conditional independence implies graphical separation (so
  two variables that are independent cannot be connected by an arc).

* Score-based algorithms, which are general-purpose optimization algorithms
  that rank network structures with respect to a goodness-of-fit score.

* Hybrid algorithms, which combine aspects of both constraint-based and
  score-based algorithms: they use conditional independence tests (usually
  to reduce the search space) and network scores (to find the optimal
  network in the reduced space) at the same time (one algorithm from each
  family is sketched below).
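
As an illustrative sketch, one algorithm from each family can be run on the
learning.test data set as follows; the choice of algorithm, test and score
within each family is arbitrary.

library(bnlearn)
data(learning.test)

## constraint-based: the stable PC algorithm with a mutual information test.
dag.constraint <- pc.stable(learning.test, test = "mi")

## score-based: hill-climbing maximising the BIC score.
dag.score <- hc(learning.test, score = "bic")

## hybrid: the Max-Min Hill-Climbing algorithm.
dag.hybrid <- mmhc(learning.test)
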
For more details about structure learning algorithms see
structure learning; available conditional independence tests are
described in independence tests, and available network scores are
described in network scores. Specialized algorithms to learn the
structure of Bayesian network classifiers are described in
network classifiers. All algorithms support the use of whitelists and
blacklists to include and exclude arcs from the networks (see
whitelists and blacklists); and many have a parallel implementation built
on top of the parallel package. Bayesian network scores support the use
of graphical priors.
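
The following sketch illustrates these options; the arcs in the whitelist
and blacklist and the cluster size are arbitrary, and the variable selection
prior is just one of the available graphical priors.

library(bnlearn)
library(parallel)
data(learning.test)

## force the arc A -> B into the network and keep F -> A out of it.
wl <- data.frame(from = "A", to = "B")
bl <- data.frame(from = "F", to = "A")
dag <- hc(learning.test, whitelist = wl, blacklist = bl)

## network scores can incorporate a graphical prior, here the variable
## selection prior combined with the BDe score.
dag.prior <- hc(learning.test, score = "bde", prior = "vsp")

## constraint-based algorithms can spread their independence tests over a
## cluster created with the parallel package.
cl <- makeCluster(2)
dag.parallel <- gs(learning.test, cluster = cl)
stopCluster(cl)
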
Parameter learning approaches include both frequentist and Bayesian
estimators. Inference is implemented with approximate algorithms based on
particle filters, such as likelihood weighting, and covers conditional
probability queries, prediction and imputation.
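
A minimal sketch of both estimators, of prediction and of imputation
follows; the imaginary sample size (iss), the node being predicted and the
artificially removed values are illustrative choices.

library(bnlearn)
data(learning.test)
dag <- hc(learning.test)

## frequentist (maximum likelihood) and Bayesian (posterior) estimates of
## the conditional probability tables.
fit.mle <- bn.fit(dag, learning.test, method = "mle")
fit.bayes <- bn.fit(dag, learning.test, method = "bayes", iss = 10)

## prediction of a single node by likelihood weighting.
pred <- predict(fit.bayes, node = "C", data = learning.test,
                method = "bayes-lw")

## imputation: blank out a few values of B and fill them back in.
incomplete <- learning.test
incomplete[1:10, "B"] <- NA
completed <- impute(fit.bayes, incomplete, method = "bayes-lw")
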
Additional facilities include support for bootstrap and cross-validation;
advanced plotting capabilities implemented on top of Rgraphviz and
lattice; model averaging; random graph and random sample generation;
import/export functions to integrate bnlearn with software such as
Hugin and GeNIe; and an associated Bayesian network repository of gold-standard
networks at https://www.bnlearn.com/bnrepository/.
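
The sketch below touches a few of these facilities on the learning.test
data; the number of bootstrap replicates, the inclusion threshold and the
file name are arbitrary, and the plotting and export calls are commented out
because they require Rgraphviz and write access, respectively.

library(bnlearn)
data(learning.test)

## bootstrap-based arc strengths and model averaging.
strengths <- boot.strength(learning.test, R = 200, algorithm = "hc")
averaged <- averaged.network(strengths, threshold = 0.85)

## k-fold cross-validation of a learning algorithm.
bn.cv(learning.test, bn = "hc", k = 10)

## random graph and random sample generation.
rnd.dag <- random.graph(nodes = LETTERS[1:6])
fitted <- bn.fit(hc(learning.test), learning.test)
sim <- rbn(fitted, n = 500)

## plotting with Rgraphviz and export to Hugin's NET format.
## graphviz.plot(averaged)
## write.net("learning.test.net", fitted)
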
Use citation("bnlearn")
to find out how to cite bnlearn in
publications and other materials, and visit https://www.bnlearn.com/ for
more examples and code from publications using bnlearn.