bnlearn (5.0.1)
* updated C code not to use R C APIs scheduled for removal.
bnlearn (5.0)
* the "effective" argument of nparams() is now deprecated and will be removed
by the end of 2025.
* the cross-validation loss functions "pred-lw", "cor-lw" and "mse-lw" are
now deprecated in favour of "pred", "cor" and "mse" with optional
arguments predict = "bayes-lw"; they will be removed in 2025.
* the cross-validation loss functions "logl-g" and "logl-cg" are now
deprecated in favour of "logl"; they will be removed in 2025.
* the cross-validation loss functions "pred", "cor" and "mse" can now be
computed with exact inference using predict = "exact".
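For example, the revised cross-validation interface can be used as sketched
below; the loss.args element names are assumptions based on the entries above,
not a definitive reference:

```r
library(bnlearn)
data(learning.test)
# 10-fold cross-validation of hill-climbing, scoring the predictions for
# node "F"; predict = "bayes-lw" replaces the deprecated "pred-lw" label.
cv = bn.cv(learning.test, "hc", loss = "pred",
           loss.args = list(target = "F", predict = "bayes-lw"))
loss(cv)
```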
* completed the implementation of KL(), which now supports conditional
Gaussian networks in addition to discrete and Gaussian ones.
* implemented Shannon's entropy.
* conditional independence tests now accept optional arguments, in the same
way as network scores do.
* added a "custom-test" conditional independence test allowing user-provided
test statistics in the same way as "custom" allows user-provided network
scores.
* the custom score now has label "custom-score", instead of just "custom",
for clarity and to make it consistent with "custom-test".
* added a "params.threshold" argument to the hard EM methods in bn.fit(),
and renamed the log-likelihood threshold to "loglik.threshold".
* the log-likelihood stopping rule in hard EM now uses the log-likelihood of
the completed data, which works better in the presence of latent
variables and is more appropriate according to Koller & Friedman (thanks
Laura Azzimonti).
* coefficients(), sigma(), fitted() and residuals() return an error when
called on bn objects instead of failing silently and returning NULL.
* preserve and return the probabilities from predict(..., prob = TRUE) when
using the parallel package (thanks Alex Rudge).
* logLik() now returns an object of class "logLik" with the expected
attributes.
* added an identifiable() function to tell whether a bn.fit object contains NA
parameter values; and a singular() function to tell whether it is
singular (with 0-1 probability distributions, zero standard errors).
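A minimal sketch of how the two helpers are meant to be used, assuming they
take a bn.fit object and return a single flag as described above:

```r
library(bnlearn)
data(learning.test)
fitted = bn.fit(hc(learning.test), learning.test)
identifiable(fitted)  # flags NA parameter values
singular(fitted)      # flags degenerate (0-1 probability, zero-variance) parts
```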
* the entropy loss in bn.cv() is now estimated using the node-average
(log-)likelihood; it does not produce warnings for incomplete data and
may occasionally return +Inf instead of NA in some corner cases (for
instance, when the model learned from the training data has NA parameters
or conditional probabilities equal to zero).
bnlearn (4.9.3)
* fixed a buffer overflow detected by ASAN in the CRAN tests.
bnlearn (4.9.2)
* fixed a syntax error in a manpage to pass CRAN tests.
bnlearn (4.9.1)
* assorted fixes to the Rprintf() format strings to pass the CRAN tests.
* the default node shape in graphviz.plot(), strength.plot() and
graphviz.compare() is now "rectangle", which is more space-efficient for
typical node labels.
* graphviz.compare() now accepts bn.fit objects, converting them to the
corresponding bn objects to compare the respective network structures.
* fixed a segfault in ci.test(), triggered by setting the conditioning
variable set to a zero-column matrix (thanks Qingyuan Zheng).
bnlearn (4.9)
* as.prediction() is now deprecated and will be removed by the end of 2024.
* graphviz.plot(), strength.plot() and graphviz.compare() now have a
"fontsize" argument that controls the font size of the node labels.
* removed the rbn() method for bn objects.
* predict() and impute() can now use exact inference with method = "exact" for
discrete and Gaussian networks.
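As a sketch, exact prediction can be requested as follows, assuming the same
predict() interface as method = "bayes-lw":

```r
library(bnlearn)
data(gaussian.test)
fitted = bn.fit(hc(gaussian.test), gaussian.test)
# exact posterior predictions for node "F" given the observed variables.
predict(fitted, node = "F", data = head(gaussian.test), method = "exact")
```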
* the "custom" score now accepts incomplete data.
* it is now possible to use the "custom" score to implement custom Bayesian
scores in BF() and bf.strength().
* fixed the conditional probabilities computed by cpquery(), which now
disregards particles for which either the evidence or the event
expression evaluates to NA (thanks Simon Rauch).
* added a complete.graph() function to complement empty.graph().
* removed the "empty" method of random.graph(), use empty.graph() instead.
* structural.em() can now use exact inference in the expectation step with
impute = "exact".
* structural.em() can now be called from bn.boot(), boot.strength() and
bn.cv().
* updated as.bn.fit() to work with the latest gRain release.
* fixed segfault in tree.bayes() with illegal whitelists and blacklists.
* predict(method = "bayes-lw") and predict(method = "exact") now work even
when from = character(0).
* implemented hard EM in bn.fit() with method = "hard-em" (discrete BNs),
method = "hard-em-g" (Gaussian BNs) and method = "hard-em-cg"
(conditional Gaussian BNs).
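A minimal sketch for the discrete case; the fixed network structure below is
the standard one for the learning.test data:

```r
library(bnlearn)
data(learning.test)
# introduce some missing values, then fit the parameters with hard EM.
incomplete = learning.test
incomplete[sample(nrow(incomplete), 100), "A"] = NA
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
fitted = bn.fit(dag, incomplete, method = "hard-em")
```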
* predict() and impute() can use clusters from the parallel package with all
available methods.
* hard EM methods in bn.fit() can use clusters from the parallel package
like previously available parameter estimators.
* boot.strength() now shuffles the columns of the data by default, which seems
to broadly improve structural accuracy.
* constraint-based algorithms are now guaranteed to return a CPDAG; this was
not the case previously because shielded colliders were preserved along
with unshielded ones (thanks Ruben Camilo Wisskott).
* bn.fit() now works with network classifiers (thanks Riley Mulhern).
* logLik() has been re-implemented and now accepts incomplete data.
* tabu search now works with continuous data containing latent variables
(thanks David Purves).
* read.net() can now parse interval nodes (thanks Marco Valtorta).
* graphviz.chart() now handles text.col correctly even when it contains a
separate colour for each node.
* impute() now produces an error instead of returning data still containing
missing values (with "strict" set to TRUE, the default) or at least it
produces a warning (with "strict" set to FALSE).
* better sanitization of CPTs in custom.fit() (thanks Dave Costello).
* implemented the node-average (penalized) likelihood scores from Bodewes
and Scutari for discrete ("nal" and "pnal"), Gaussian ("nal-g" and
"pnal-g") and conditional Gaussian ("nal-cg" and "pnal-cg") BNs.
* predict() for bn.fit objects now accepts incomplete data, conditioning on
the observed values and averaging over the missing values in each
observation in the case of method = "bayes-lw" and method = "exact".
bnlearn (4.8.1)
* assorted fixes to the C code to pass the CRAN tests.
bnlearn (4.8)
* the rbn() method for bn objects is now deprecated and will be removed by the
end of 2023.
* removed choose.direction().
* implemented gbn2mvnorm(), which converts a Gaussian BN to its multivariate
normal global distribution, and mvnorm2gbn(), which does the opposite.
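A sketch of the conversion; the exact structure of the return value is an
assumption here, so check the manpage before relying on it:

```r
library(bnlearn)
data(gaussian.test)
fitted = bn.fit(hc(gaussian.test), gaussian.test)
# the global multivariate normal implied by the local linear regressions.
mvn = gbn2mvnorm(fitted)
str(mvn)
```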
* added the extended BIC from Foygel and Drton.
* the maximum likelihood estimators in bn.fit() now have distinct labels "mle"
(discrete BNs), "mle-g" (Gaussian BNs) and "mle-cg" (conditional Gaussian
BNs) to identify them without ambiguity.
* graphviz.chart() now supports Gaussian and conditional Gaussian BNs
(thanks Tom Waddell).
* the "draw.levels" argument of graphviz.chart() has been renamed to
"draw.labels".
* chow.liu(), aracne() and tree.bayes() now handle data with missing values.
* implemented the hierarchical Dirichlet parameter estimator for related
data sets from Azzimonti, Corani and Zaffalon.
* all unidentifiable parameters are now NAs, including those that were NaNs
before, for consistency across node types.
* structural.em() now returns descriptive error messages when the data
contain latent variables (thanks Bernard Liew).
* implemented the Kullback-Leibler divergence for discrete and Gaussian
networks.
* fixed spurious errors in bn.boot() resulting from changes in the
all.equal() method for functions in R 4.1.0 (thanks Fred Gruber).
* added a mean() method that averages bn.fit objects with the same network
structure, with optional weights.
* bn.boot(), bn.cv() and boot.strength() now handle data with missing values.
bnlearn (4.7)
* removed the "moral" argument from vstructs() and cpdag().
* removed the path() alias to path.exists().
* the "nodes" argument has been removed from averaged.network(); it was not
meaningfully used anywhere.
* the choose.direction() function is now deprecated, and it will also be
removed by the end of 2022.
* faster sanitization of bn objects (thanks David Quesada).
* fixed an overflow in the BGe score that produced NaN values (thanks David
Quesada).
* Date and POSIXct objects are now rejected as invalid inputs for functions
in bnlearn (thanks David Purves).
* export the function computing the significance threshold in
averaged.network() as inclusion.threshold() (thanks Noriaki Sato).
* fixed the sanitization of custom cutpoints in strength.plot().
* reimplemented discretize() in C for speed, Hartemink's discretization is
faster by a factor of at least 2x.
* discretize() now handles data with missing values.
* merged an implementation of the factorized NML and the quotient NML scores
from Tomi Silander.
bnlearn (4.6.1)
* Fixed out-of-bounds memory access in discretize() (thanks Brian Ripley).
bnlearn (4.6)
* removed support for parametric bootstrap in bn.boot().
* path() has been renamed path.exists(); path() will be kept as an alias
until 2021 when it will be removed to avoid clashing with BiocGenerics.
* the "moral" arguments of vstructs() and cpdag() are now deprecated and
will be removed in 2021.
* fixed graphviz.chart(), which called plot.new() unnecessarily and created
empty figures when a graphical device such as pdf() was already open.
* added a "custom" (decomposable) score that takes a user-specified R
function to compute the score of local distributions in score() and
structure learning algorithms (thanks Laura Azzimonti).
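A minimal sketch of a user-specified score; the fun(node, parents, data, args)
signature is an assumption based on this entry, so check ?score before relying
on it:

```r
library(bnlearn)
data(gaussian.test)
# a toy decomposable score: the negated residual sum of squares of the
# local regression of each node on its parents.
rss.score = function(node, parents, data, args) {
  if (length(parents) == 0)
    fitted = lm(data[, node] ~ 1)
  else
    fitted = lm(data[, node] ~ ., data = data[, parents, drop = FALSE])
  -sum(residuals(fitted)^2)
}
dag = hc(gaussian.test, score = "custom", fun = rss.score)
```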
* fixed spouses(), which now always returns a character vector.
* added an "including.evidence" argument to the as.bn.fit() method for grain
objects to carry over hard evidence in the conversion (thanks Rafal
Urbaniak).
* bn.fit() with missing data is now faster by 2x-3x.
* due to API changes, bnlearn now suggests gRain >= 1.3-3.
* fixed permutation tests, which incorrectly used a strict inequality when
computing the fraction of test statistics larger than that computed from
the original data (thanks David Purves).
* make cpdag() and vstructs() agree for both moral = FALSE and moral = TRUE
(thanks Bingling Wang).
* implemented colliders(), shielded.colliders() and unshielded.colliders();
vstructs() is now an alias of unshielded.colliders().
* added functions to import and export igraph objects.
* fixed pc.stable(), which failed on two-variables data sets.
* added utility functions set2blacklist(), add.node(), remove.node(),
rename.nodes().
* fixed h2pc(), which failed when encountering isolated nodes (thanks Kunal
Dang).
* better argument sanitization for threshold and cutpoints in strength.plot().
* fixed "newdata" argument sanitization for the pred-loglik-* scores.
* read.net() now disregards experience tables instead of generating an
error when importing NET files from Hugin (thanks Jirka Vomlel).
* fixed a bug in mmpc(), which returned wrong maximum/minimum p-values
(thanks Jireh Huang).
bnlearn (4.5)
* the "parametric" option for the "sim" argument of bn.boot() is now
deprecated; the argument will be removed in 2020.
* removed the relevant() function, and the "strict" and "optimized" arguments
of constraint-based structure learning algorithms.
* save arc strengths as weights in the graph object returned by
strength.plot() (thanks Fabio Gori).
* information about illegal arcs is now preserved in averaged.network(), so
that cpdag() works correctly on the returned network.
* loss function "pred" (classification error, predicted values from parents)
is now distinct from "pred-exact" (classification error, exact posterior
predicted values for classifiers); and it is now possible to use "pred" and
"pred-lw" in bn.cv() when the model is a BN classifier (thanks Kostas
Oikonomou).
* graphviz.compare() now returns a list containing the graph objects
corresponding to the networks provided as arguments (thanks William Raynor).
* the "from-first" method in graphviz.compare() now has a "show.first"
argument that controls whether the reference network is plotted at all
(thanks William Raynor).
* implemented the IAMB-FDR, HPC and H2PC structure learning algorithms.
* reimplemented the BGe score using the updated unbiased estimator from
Kuipers, Moffa and Heckerman (2014).
* fixed the test counter in constraint-based algorithms, which would overcount
in some cases.
* it is now possible to use any structure learning algorithm in bn.boot() and
bn.cv().
* fixed prediction from parents for conditional Gaussian nodes with no
continuous parents (thanks Harsha Kokel).
* it is now possible to use data with missing values in learn.mb(),
learn.nbr() and in all constraint-based structure learning algorithms.
* fixed tabu() in the presence of zero-variance continuous variables; the
search was not correctly initialized because the starting model is
singular (thanks Luise Gootjes-Dreesbach).
* implemented predictive log-likelihood scores for discrete, Gaussian and
conditional Gaussian networks.
* fixed an integer overflow in the nparams() method for bn.fit objects
(thanks Yujian Liu).
* make conditional sampling faster for large conditional probability tables
(thanks Yujian Liu).
* preserve structure learning information in bn.cv(), so that
custom.strength() can get directions right from the resulting set of
networks (thanks Xiang Liu).
* revised the preprocessing of whitelists and blacklists, and clarified the
documentation (thanks Michail Tsagris).
* added a "for.parents" argument to coef() and sigma() to make them return
the parameters associated with a specific configuration of the discrete
parents of a node in a bn.fit object (thanks Harsha Kokel).
* fixed segfault in predict(..., method = "bayes-lw") from data that contain
extra variables that are not in the network (thanks Oliver Perkins).
bnlearn (4.4)
* fixed pc.stable() v-structure detection in the presence of blacklisted arcs.
* warn about trying to cextend() networks that contain no information about
arc directions (and thus v-structures), such as those learned with
"undirected = TRUE" or those returned by skeleton().
* fixed a bug that made a number of functions incorrectly report that data
had variables with no observed values.
* fixed posterior imputation from a single observed variable (thanks Derek
Powell).
* added an argument "max.sx" to limit the maximum allowed size of the
conditioning sets in the conditional independence tests used in
constraint-based algorithms and in learn.{mb,nbr}().
* do not generate an error when it is impossible to compute a partial
correlation because the covariance matrix cannot be (pseudo)inverted;
generate a warning and return a zero partial correlation instead.
* added an as.lm() function to convert Gaussian networks and nodes to (lists
of) lm objects (thanks William Arnost).
* fixed the penalty terms of BIC and AIC, which did not count residual
standard errors when tallying the parameters of Gaussian and conditional
Gaussian nodes.
* cpdag() failed to set the directions of some compelled arcs when both
end-nodes have parents (thanks Topi Talvitie).
* custom.strength() now accepts bn.fit objects in addition to bn objects
and arc sets.
* vstructs() mistakenly handled moral = TRUE as if it were moral = FALSE
(thanks Christian Schuhegger).
* graphviz.plot() and strength.plot() now have a "render" argument that
controls whether a figure is produced (a graph object is always returned
from both functions).
* graphviz.plot(), strength.plot() and graphviz.compare() now have a "groups"
argument that specifies subsets of nodes that should be plotted close to
each other, layout permitting.
* fixed tree.bayes() for data frames with 2-3 variables, and chow.liu() as
well (thanks Kostas Oikonomou).
bnlearn (4.3)
* the "strict" and "optimized" arguments of constraint-based algorithms are
now deprecated and will be removed at the beginning of 2019.
* the relevant() function is now deprecated, and it will also be removed
at the beginning of 2019.
* improved and fixed a few bugs in the functions that import and export
bn and bn.fit objects to the graph package.
* fixed a bug in averaged.network(), which could result in inconsistent bn
objects when arcs were dropped to obtain an acyclic graph (thanks Shuonan
Chen).
* added a graphviz.chart() function to produce DAG-with-barchart-nodes plots.
* fixed the counting of the number of parameters of continuous and hybrid
networks, which did not take the residual standard errors into account
(thanks Jeffrey Hart).
* improved handling of singular models in impute().
* added an import function for pcAlgo objects from pcalg.
* fixed bug in the sanitization of conditional Gaussian networks (thanks
Kostas Oikonomou).
* added a loss() function to extract the estimated loss values from the
objects returned by bn.cv() (thanks Dejan Neskovic).
* it is now possible to use data with missing values in bn.fit() and
nparams().
* added a "replace.unidentifiable" argument to bn.fit(..., method = "mle"),
to replace parameter estimates that are NA/NaN with zeroes (for
regression coefficients) and uniform probabilities (in conditional
probability tables).
* added a bf.strength() function to compute arc strengths using Bayes
factors.
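For instance, a sketch using the default score for discrete data:

```r
library(bnlearn)
data(learning.test)
dag = hc(learning.test)
# one strength value per arc, computed from Bayes factors.
strengths = bf.strength(dag, learning.test)
head(strengths)
```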
* learn.{mb,nbr}() now work even if all nodes are blacklisted.
* assigning singular models from lm() to nodes in a bn.fit object will now
zapsmall() near-zero coefficients, standard errors and residuals to match
the estimates produced by bn.fit().
* bn.cv() now supports performing multiple runs with custom folds (different
for each run).
* improved sanitization in mutilated(), and updated its documentation.
* removed the bibTeX file with references, available at www.bnlearn.com.
* implemented the stable version of the PC algorithm.
* added a count.graph() function that implements a number of graph enumeration
results useful for studying graphical priors.
* fixed loss estimation in bn.cv() for non-extendable partially directed
graphs; errors are now produced instead of meaningless results
(thanks Derek Powell).
bnlearn (4.2)
* added a tuning parameter for the inclusion probability to the marginal
uniform graph prior.
* added a Bayesian Dirichlet score using Jeffrey's prior (from Joe Suzuki).
* allow fractional imaginary sample sizes for posterior scores.
* allow imaginary sample sizes in (0, 1] for discrete posterior scores,
to explore asymptotic results.
* set the default imaginary sample size for discrete networks to 1, following
recommendations from the literature.
* moral(), cpdag(), skeleton() and vstructs() now accept bn.fit objects in
addition to bn objects.
* fixed a segfault in cpdist(..., method = "lw") caused by all weights
being equal to NaN (thanks David Chen).
* changed the default value of the "optimized" argument to "FALSE" in
constraint-based algorithms.
* changed the arguments of mmhc() and rsmax2() to improve their flexibility
and to allow separate "optimized" values for the restrict and maximize
phases.
* fixed sanitization of fitted networks containing ordinal discrete
variables (thanks David Chen).
* improved argument sanitization in custom.fit() and model string functions.
* added a BF() function to compute Bayes factors.
* added a graphviz.compare() function to visually compare network structures.
* implemented the locally averaged Bayesian Dirichlet score.
* custom.strength() now accepts bn.kcv and bn.kcv.list objects and computes
arc strengths from the networks learned by bn.cv() in the context of
cross-validation.
* fixed multiple bugs in cextend() and cpdag() that could result in the
creation of additional v-structures.
* implemented the Structural EM algorithm in structural.em().
* fixed multiple bugs triggered by missing values in predict() (thanks
Oussama Bouldjedri).
* implemented an as.prediction() function that exports objects of class
bn.strength to the ROCR package (contributed by Robert Ness).
bnlearn (4.1)
* fixed memory corruption in dsep() (thanks Dominik Muller).
* added the marginal uniform prior.
* fixed the optimized score cache for the Castelo & Siebes and for the
marginal uniform priors, which were affected by several subtle bugs.
* bn.cv() now implements a "custom-folds" method that allows users to
specify which observations belong to each fold; folds are not
constrained to have the same size.
* fixed checks in the C code involving R objects' classes; they failed
when additional, optional classes were present (thanks Claudia Vitolo).
* fixed cpdag() handling of illegal arcs that are part of shielded
colliders (thanks Vladimir Manewitsch).
* removed misleading warning about conflicting v-structures from cpdag().
* rsmax2() and mmhc() now return whitelists and blacklists as they are
at the beginning of the restrict phase (thanks Vladimir Manewitsch).
* bn.fit() can now fit local distributions in parallel, and has been mostly
reimplemented in C for speed (thanks Claudia Vitolo).
* added an impute() function to impute missing values from a bn.fit object.
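A minimal usage sketch:

```r
library(bnlearn)
data(learning.test)
fitted = bn.fit(hc(learning.test), learning.test)
incomplete = learning.test
incomplete[1:20, "B"] = NA
# fill in the missing values of B from their posterior distribution.
completed = impute(fitted, incomplete)
```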
* fixed loss functions for data in which observations have to be dropped
for various nodes (thanks Manuel Gomez Olmedo).
* added an all.equal() method to compare bn.fit objects.
* added a "by.node" argument to score() for decomposable scores (thanks
Behjati Shahab).
* added warning about partially direct graphs in choose.direction() and
improved its debugging output (thanks Wei Kong).
* added spouses(), ancestors() and descendants().
* fixed a segfault in predict(..., method = "lw") with discrete BNs and
sparse CPTs that included NaNs.
bnlearn (4.0)
* fixed memory usage in aracne(), chow.liu() and tree.bayes() (thanks
Sunkyung Kim).
* rework memory management using calloc() and free() to avoid memory
leaks arising from R_alloc() and missing memory barriers.
* fixed a coefficients indexing bug in rbn() for conditional Gaussian
nodes (thanks Vladimir Manewitsch).
* added a mean() function to average bn.strength objects.
* fixed S4 method creation on package load on MacOS X (thanks Dietmar
Janetzko).
* fixed more corner cases in the Castelo & Siebes prior, and increased
numeric tolerance for prior probabilities.
* allow non-uniform priors for the "mbde" score (thanks Robert Ness)
and for "bdes".
* the "mode" attribute in bn.strength objects is now named "method".
* added posterior probabilities to the predictions for all discrete
networks (thanks ShangKun Deng).
* added Steck's optimal ISS estimator for the BDe(u) score.
* fixed the assignment of standard deviation in fitted CLG networks
(thanks Rahul Swaminathan).
* handle zero lambdas in the shrinkage Gaussian mutual information
(thanks Piet Jones).
* fixed segfault when computing posterior predictions from networks with
NaNs in their conditional probability tables (thanks Giulio Caravagna).
* fixed the assignment of LASSO models from the penalized package to
fitted Gaussian networks (thanks Anthony Gualandri).
* cpdag() now preserves the directions of arcs between continuous and
discrete nodes in conditional linear Gaussian networks, and optionally
also takes whitelists and blacklist into account (for any network).
* several checks are now in place to prevent the inclusion of illegal
arcs in conditional Gaussian networks.
* renamed the "ignore.cycles" argument to "check.cycles" in arcs<-() and
amat<-() for consistency with other functions such as set.arc().
* added an "undirected" argument to mmpc() and si.hiton.pc(), which can now
learn the CPDAG of the network instead of just the skeleton.
* added a "directed" argument to acyclic().
* removed unsupported argument "start" from learn.nbr().
* handle interventions correctly in boot.strength() when using the mixed
BDe score (thanks Petros Boutselis).
* "bdes" is now named "bds" (it is not score equivalent, so the "e" did
not belong).
bnlearn (3.9)
* fixed alpha threshold truncation bug in conditional independence tests
(thanks Janko Tackmann).
* massive cleanup of the C code handling conditional independence tests.
* fixed variance scaling bug for the mi-cg test (thanks Nicholas Mitsakakis).
* in the exact t-test for correlation and in Fisher's Z, assume independence
instead of returning an error when degrees of freedom are < 1.
* fixed segfault in cpdist(..., method = "lw") when the evidence has
probability zero.
* added loss functions based on MAP predictions in bn.cv().
* removed bn.moments() and bn.var(), they were basically unmaintained and had
numerical stability problems.
* added support for hold-out cross-validation in bn.cv().
* added plot() methods for comparing the results of different bn.cv() calls.
* permutation tests should return a p-value of 1 when one of the two
variables being tested is constant (thanks Maxime Gasse).
* improved handling of zero prior probabilities for arcs in the Castelo &
Siebes prior, so that hc() and tabu() do not get stuck (thanks Jim Metz).
* added an "effective" argument to compute the effective degrees of freedom
of the network, estimated with the number of non-zero free parameters.
* fixed optional argument handling in rsmax2().
* fixed more corner cases related to singular models in
cpdist(..., method = "lw") and predict(..., method = "bayes-lw").
* fixed Pearson's X^2 test, in which zero cells may have been dropped too
often in sparse contingency tables.
* fixed floating point rounding issues in the shrinkage estimator for the
Gaussian mutual information.
bnlearn (3.8.1)
* fixed CPT import in read.net().
* fixed penfit objects import from penalized (thanks John Noble).
* fixed memory allocation corner case in BDe.
bnlearn (3.8)
* reorder CPT dimensions as needed in custom.fit() (thanks Zheng Zhu).
* fixed two uninitialized-memory bugs found by valgrind, one in
predict() and one in random.graph().
* fixed wrong check for cluster objects (thanks Vladimir Manewitsch).
* fixed the description of the alternative hypothesis for the
Jonckheere-Terpstra test.
* allow undirected cycles in whitelists for structure learning algorithms
and let the algorithm learn arc directions (thanks Vladimir Manewitsch).
* include sanitized whitelists (as opposed to those provided by the user)
in bn.fit objects.
* removed predict() methods for single-node objects, use the method for
bn.fit objects instead.
* various fixes in the monolithic C test functions.
* fixed indexing bug in compare() (thanks Vladimir Manewitsch).
* fixed false positives in cycle detection when adding edges to a graph
(thanks Vladimir Manewitsch).
* fixed prior handling in predict() for naive Bayes and TAN classifiers
(thanks Vinay Bhat).
* added configs() to construct configurations of discrete variables.
* added sigma() to extract standard errors from bn.fit objects.
bnlearn (3.7.1)
* small changes to make CRAN checks happy.
bnlearn (3.7)
* fixed the default setting for the number of particles in cpquery()
(thanks Nishanth Upadhyaya).
* reimplemented common test patterns in monolithic C functions to speed
up constraint-based algorithms.
* added support for conditional linear Gaussian (CLG) networks.
* fixed several recursion bugs in choose.direction().
* make read.{bif,dsc,net}() consistent with the `$<-` method for bn.fit
objects (thanks Felix Rios).
* support empty networks in read.{bif,dsc,net}().
* fixed bug in hc(), triggered when using both random restarts and the
maxp argument (thanks Irene Kaplow).
* correctly initialize the Castelo & Siebes prior (thanks Irene Kaplow).
* change the prior distribution for the training variable in classifiers
from the uniform prior to the fitted distribution in the
bn.fit.{naive,tan} object, for consistency with gRain and e1071 (thanks
Bojan Mihaljevic).
* note AIC and BIC scaling in the documentation (thanks Thomas Lefevre).
* note limitations of {white,black}lists in tree.bayes() (thanks Bojan
Mihaljevic).
* better input sanitization in custom.fit() and bn.fit<-().
* fixed .Call stack imbalance in random restarts (thanks James Jensen).
* note limitations of predict()ing from bn objects (thanks Florian Sieck).
bnlearn (3.6)
* support rectangular nodes in {graphviz,strength}.plot().
* fixed bug in hc(), random restarts occasionally introduced cycles in
the graph (thanks Boris Freydin).
* handle ordinal networks in as.grain(), treating variables as categorical
(thanks Yannis Haralambous).
* discretize() returns unordered factors for backward compatibility.
* added write.dot() to export network structures as DOT files.
* added mutual information and X^2 tests with adjusted degrees of freedom.
* default vstructs() and cpdag() to moral = FALSE (thanks Jean-Baptiste
Denis).
* implemented posterior predictions in predict() using likelihood weighting.
* prevent silent reuse of AIC penalization coefficient when computing BIC
and vice versa (thanks María Luisa Matey).
* added a "bn.cpdist" class and a "method" attribute to the random data
generated by cpdist().
* attach the weights to the return value of cpdist(..., method = "lw").
* changed the default number of simulations in cp{query, dist}().
* support interval and multiple-valued evidence for likelihood weighting
in cp{query,dist}().
* implemented dedup() to pre-process continuous data.
* fixed a scalability bug in blacklist sanitization (thanks Dong Yeon Cho).
* fixed permutation test support in relevant().
* reimplemented the conditional.test() backend completely in C for
speed; it is now called indep.test().
bnlearn (3.5)
* fixed (again) function name collisions with the graph packages
(thanks Carsten Krueger).
* fixed some variable indexing issues in likelihood weighting.
* removed bootstrap support from arc.strength(), use boot.strength()
instead.
* added set.edge() and drop.edge() to work with undirected arcs.
* boot.strength() now has a parallelized implementation.
* added support for non-uniform graph priors (Bayesian variable
selection, Castelo & Siebes).
* added a threshold for the maximum number of parents in hc() and tabu().
* changed the default value of "moral" from FALSE to TRUE in cpdag()
and vstructs() to ensure sensible results in model averaging.
* added more sanity checks in cp{query,dist}() expression parsing
(thanks Ofer Mendelevitch).
* added 'nodes' and 'by.sample' arguments to logLik() for bn.fit objects.
* support {naive,tree}.bayes() in bn.cv() (thanks Xin Zhou).
* fixed predict() for ordinal networks (thanks Vitalie Spinu).
* fixed zero variance handling in unconditional Jonckheere-Terpstra
tests due to empty rows/columns (thanks Vitalie Spinu).
* in bn.cv(), the default loss for classifiers is now classification
error.
* added a nodes<-() function to re-label nodes in bn and bn.fit objects
(based on a proof of concept by Vitalie Spinu).
* replaced all calls to LENGTH() with length() in C code (thanks Brian
Ripley).
* default to an improper flat prior in predict() for classifiers for
consistency (thanks Xin Zhou).
* suggest the parallel package instead of snow (which still works fine).
bnlearn (3.4)
* move the test counter into bnlearn's namespace.
* include Tsamardinos' optimizations in mmpc(..., optimized = FALSE),
but not backtracking, to make it comparable with other learning
algorithms.
* check whether the residuals and the fitted values are present
before trying to plot a bn.fit{,.gnode} object.
* fixed two integer overflows in factors' levels and degrees of
freedom in large networks.
* added {compelled,reversible}.arcs().
* added the MSE and predictive correlation loss functions to bn.cv().
* use the unbiased estimate of residual variance to compute the
standard error in bn.fit(..., method = "mle") (thanks
Jean-Baptiste Denis).
* revised optimizations in constraint-based algorithms, removing
most false positives by sacrificing speed.
* fixed warning in cp{dist,query}().
* added support for ordered factors.
* implemented the Jonckheere-Terpstra test to support ordered
factors in constraint-based structure learning.
* added a plot() method for bn.strength objects containing
bootstrapped confidence estimates; it prints their ECDF and
the estimated significance threshold.
* fixed dimension reduction in cpdist().
* reimplemented Gaussian rbn() in C; it is now twice as fast.
* improve precision and robustness of (partial) correlations.
* remove the old network scripts for networks that are now available
from www.bnlearn.com/bnrepository.
* implemented likelihood weighting in cp{dist,query}().
bnlearn (3.3)
* fixed cpdag() and cextend(), which returned an error about
the input graph being cyclic when it included the CPDAG of
a shielded collider (thanks Jean-Baptiste Denis).
* do not generate observations from redundant variables (those
not in the upper closure of event and evidence) in cpdist()
and cpquery().
* added Pena's relevant() function for node identification.
* make custom.fit() robust against floating point errors
(thanks Jean-Baptiste Denis).
* check that v-structures do not introduce directed cycles in the
graph when applying them (thanks Jean-Baptiste Denis).
* fixed a buffer overflow in cextend() (thanks Jean-Baptiste
Denis).
* added a "strict" argument to cextend().
* removed Depends on the graph package, which is in Suggests
once more.
* prefer the parallel package to snow, if it is available.
* replace NaNs in bn.fit objects with uniform conditional
probabilities when calling as.grain(), with a warning
instead of an error.
* remove reserved characters from levels in write.{dsc,bif,net}().
* fix the Gaussian mutual information test (thanks Alex Lenkoski).
bnlearn (3.2)
* fixed an outstanding typo affecting the sequential Monte Carlo
implementation of Pearson's X^2 (thanks Maxime Gasse).
* switch from Margaritis' set of rules to the more standard
Meek/Spirtes set of rules, which are implemented in cpdag().
The networks returned by constraint-based algorithms are now
guaranteed to be CPDAGs, which was not necessarily the case
before.
* semiparametric tests now default to 100 permutations, not 5000.
* make a local copy of rcont2() to make bnlearn compatible with
both older and newer R versions.
bnlearn (3.1)
* fixed all.equal(), which did not work as expected on networks
that were identical save for the order of nodes or arcs.
* added a "moral" argument to cpdag() and vstructs() to make
those functions follow the different definitions of v-structure.
* added support for graphs with 1 and 2 nodes.
* fixed cpquery() handling of TRUE (this time for real).
* handle more corner cases in dsep().
* added a BIC method for bn and bn.fit objects.
* added the semiparametric tests from Tsamardinos & Borboudakis
(thanks Maxime Gasse).
* added posterior probabilities to the predictions for
{naive,tree}.bayes() models.
* fixed buffer overflow in rbn() for discrete data.
bnlearn (3.0)
* added dsep() to test d-separation.
* implemented Tree-Augmented Naive Bayes (TAN) in tree.bayes().
* implemented Semi-Interleaved HITON-PC in si.hiton.pc().
* improved parsing in read.{bif,dsc,net}().
* fixed an indexing error in Gaussian permutation tests.
* added a "textCol" highlight option to graphviz.plot().
* added {incoming,outgoing,incident}.arcs().
* fixed two hard-to-hit bugs in TABU search (thanks Maxime Gasse).
* added custom.fit() for expert parameter specification.
* added support for Markov blanket preseeding in learn.mb().
* assume independence instead of returning an error when a
partial correlation test fails due to errors in the
computation of the pseudoinverse of the covariance matrix.
* fixed an incorrect optimization in the backward phase of mmpc().
* fixed a floating point rounding problem in the mutual
information tests (64bit OSes only, thanks Maxime Gasse).
* fixed cp{query,dist}() handling of TRUE (thanks Maxime Gasse).
* added learn.nbr() to complement learn.mb().
bnlearn (2.9)
* integrate with the graph package and provide import/export
functions for the graphNEL and graphAM classes.
* removed bn.var.test() and the "aict" test.
* fixed wrong ISS handling in the BDe score.
* fixed a buffer overflow in the BGe score.
* added subgraph(), tiers2blacklist(), and cextend().
* fixed a bug in boot.strength(), which failed with a spurious
error when a PDAG was learned.
* support interval discretization as the initial step in
Hartemink's discretization.
* fixed blacklist handling in chow.liu().
* fixed choose.direction() and arc.strength(); both require a
.test.counter and now create it as needed.
* added as.bn() and as.bn.fit() for "grain" objects (from the
gRain package) and as.grain() for "bn.fit" objects.
* fixed an infinite loop in choose.direction().
* make choose.direction(..., criterion = "bootstrap") work again.
* added an 'every' argument to random.graph() for the 'ic-dag'
and 'melancon' algorithms.
* shortened the optional arguments for random.graph(...,
method = "hartemink") to "idisc" and "ibreaks".
bnlearn (2.8)
* switched "cpdag" to TRUE in {boot,custom}.strength().
* added a "weights" argument to custom.strength().
* implemented the modified BDe score handling mixtures of
experimental and observational data (mbde).
* reimplemented the BGe score for speed (about 20x faster).
* fixed a buffer overflow in predict() for discrete networks.
* fixed sanitization in predict() for bn.fit.{d,g}node objects.
* handle partially directed graphs in bn.cv() for log-likelihood
loss (both discrete and Gaussian).
bnlearn (2.7)
* make .onLoad() more robust, so that it passes "R CMD check"
even with a broken Graphviz installation.
* reduced memory usage in graphviz.plot(), now using arc lists
instead of adjacency matrices.
* added tie breaking in prediction.
* allow inter.iamb() to break infinite loops instead of returning
an error.
* fixed a buffer overflow in discrete Monte Carlo tests.
* added sequential Monte Carlo permutation tests.
* improved performance and error handling in Gaussian Monte
Carlo tests.
bnlearn (2.6)
* allow discrete data in Hartemink's discretization algorithm.
* implemented {read,write}.bif() to import/export BIF files.
* implemented {read,write}.dsc() to import/export DSC files.
* implemented {read,write}.net() to import/export NET files.
* completely reimplemented compare() to return useful metrics;
previously it was just a slower version of all.equal().
* implemented model averaging with significance thresholding
in averaged.network().
* use new significance thresholding in {boot,custom}.strength()
and arc.strength(criterion = "bootstrap").
* export predicted values in bn.cv() when using classification
error.
* fixed an integer overflow in mutual information tests.
bnlearn (2.5)
* reimplemented rbn.discrete() in C both for speed and to
get CPT indexing right this time.
* added bn.net() to complement bn.fit().
* changed the default value of the imaginary sample size to
10, which is a de facto standard in the literature.
* implemented the ARACNE and Chow-Liu learning algorithms.
* improved robustness of correlation estimation.
* added a "cpdag" option to boot.strength().
* fixed a bug in discretize().
* improved sanitization in graphviz.plot() and strength.plot().
* added Hamming distance.
bnlearn (2.4)
* reimplemented naive Bayes prediction in C for speed.
* added some debugging output to predict() methods.
* fixed printing of fitted Gaussian BNs.
* fixed stack imbalance in Gaussian Monte Carlo tests.
* implemented some discretization methods in discretize().
* added custom.strength() for arbitrary sets of networks.
* fixed strength.plot() threshold for significant arcs.
bnlearn (2.3)
* added cpdist() to generate observations from arbitrary
conditional probability distributions.
* added a simple naive.bayes() implementation for discrete
networks, complete with a predict() implementation using
maximum posterior probability.
* added the shrinkage test for the Gaussian mutual information.
* added ntests(), in.degree(), out.degree(), degree(),
whitelist() and blacklist().
* added support for the snow package in bn.boot(), bn.cv(),
cpquery() and cpdist().
* fixed integer overflow in the computation of the number of
parameters of discrete networks.
* fixed errors in the labels of Gaussian scores.
bnlearn (2.2)
* fixed a bug in moral(), which returned duplicated arcs when
shielded parents were present in the graph.
* implemented all.equal() for bn objects.
* added workaround for plotting empty graphs with graphviz.plot(),
which previously generated an error in Rgraphviz.
* added a CITATION file.
* added the narcs() function.
* print.bn() now supports small(er) line widths.
* added support for "bn.fit" objects in graphviz.plot().
* added support for changing the line type of arcs in
graphviz.plot().
* added the learn.mb() function to learn the Markov blanket of
a single variable.
* fixed calls to UseMethod(), which were not always working
correctly because of the changed parameter matching.
* fixed an off-by-one error in the prediction for discrete
root nodes.
bnlearn (2.1)
* optimized data frame subsetting and parents' configurations
construction in conditional.test() and score.delta().
* fixed and improved the performance of rbn().
* fixed wrong penalization coefficient for Gaussian AIC (was
computing a Gaussian BIC instead).
* added cpquery() to perform conditional probability queries
via Logic (Rejection) Sampling.
* added bn.cv() to perform k-fold cross-validation, with
expected log-likelihood and classification error as
loss functions.
* added predict(), logLik() and AIC() methods for bn.fit
objects.
* renamed bnboot() to bn.boot() for consistency with bn.cv()
and bn.fit().
bnlearn (2.0)
* added the shd() distance.
* renamed dag2ug() to skeleton(), which is more intuitive.
* added support for "bn.fit" objects in rbn().
* added vstructs() and moral().
* added the coronary data set.
* improved partial correlation resilience to floating point
errors when dealing with ill-behaved covariance matrices.
* miscellaneous (small) optimizations in both R and C code.
bnlearn (1.9)
* added support for "bn.fit" objects in nodes(), nbr(),
parents(), children(), root.nodes(), leaf.nodes(),
arcs(), directed.arcs(), undirected.arcs(), amat(),
nparams(), mb(), path(), directed(), acyclic(),
node.ordering().
* fixed bug in hybrid and score-based learning algorithms,
which did not handle blacklists correctly.
bnlearn (1.8)
* removed the fast mutual information test in favour of
the equivalent shrinkage test, which uses a more
systematic approach.
* fixed fast.iamb(), which should not have truncated
exact and Monte Carlo tests.
* added the HailFinder and Insurance data sets.
* updated the Grow-Shrink implementation according to
newer (and much clearer) literature from Margaritis.
* rewritten more of the configuration() function in C,
resulting in dramatic (2x to 3x) speedups for large
data sets.
* implemented tabu search.
* removed rshc() in favour of rsmax2(), a general two-stage
restricted maximization hybrid learning algorithm.
* reimplemented cpdag() in C, with an eye towards a future
integration with constraint-based algorithms.
* fixed a bug in coef() for discrete bn.fit objects.
* implemented Melancon's uniform probability random DAG
generation algorithm.
bnlearn (1.7)
* big clean-up of C code, with some small optimizations.
* fixed bug in the handling of upper triangular matrices
(UPTRI3 macro in C code).
* added the dag2ug() and pdag2dag() functions.
* fixed a bug in bn.fit(), now it really works even for
discrete data.
* added bn.moments(), bn.var() and bn.var.test() for
basic probabilistic modelling of network structures.
bnlearn (1.6)
* implemented the mmhc() algorithm and its generic
template rshc().
* rewritten both the optimized and the standard implementations
of hc() in C; both are much faster than before.
* various fixes to documentation and bibtex references.
* revised the functions implementing the second half
of the constraint-based algorithm.
* improved parameter sanitization in "amat<-"().
* fixed the functions that set arcs' direction in
constraint-based algorithms.
bnlearn (1.5)
* improved parameter sanitization in the "<-"()
functions and modelstring().
* added support for bootstrap inference with bnboot(),
boot.strength(), arc.strength(, criterion = "bootstrap")
and choose.direction(, criterion = "bootstrap").
* fixed a bug in acyclic() causing false negatives.
* added bn.fit() for estimating the parameters of a Bayesian
network conditional on its structure.
* mapped some S3 methods (print, fitted, fitted.values,
residuals, resid, coef, coefficients) to objects of
class "bn.fit", "bn.fit.gnode" and "bn.fit.dnode".
* added some plots for the fitted models based on the
lattice package.
* implemented AIC and BIC for continuous data, and
removed the likelihood score.
* various optimizations to C code.
* thorough documentation update.
* fixed an infinite loop bug in inter.iamb().
bnlearn (1.4)
* exported the "B" parameter to specify the number of
permutations to be done in a permutation test.
* removed the "direction" parameter from constraint-based
learning algorithms, as it was non-standard,
misnamed and had various reported problems.
* removed the duplicate "dir" label for the BDe score.
* added support for Gaussian data to rbn() and nparams().
* added "modelstring<-"().
* revised references in documentation.
* added the alarm and marks data sets.
* moved the scripts to generate data from the networks
included as data sets to the "network.scripts"
directory.
bnlearn (1.3)
* added Monte Carlo permutation tests for mutual
information (for both discrete and Gaussian data),
Pearson's X^2, linear correlation and Fisher's Z.
* added strength.plot().
* reimplemented random.graph() in C for speed.
* clean up of C memory allocation functions.
bnlearn (1.2)
* added cache.partial.structure() to selectively
update nodes' cached information stored in
'bn' objects.
* fixed a bug in cache.structure().
* reimplemented is.acyclic() in C to fix a bug
causing false negatives.
* added the lizards data set.
bnlearn (1.1)
* implemented mmpc().
* slightly changed gaussian.test to be more learning-friendly.
* fixed bugs in empty.graph() and "arcs<-"().
* changed the default probability of arc inclusion for
the "ordered" method in random.graph() to get sparser
graphs.
* added graphviz.plot().
* implemented the possibility of not learning arc directions
in constraint-based algorithms.
* changed the default value of the strict parameter
from TRUE to FALSE.
* reimplemented cache.structure() in C to increase
random.graph() performance and scalability.
bnlearn (1.0)
* completely rewritten random.graph(); now it supports
different generation algorithms with custom tuning
parameters.
* moved to dynamic memory allocation in C routines.
* improved performance and error handling of rbn().
bnlearn (0.9)
* reimplemented all the functions that deal with cycles
and paths in C, which increased their speed many times over
and greatly improved their memory use.
* parallelized cycle detection and elimination via the snow
package in gs(), iamb(), fast.iamb() and inter.iamb().
* renamed {root,leaf}nodes() to {root,leaf}.nodes().
* rewritten node ordering in C to improve performance
and avoid recursion.
* added ci.test(), which provides a frontend to all the
independence and conditional independence tests
implemented in the package.
* added mutual information (for Gaussian data) and
Pearson's X^2 tests (for discrete data).
* removed the Mantel-Haenszel test.
bnlearn (0.8)
* added support for random restarts in hc().
* added arc.strength(), with support for both conditional
independence tests and network scores.
* added the asia data set.
* lots of documentation updates.
* reimplemented functions related to undirected arcs in C
for speed.
* added more parameter sanitization.
bnlearn (0.7)
* optimized hc() via score caching, score equivalence,
and partial reimplementation in C.
* many utility functions' backends reimplemented in C
for speed.
* improved cycle and path detection.
* lots of documentation updates.
* added more parameter sanitization.
bnlearn (0.6)
* implemented hc().
* added support for the K2 score for discrete networks.
* ported Gaussian posterior density from the deal package.
* added the gaussian.test data set.
* added an AIC-based test for discrete data.
* lots of documentation updates.
* added more parameter sanitization.
bnlearn (0.5)
* added more utility functions, such as rootnodes(),
leafnodes(), acyclic(), empty.graph() and random.graph().
* reimplemented parents' configuration generation in C
for speed.
* lots of documentation updates.
* added lots of parameter sanitization in utils-sanitization.R.
bnlearn (0.4)
* added rbn(), with support for discrete data.
* added score(), with support for likelihood,
log-likelihood, AIC, BIC, and posterior Dirichlet
density of discrete networks.
* ported modelstring(), a string representation of a network,
from package deal.
* added many utility functions, such as parents() and children()
and their counterparts "parents<-"() and "children<-"().
* lots of documentation updates.
bnlearn (0.3)
* added support for the snow package in gs(), iamb(), inter.iamb()
and fast.iamb().
* added the learning.test data set.
* reimplemented mutual information in C for speed.
* lots of documentation updates.
bnlearn (0.2)
* implemented iamb(), inter.iamb() and fast.iamb().
* added partial correlation and Fisher's Z conditional
independence tests for Gaussian data.
* first completely documented release.
bnlearn (0.1)
* initial release.
* preliminary implementation of gs() with mutual information
as conditional independence test.