
Laplace smoothing for Bayesian Networks in bnlearn

I'm working with Bayesian networks in R, currently using the bnlearn framework. I want to use score-based structure learning from data and try different algorithms and approaches.

I would like to know whether Laplace smoothing is implemented in bnlearn. I could not find any information about it in the documentation. Am I missing something? Does anyone know?

No, it is not. However, this should be no problem, as several priors are available in bnlearn and, unless you have some very specific reason to use Laplace smoothing, which is just one particular prior, these should do.

Once you have a structure, you learn the parameters with the bn.fit() function. Setting method = "bayes" uses Bayesian estimation, and the optional argument iss determines the prior. The documentation defines iss as "the imaginary sample size used by the bayes method to estimate the conditional probability tables (CPTs) associated with discrete nodes".

As an example, consider a binary root node X in some network. bn.fit() returns (Nx + iss / cptsize) / (N + iss) as the probability of X = x, where N is your number of samples, Nx the number of samples with X = x, and cptsize the cardinality of X; in this case cptsize = 2 because X is binary. Laplace correction would require that iss / cptsize always equal 1. Yet bnlearn uses the same iss value for all CPTs, so iss / cptsize will equal 1 for every node only if all variables have the same cardinality. Thus, for binary variables, you can indeed obtain Laplace correction by setting iss = 2. In the general case, however, it is not possible.
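To make the arithmetic concrete, here is a minimal Python sketch of the two estimators (this is not bnlearn code; the function names are made up for illustration). It shows that the Bayesian estimate coincides with Laplace smoothing exactly when iss equals the node's cardinality:

```python
def bayes_estimate(counts, iss):
    """Bayesian estimate as described above: (Nx + iss/cptsize) / (N + iss).
    counts maps each level of X to its observed count."""
    n = sum(counts.values())
    cptsize = len(counts)  # cardinality of X
    return {x: (nx + iss / cptsize) / (n + iss) for x, nx in counts.items()}

def laplace_estimate(counts):
    """Classic Laplace (add-one) smoothing: (Nx + 1) / (N + cptsize)."""
    n = sum(counts.values())
    cptsize = len(counts)
    return {x: (nx + 1) / (n + cptsize) for x, nx in counts.items()}

binary = {"yes": 7, "no": 3}
# For a binary node, iss = 2 gives iss/cptsize = 1, i.e. Laplace correction:
print(bayes_estimate(binary, iss=2) == laplace_estimate(binary))

ternary = {"low": 5, "mid": 3, "high": 2}
# For a ternary node, iss = 2 no longer matches Laplace (iss/cptsize = 2/3):
print(bayes_estimate(ternary, iss=2) == laplace_estimate(ternary))
```

Since bnlearn applies a single iss to all CPTs, no single value can equal every node's cardinality in a network with mixed cardinalities, which is why Laplace correction cannot be reproduced in general.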

See bnlearn::bn.fit difference and calculation of methods "mle" and "bayes" for some additional details.
