
mlr - Ensemble Models

The mlr package is great, and the idea behind the ModelMultiplexer also helps. But the ModelMultiplexer "selects" a single model out of the models it is given.

Is there any support, or planned support, for creating a bagged or boosted ensemble of the individual models?

library(mlr)

# Candidate base learners to choose between
bls = list(
  makeLearner("classif.ksvm"),
  makeLearner("classif.randomForest")
)
lrn = makeModelMultiplexer(bls)

# Joint tuning space: sigma belongs to ksvm, ntree to randomForest
ps = makeModelMultiplexerParamSet(lrn,
  makeNumericParam("sigma", lower = -10, upper = 10, trafo = function(x) 2^x),
  makeIntegerParam("ntree", lower = 1L, upper = 500L))
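The call that actually produces res is omitted in the question; a minimal sketch of the missing tuning step, where iris.task, 3-fold CV, and a random-search budget of 100 evaluations are illustrative assumptions rather than part of the original post:

rdesc = makeResampleDesc("CV", iters = 3L)    # (assumption) 3-fold cross-validation
ctrl = makeTuneControlRandom(maxit = 100L)    # (assumption) random search budget
res = tuneParams(lrn, task = iris.task, resampling = rdesc,
  par.set = ps, control = ctrl)               # (assumption) iris.task as example data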
> print(res)
Tune result:
Op. pars: selected.learner=classif.randomForest; classif.randomForest.ntree=197
mmce.test.mean=0.0333

You have a few options for this in mlr. If you have a single model, you can use the BaggingWrapper:

lrn = makeLearner("classif.PART")
# 50 bagging iterations, each drawing 80% of the observations with replacement
# and using 3/4 of the features
bag.lrn = makeBaggingWrapper(lrn, bw.iters = 50, bw.replace = TRUE, bw.size = 0.8, bw.feats = 3/4)
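To check how the bagged learner performs, a minimal sketch, assuming mlr's built-in sonar.task and 5-fold cross-validation (classif.PART additionally requires the RWeka package):

rdesc = makeResampleDesc("CV", iters = 5L)                  # (assumption) 5-fold CV
r = resample(bag.lrn, sonar.task, rdesc, measures = mmce)   # (assumption) sonar.task as example data
print(r)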

More details on this in the tutorial.

For several learners, you can use stacking:

base.learners = list(
  makeLearner("classif.ksvm"),
  makeLearner("classif.randomForest")
)
# Defaults written out explicitly; method = "stack.nocv" combines the base
# learners' training-set predictions, while "stack.cv" would use
# cross-validated predictions instead
lrn = makeStackedLearner(base.learners, super.learner = NULL, predict.type = NULL,
  method = "stack.nocv", use.feat = FALSE, resampling = NULL,
  parset = list())

You can combine the predictions of the base learners using different methods, including fitting another learner on top of them. You can also combine this with bagging for the individual learners.
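As a hypothetical sketch of both ideas, fitting a logistic-regression super learner on cross-validated base-learner predictions and bagging one of the base learners (the super learner choice, the bagging settings, and sonar.task are assumptions, not part of the answer):

# (assumption) bag the random forest base learner with 10 iterations
bagged.rf = makeBaggingWrapper(makeLearner("classif.randomForest"), bw.iters = 10L)
stack.lrn = makeStackedLearner(
  base.learners = list(makeLearner("classif.ksvm"), bagged.rf),
  super.learner = "classif.logreg",   # (assumption) logistic regression on top
  method = "stack.cv")                # super learner sees cross-validated predictions
mod = train(stack.lrn, sonar.task)    # (assumption) sonar.task as example data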

Boosting is supported by a number of the learners that mlr wraps; see the list of all learners.
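For example, several boosting implementations are available as ordinary mlr learners (they require the corresponding packages to be installed):

lrn.gbm = makeLearner("classif.gbm")      # gradient boosting via the gbm package
lrn.xgb = makeLearner("classif.xgboost")  # gradient boosting via xgboost
# listLearners("classif") prints the full table of available classification learners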
