
Rewriting ParamSet ids from paradox (mlr3)

Let's say I have the following ParamSet object:

library(paradox)

my_ps = paradox::ps(
    minsplit = p_int(1, 64, logscale = TRUE),
    cp = p_dbl(1e-04, 1, logscale = TRUE))

Is it possible to rename minsplit to survTree.minsplit without changing anything else?

The reason is that I use some learners as part of a GraphLearner, so their parameter names change, and I would like some code that prepends the learner$id to the parameter names for later use in tuning (rather than rewriting them from scratch with the new names).
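For context, a minimal sketch of the renaming I mean, using regr.rpart in place of my survival learner (the exact ids printed may differ, but each parameter gains the PipeOp id as a prefix):

library(mlr3)
library(mlr3pipelines)

# wrapping a learner in a GraphLearner prefixes every parameter id with
# the PipeOp id (here "regr.rpart"; in my case it would be "survTree")
learner = lrn("regr.rpart")
glrn = as_learner(po("pca") %>>% po("learner", learner))
grep("^regr.rpart", glrn$param_set$ids(), value = TRUE)
#> e.g. "regr.rpart.minsplit" "regr.rpart.cp" ...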

I think I have a partial solution here. It is only partial because it does not support parameter transformations (trafos).

Where it works:

library(paradox)

my_ps = paradox::ps(
  minsplit = p_int(1, 64),
  cp = p_dbl(1e-04, 1)
)

my_ps$set_id = "john"

my_psc = ParamSetCollection$new(list(my_ps))

print(my_psc)
#> <ParamSetCollection>
#>               id    class lower upper nlevels        default value
#> 1: john.minsplit ParamInt 1e+00    64      64 <NoDefault[3]>      
#> 2:       john.cp ParamDbl 1e-04     1     Inf <NoDefault[3]>

Created on 2022-12-07 by the reprex package (v2.0.1)

Where it does not:

library(paradox)

my_ps = paradox::ps(
  minsplit = p_int(1, 64, logscale = TRUE),  # logscale = TRUE attaches a trafo
  cp = p_dbl(1e-04, 1)
)

my_ps$set_id = "john"

# fails: ParamSetCollection cannot handle member sets that carry a trafo
my_psc = ParamSetCollection$new(list(my_ps))
#> Error in .__ParamSetCollection__initialize(self = self, private = private, : Building a collection out sets, where a ParamSet has a trafo is currently unsupported!

Created on 2022-12-07 by the reprex package (v2.0.1)

The underlying problem is that we have not yet worked out how to reconcile the parameter transformations of the individual ParamSets with a possible parameter transformation of the ParamSetCollection itself.

I fear that there is currently no neat solution for your problem.
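If you only need the prefixed names for a tuning search space, the brute-force fallback (which the question hoped to avoid) is to restate the space with the new ids; logscale works fine when the set is written directly rather than collected:

library(paradox)

# brute-force fallback: restate the search space with the prefixed ids
# ("survTree" is the prefix from the question; adjust it to your PipeOp id)
search_space = ps(
  survTree.minsplit = p_int(1, 64, logscale = TRUE),
  survTree.cp       = p_dbl(1e-04, 1, logscale = TRUE)
)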

Sorry, I cannot comment yet. This is not exactly the solution you are looking for, but I hope it fixes the problem you are having.

You can set the tuning space in the learner (via to_tune()) before putting it into the graph, i.e. sticking with the original parameter names. After you create the GraphLearner as usual, it will have the desired search space under the prefixed names.

A concrete example:

library(mlr3verse)

learner = lrn("regr.rpart", cp = to_tune(0.1, 0.2))

glrn = as_learner(po("pca") %>>% po("learner", learner))

at = auto_tuner(
  "random_search",
  glrn,
  rsmp("holdout"),
  term_evals = 10
)

task = tsk("mtcars")

at$train(task)
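To confirm that the prefix is applied, you can inspect the search space the GraphLearner derives from the to_tune() token (a quick check using paradox's ParamSet$search_space(); the output comment below is paraphrased):

library(mlr3verse)

# same setup as above: the search space derived from the to_tune() token
# carries the prefixed parameter id
learner = lrn("regr.rpart", cp = to_tune(0.1, 0.2))
glrn = as_learner(po("pca") %>>% po("learner", learner))
glrn$param_set$search_space()
#> should show a single parameter named "regr.rpart.cp"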
