
How can I retrieve the hyper-parameters that were used to train this xgboost booster type model?

I have an xgboost model that is already trained. It was trained with the xgboost native API. I am trying to find the hyper-parameters with which the model was trained; most specifically, I want to retrieve the objective of the trained model.

import xgboost as xgb

xgb.__version__      # returns '1.7.2'
type(model)          # returns xgboost.core.Booster
model.params()       # AttributeError: 'Booster' object has no attribute 'params'
model.get_params()   # AttributeError: 'Booster' object has no attribute 'get_params'

How can I retrieve the hyper-parameters that were used to train this xgboost booster type model?

You may find these of interest (a short sketch of calling them follows the list):

model.attributes()

model.num_features()

model.feature_names

model.feature_types
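
As a rough sketch, assuming model is the trained xgboost.core.Booster from the question:

model.attributes()      # user-set key/value attributes stored on the booster, {} if none
model.num_features()    # number of features the booster was trained on
model.feature_names     # list of feature names, or None if they were not set
model.feature_types     # list of feature types, or None if they were not set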

Also, several methods for persisting a model can tell you about what it contains, including (see the sketch after this list):

  • .save_config
  • .save_model
  • .save_raw
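
Of these, .save_config() is probably the most direct route to the training parameters, including the objective. A minimal sketch, assuming model is the Booster from the question; note that the exact JSON layout is version-dependent, and the paths below are what I would expect for xgboost 1.7.x rather than a stable API:

import json

config = json.loads(model.save_config())   # full internal configuration as a JSON string

# Training-time parameters live under "learner"; the objective appears both in
# learner_train_param and as the objective's own name (layout may vary by version).
print(config["learner"]["learner_train_param"]["objective"])   # e.g. 'binary:logistic'
print(config["learner"]["objective"]["name"])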

.get_dump() : https://xgboost.readthedocs.io/en/stable/python/python_api.html#xgboost.Booster.get_dump

... the output format is primarily used for visualization or interpretation, hence it's more human readable ...
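
.get_dump() will not give you the training hyper-parameters, but it is handy for inspecting the trees themselves. A small sketch, again assuming model is the Booster from the question:

trees = model.get_dump(with_stats=True)   # one human-readable string per tree
print(len(trees))                         # number of trees in the model
print(trees[0])                           # split structure of the first tree

Passing dump_format="json" returns the same information as JSON strings instead of indented text.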


Let us know what you found!
