XGBoost hyperparameter tuning with RandomizedSearchCV with multiple classes
I am doing hyperparameter tuning for my model and my code is like this:
from sklearn.model_selection import RandomizedSearchCV
from xgboost import XGBClassifier

clf = XGBClassifier()

para_tuning = {
    'learning_rate': [0.01, 0.05, 0.1],
    'min_child_weight': [1, 5, 10],
    'gamma': [0.5, 1, 1.5, 2, 5],
    'subsample': [0.6, 0.8, 1.0],
    'colsample_bytree': [0.6, 0.8, 1.0],
    'max_depth': [3, 4, 5, 6, 7, 8, 9, 10],
    'n_estimators': [100, 200, 300, 400, 500],
    'objective': ['multi:softmax'],  # must be a list; a bare string gets sampled character by character
    'reg_alpha': [0, 2, 4, 6, 8],    # 'aplha' was a typo; the sklearn wrapper's name is reg_alpha
}
clf_rndcv = RandomizedSearchCV(clf,
                               param_distributions=para_tuning,
                               cv=5,
                               n_iter=5,
                               scoring='accuracy',
                               error_score=0,
                               verbose=3,
                               n_jobs=-1,
                               random_state=42)
clf_rndcv.fit(X_train, y_train)
It shows Fitting 5 folds for each of 5 candidates, totalling 25 fits. I suppose it just randomly picks 5 parameter combinations from the para_tuning dict and does a 5-fold CV for each? If I want to test all of the parameters, do I switch over to GridSearchCV? And any suggestions for the tuning? I am building a multi-class classifier with 100 classes and 500 samples per class, so 50,000 samples in total. Thanks!
Yes, if you want to search ALL the hyperparameter combinations you have to use GridSearchCV. GridSearchCV exhaustively tries every possible combination of the hyperparameters (which can be quite large in your case), whereas RandomizedSearchCV samples only n_iter combinations (5 here), which with cv=5 gives the 25 fits you saw.
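Switching is mostly a one-line change, but check the size first: the grid in the question has 3·3·5·3·3·8·5·5 = 81,000 combinations, i.e. 405,000 fits at cv=5. A minimal sketch (counting candidates with ParameterGrid; `clf`, `X_train`, and `y_train` are assumed to be as in the question):

```python
from sklearn.model_selection import GridSearchCV, ParameterGrid

para_tuning = {
    'learning_rate': [0.01, 0.05, 0.1],
    'min_child_weight': [1, 5, 10],
    'gamma': [0.5, 1, 1.5, 2, 5],
    'subsample': [0.6, 0.8, 1.0],
    'colsample_bytree': [0.6, 0.8, 1.0],
    'max_depth': [3, 4, 5, 6, 7, 8, 9, 10],
    'n_estimators': [100, 200, 300, 400, 500],
    'reg_alpha': [0, 2, 4, 6, 8],
}

# GridSearchCV tries every combination -- count them before committing:
n_candidates = len(ParameterGrid(para_tuning))
print(n_candidates)  # 81000 candidates -> 405000 fits with cv=5

# clf_grid = GridSearchCV(clf, param_grid=para_tuning, cv=5,
#                         scoring='accuracy', n_jobs=-1, verbose=3)
# clf_grid.fit(X_train, y_train)
```

With a grid this size, a common compromise is to raise `n_iter` in RandomizedSearchCV (e.g. to 50-100) rather than running the full grid.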
For your information, XGBoost has its own hyperparameter-tuning support: XGBoost CV (xgboost.cv).