
Multiclass classification in xgboost (python)

I can't figure out how to pass the number of classes or the eval metric to xgb.XGBClassifier with the objective function 'multi:softmax'.

I looked through a lot of documentation, but the only discussion of the sklearn wrapper I could find takes an n_class/num_class argument.

My current setup looks like this:

import xgboost as xgb
from sklearn import cross_validation, metrics

kf = cross_validation.KFold(y_data.shape[0],
    n_folds=10, shuffle=True, random_state=30)
err = []  # to hold cross val errors
# xgb instance
xgb_model = xgb.XGBClassifier(n_estimators=_params['n_estimators'],
    max_depth=_params['max_depth'], learning_rate=_params['learning_rate'],
    min_child_weight=_params['min_child_weight'],
    subsample=_params['subsample'],
    colsample_bytree=_params['colsample_bytree'],
    objective='multi:softmax', nthread=4)

# cv
for train_index, test_index in kf:
    xgb_model.fit(x_data[train_index], y_data[train_index], eval_metric='mlogloss')
    predictions = xgb_model.predict(x_data[test_index])
    actuals = y_data[test_index]
    err.append(metrics.accuracy_score(actuals, predictions))

You don't need to set num_class for XGBoost classification in the scikit-learn API; it is done automatically when fit is called. Look at the beginning of XGBClassifier's fit method in the xgboost source (the sklearn wrapper):

    evals_result = {}
    self.classes_ = np.unique(y)
    self.n_classes_ = len(self.classes_)

    xgb_options = self.get_xgb_params()

    if callable(self.objective):
        obj = _objective_decorator(self.objective)
        # Use default value. Is it really not used ?
        xgb_options["objective"] = "binary:logistic"
    else:
        obj = None

    if self.n_classes_ > 2:
        # Switch to using a multiclass objective in the underlying XGB instance
        xgb_options["objective"] = "multi:softprob"
        xgb_options['num_class'] = self.n_classes_
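
In other words, you can fit XGBClassifier directly on multiclass labels and pass eval_metric to fit; the wrapper infers the number of classes from y. Below is a minimal sketch, using sklearn's built-in iris dataset purely for illustration (and the newer model_selection API rather than the deprecated cross_validation module used in the question):

import xgboost as xgb
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Three-class data; note that num_class is never passed explicitly.
x_data, y_data = load_iris(return_X_y=True)
x_train, x_test, y_train, y_test = train_test_split(
    x_data, y_data, test_size=0.2, random_state=30)

# The wrapper sets n_classes_ from y and configures the booster itself.
clf = xgb.XGBClassifier(n_estimators=100, max_depth=3,
    learning_rate=0.1, objective='multi:softmax', nthread=4)

# eval_metric is forwarded to the underlying booster by fit (older API).
clf.fit(x_train, y_train, eval_metric='mlogloss')

predictions = clf.predict(x_test)
print(accuracy_score(y_test, predictions))

Note that in recent xgboost releases eval_metric is expected as a constructor argument (xgb.XGBClassifier(..., eval_metric='mlogloss')) rather than a fit argument, and nthread has been superseded by n_jobs; the sketch above follows the older API shown in the question.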
