
Invalid parameter min_impurity_decrease for estimator

from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, BaggingClassifier, AdaBoostClassifier

DTC = DecisionTreeClassifier(max_depth=15)
RFC = RandomForestClassifier(max_depth=15)
BC = BaggingClassifier(base_estimator=DTC)
ADB = AdaBoostClassifier(base_estimator=DTC)

params = {
    "n_estimators": [10, 20, 100],
    'max_features': [1.0, 2.0, 3.0],
    "min_impurity_decrease": [1.0, 2.0, 3.0]
}

GSC1 = GridSearchCV(estimator=BC, param_grid=params, cv=5)
GSC1.fit(X_Train, Y_Train)

GSC_BC_Score = GSC1.score(X_Test, Y_Test)
Invalid parameter min_impurity_decrease for estimator BaggingClassifier(
    base_estimator=DecisionTreeClassifier(ccp_alpha=0.0, class_weight=None,
        criterion='gini', max_depth=15, max_features=None,
        max_leaf_nodes=None, min_impurity_decrease=0.0,
        min_impurity_split=None, min_samples_leaf=1, min_samples_split=2,
        min_weight_fraction_leaf=0.0, presort='deprecated',
        random_state=None, splitter='best'),
    bootstrap=True, bootstrap_features=False, max_features=1.0,
    max_samples=1.0, n_estimators=10, n_jobs=None, oob_score=False,
    random_state=None, verbose=0, warm_start=False).

Check the list of available parameters with estimator.get_params().keys().

In your case, you can check the keys: parameters meant for the wrapped DecisionTreeClassifier carry the prefix base_estimator__ .

BC.get_params().keys()

dict_keys(['base_estimator__ccp_alpha', 'base_estimator__class_weight', 'base_estimator__criterion', 'base_estimator__max_depth', 'base_estimator__max_features', 'base_estimator__max_leaf_nodes', 'base_estimator__min_impurity_decrease', 'base_estimator__min_impurity_split', 'base_estimator__min_samples_leaf', 'base_estimator__min_samples_split', 'base_estimator__min_weight_fraction_leaf', 'base_estimator__random_state', 'base_estimator__splitter', 'base_estimator', 'bootstrap', 'bootstrap_features', 'max_features', 'max_samples', 'n_estimators', 'n_jobs', 'oob_score', 'random_state', 'verbose', 'warm_start'])
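A minimal sketch of this double-underscore routing convention: a parameter named "<inner>__<subparam>" is forwarded to the wrapped estimator. Note that the wrapped model is exposed as base_estimator in the scikit-learn release used here; newer releases (1.2+) renamed it to estimator, so the snippet detects which name the installed version uses.

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier

# Detect the wrapped-model parameter name: "base_estimator" on older
# scikit-learn releases, "estimator" on 1.2 and later.
bc = BaggingClassifier()
inner = "base_estimator" if "base_estimator" in bc.get_params() else "estimator"

# Set the wrapped model, then route a parameter to it via the
# "<inner>__<subparam>" convention that GridSearchCV also uses:
bc.set_params(**{inner: DecisionTreeClassifier()})
bc.set_params(**{inner + "__max_depth": 7})

print(bc.get_params()[inner + "__max_depth"])  # 7
```

The same "<inner>__<subparam>" names are exactly what belongs in a GridSearchCV param_grid.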

So in your case, I assume you want n_estimators to go to the bagging classifier and the rest to the decision tree, so it goes:

from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

DTC = DecisionTreeClassifier(max_depth=15)
BC = BaggingClassifier(base_estimator=DTC)

X,y = make_classification()

params = {
    "n_estimators": [10, 20, 100],
    'base_estimator__max_features': [2, 5, 10],
    "base_estimator__min_impurity_decrease": [1.0, 2.0, 3.0]
}

GSC1 = GridSearchCV(estimator=BC, param_grid=params , cv=5)
GSC1.fit(X, y)
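Once the search has been fitted, the winning combination (including the routed base_estimator__* values) can be read off the GridSearchCV object. A self-contained sketch with a deliberately small grid for speed; as above, the wrapped-model parameter name is detected at runtime since it was renamed to estimator in scikit-learn 1.2:

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

# "base_estimator" on older scikit-learn, "estimator" on 1.2+.
bc = BaggingClassifier()
prefix = "base_estimator" if "base_estimator" in bc.get_params() else "estimator"
bc.set_params(**{prefix: DecisionTreeClassifier(max_depth=15)})

X, y = make_classification(random_state=0)

# A small grid: n_estimators stays on the bagging level, the prefixed
# key is routed to the wrapped decision tree.
params = {
    "n_estimators": [5, 10],
    prefix + "__max_features": [2, 5],
}

gsc = GridSearchCV(estimator=bc, param_grid=params, cv=3)
gsc.fit(X, y)

# Best hyperparameters found, nested keys included:
print(gsc.best_params_)
```

gsc.best_estimator_ holds the refitted winning model, ready for scoring on held-out data.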
