
python sklearn GradientBoostingClassifier warm start error

I've trained a classifier on a data set with 1000 boosting iterations:

from sklearn.ensemble import GradientBoostingClassifier

clf = GradientBoostingClassifier(n_estimators=1000, learning_rate=0.05, subsample=0.1, max_depth=3)
clf.fit(X, y, sample_weight=train_weight)

Now I want to increase the number of iterations to 2000. So I do:

clf.set_params(n_estimators=2000, warm_start=True)
clf.fit(X, y, sample_weight=train_weight)

But I get the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-13-49cfdfd6c024> in <module>()

      1 start = time.clock()
      2 clf.set_params(n_estimators=2000, warm_start=True)
----> 3 clf.fit(X, y, sample_weight=train_weight)
      4 ...

C:\Anaconda3\lib\site-packages\sklearn\ensemble\gradient_boosting.py in fit(self, X, y, sample_weight, monitor)
   1002                                     self.estimators_.shape[0]))
   1003             begin_at_stage = self.estimators_.shape[0]
-> 1004             y_pred = self._decision_function(X)
   1005             self._resize_state()
   1006 

C:\Anaconda3\lib\site-packages\sklearn\ensemble\gradient_boosting.py in _decision_function(self, X)
   1120         # not doing input validation.
   1121         score = self._init_decision_function(X)
-> 1122         predict_stages(self.estimators_, X, self.learning_rate, score)
   1123         return score
   1124 

sklearn/ensemble/_gradient_boosting.pyx in sklearn.ensemble._gradient_boosting.predict_stages (sklearn\ensemble\_gradient_boosting.c:2564)()

ValueError: ndarray is not C-contiguous

What am I doing wrong here?

You're using warm_start properly; there's actually a bug in scikit-learn that prevents this from working.

The workaround in the meantime is to copy the array to a C-contiguous array:

import numpy as np

X_train = np.copy(X_train, order='C')
X_test = np.copy(X_test, order='C')
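To see why the copy helps, here is a minimal sketch (using a synthetic array, not the asker's data) of how a non-C-contiguous array, such as a Fortran-ordered one, triggers the mismatch, and how `np.copy(..., order='C')` fixes it:

```python
import numpy as np

# A Fortran-ordered array is not C-contiguous, which is the layout
# predict_stages rejects in the affected scikit-learn versions.
X = np.asfortranarray(np.arange(12, dtype=np.float64).reshape(3, 4))
print(X.flags['C_CONTIGUOUS'])   # False

# np.copy with order='C' returns a C-contiguous copy of the same data.
X_c = np.copy(X, order='C')
print(X_c.flags['C_CONTIGUOUS'])  # True
print(np.array_equal(X, X_c))     # True: same values, new memory layout
```

Only the memory layout changes; the values are identical, so model results are unaffected.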

Reference: discussion and bug

You usually cannot modify an sklearn classifier between fit calls and expect it to work. The number of estimators actually affects the size of the model's internal objects, so it is not just a number of iterations (from a programming point of view).

It seems to me that the problem is that you did not pass warm_start=True to the constructor. If you do:

clf = GradientBoostingClassifier(n_estimators=1000, learning_rate=0.05, subsample=0.1, max_depth=3, warm_start=True)

you'll be able to fit additional estimators using:

clf.set_params(n_estimators=2000)
clf.fit(X, y, sample_weight=train_weight)
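Putting the two calls together, here is a minimal end-to-end sketch (with synthetic data from `make_classification` and small estimator counts standing in for the asker's 1000/2000):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic data stands in for the original X, y.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Pass warm_start=True at construction time.
clf = GradientBoostingClassifier(n_estimators=10, learning_rate=0.05,
                                 max_depth=3, warm_start=True, random_state=0)
clf.fit(X, y)
print(clf.estimators_.shape[0])  # 10 fitted stages

# Raise n_estimators and refit: only the 10 new stages are trained,
# the existing ones are kept.
clf.set_params(n_estimators=20)
clf.fit(X, y)
print(clf.estimators_.shape[0])  # 20 fitted stages
```

After the second fit, `estimators_` has grown to the new `n_estimators`, confirming that the earlier stages were reused rather than retrained from scratch.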

If that does not work, you may need to update your sklearn version.
