
How to do RFECV in scikit-learn with KFold, not StratifiedKFold?

from sklearn.cross_validation import StratifiedKFold, KFold
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression

rfecv = RFECV(estimator=LogisticRegression(), step=1,
              cv=StratifiedKFold(y, 10), scoring='accuracy')
rfecv.fit(X, y)

is an example to do RFECV with StratifiedKFold. The question is how to do RFECV with a normal KFold?

cv=KFold(y, 10) is not the answer, since KFold and StratifiedKFold take different constructor arguments: StratifiedKFold receives the label array y, while KFold receives the number of samples.

KFold(len(y), n_folds=n_folds) is the answer. So, for 10-fold it would be:

rfecv = RFECV(estimator=LogisticRegression(), step=1,
              cv=KFold(len(y), n_folds=10), scoring='accuracy')
rfecv.fit(X, y)
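Note that the sklearn.cross_validation module was later removed; in the current sklearn.model_selection API, KFold takes n_splits rather than the dataset length, and the data is only seen at split time. A minimal sketch of the same idea with the modern API (the synthetic data via make_classification is just for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

# Toy data so the example is self-contained.
X, y = make_classification(n_samples=100, n_features=10, random_state=0)

# Modern API: KFold(n_splits=...) instead of KFold(len(y), n_folds=...).
rfecv = RFECV(estimator=LogisticRegression(max_iter=1000), step=1,
              cv=KFold(n_splits=10), scoring='accuracy')
rfecv.fit(X, y)

print(rfecv.n_features_)  # number of features RFECV kept
```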

You could create your own CV strategy manually that mimics whatever KFold does:

def createCV():
    '''Returns a list of (train, test) index tuples, for example:

    [([0, 1, 2, 3, 4, 5, 6], [7]),
     ([0, 1, 2, 3, 4, 5], [6]),
     ([0, 1, 2, 3, 4], [5]),
     ([0, 1, 2, 3], [4]),
     ([0, 1, 2], [3])]

    where the first element of each tuple is the training set and the
    second is the test set.
    '''
    return [([0, 1, 2, 3, 4, 5, 6], [7]),
            ([0, 1, 2, 3, 4, 5], [6]),
            ([0, 1, 2, 3, 4], [5]),
            ([0, 1, 2, 3], [4]),
            ([0, 1, 2], [3])]

manual_cv = createCV()
rfecv = RFECV(estimator=LogisticRegression(), step=1, cv=manual_cv,
              scoring='accuracy')

You could even take the splits KFold gives you and rearrange them inside createCV to suit your CV needs.
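As a sketch of that last point, with the modern sklearn.model_selection API you can materialize KFold's splits into a plain list of (train, test) index pairs, rearrange the list, and pass it to RFECV, which accepts any iterable of splits as its cv argument (the toy data and the fold reversal are just illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

# Toy data so the example is self-contained.
X, y = make_classification(n_samples=60, n_features=8, random_state=0)

# Materialize KFold's splits into a concrete list of (train, test) arrays...
custom_cv = list(KFold(n_splits=5, shuffle=True, random_state=0).split(X, y))

# ...then rearrange it however you like, e.g. reverse the fold order.
custom_cv = custom_cv[::-1]

rfecv = RFECV(estimator=LogisticRegression(max_iter=1000), step=1,
              cv=custom_cv, scoring='accuracy')
rfecv.fit(X, y)
```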
