
Deep learning with k-fold cross-validation

I am new to neural networks, and I want to use k-fold cross-validation to train my network, with 5 folds, 50 epochs, and a batch size of 64. I found a function in scikit-learn for k-fold cross-validation:

model_selection.cross_val_score(model_kfold, x_train, y_train, cv=5)

My code without cross-validation is:

history = alexNet_model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs, verbose=1, validation_data=(x_validation, y_validation))

I don't know how to implement this with a batch size and epochs in Python using Keras and scikit-learn. Any ideas?

Be sure to use test data when validating your model, not the same training data. Using training data to validate will bias your results.
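For example, one way to carve out a held-out test set before any training is to split the data up front (a minimal sketch; the names x, y and the 80/20 split are assumptions, not taken from the question):

from sklearn.model_selection import train_test_split

# Hold out 20% of the data as a test set that is never used for training
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=42)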

In your example, I would wrap your model in the KerasClassifier wrapper so that it can be passed to scikit-learn's KFold and cross_val_score.

from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import KFold, cross_val_score

After you've imported these, your code would be (with the results printed at the end):

# Wrap the Keras model-building function so scikit-learn can train and score it
evaluator = KerasClassifier(build_fn=baseline_model, epochs=50, batch_size=64)
kfold = KFold(n_splits=5, shuffle=True, random_state=random_seed)

# Run 5-fold cross-validation and report the mean accuracy and standard deviation
results = cross_val_score(evaluator, x_test, onehot_y_test, cv=kfold)
print("Model: %.2f%% (%.2f%%)" % (results.mean() * 100, results.std() * 100))
