
Why is Keras only doing 10 epochs when I set it to 300?

I'm using a combination of sklearn and Keras, with Theano as the Keras backend. Here is the code:

from __future__ import division  # must come before any other statement

import numpy as np
import pandas as pd
from pandas import Series, DataFrame
import keras
from keras.callbacks import EarlyStopping, ModelCheckpoint
from keras.constraints import maxnorm
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import SGD
from keras.wrappers.scikit_learn import KerasClassifier
from keras.utils.np_utils import to_categorical
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import StratifiedKFold
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from datetime import datetime, timedelta
import time

seed = 7
np.random.seed(seed)

Y = data['Genre']  # `data` is a pandas DataFrame loaded earlier (not shown here)
del data['Genre']
X = data

encoder = LabelEncoder()
encoder.fit(Y)
encoded_Y = encoder.transform(Y)

X = X.as_matrix().astype("float")

calls = [EarlyStopping(monitor='acc', patience=10),
         ModelCheckpoint('C:/Users/1383921/Documents/NNs/model',
                         monitor='acc', save_best_only=True, mode='auto', period=1)]

def create_baseline(): 
    # create model
    model = Sequential()
    model.add(Dense(18, input_dim=9, init='normal', activation='relu'))
    model.add(Dense(9, init='normal', activation='relu'))
    model.add(Dense(12, init='normal', activation='softmax'))
    # Compile model
    sgd = SGD(lr=0.01, momentum=0.8, decay=0.0, nesterov=False)
    model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])
    return model

np.random.seed(seed)
estimators = []
estimators.append(('standardize', StandardScaler()))
estimators.append(('mlp', KerasClassifier(build_fn=create_baseline, nb_epoch=300, batch_size=16, verbose=2)))
pipeline = Pipeline(estimators)
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
results = cross_val_score(pipeline, X, encoded_Y, cv=kfold, fit_params={'mlp__callbacks':calls})
print("Baseline: %.2f%% (%.2f%%)" % (results.mean()*100, results.std()*100))

The output when I run this last part is:

Epoch 1/10
...
Epoch 2/10

etc.

It's supposed to say Epoch 1/300, and it works just fine when I run it in a different notebook.

What do you think is happening? I did pass nb_epoch=300 ...

Which Keras version is this? If it's 2.0 or later, nb_epoch was renamed to epochs, so an nb_epoch argument is ignored and the epoch count falls back to its default of 10.

In Keras 2.0 the nb_epoch parameter was renamed to epochs, so if you set epochs=300 it runs 300 epochs. If you use nb_epoch=300, the wrapper doesn't recognize it and falls back to the default of 10.
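A minimal sketch of that fix, assuming Keras 2.x (only the KerasClassifier line from your code changes):

estimators.append(('mlp', KerasClassifier(build_fn=create_baseline,
                                          epochs=300,  # Keras 2 spelling; nb_epoch is the Keras 1 name
                                          batch_size=16, verbose=2)))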

Another solution to your problem: forget about nb_epoch (it doesn't work here) and pass epochs inside fit_params:

results = cross_val_score(pipeline, X, encoded_Y, cv=kfold,
                          fit_params={'epochs': 300, 'mlp__callbacks': calls})

And that would work: fit_params goes straight into the fit method, so the right epoch count reaches Keras.
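One hedged caveat: sklearn's Pipeline routes fit parameters to its steps by name, so depending on your sklearn version the bare 'epochs' key may be rejected and you may need the step-prefixed spelling, matching the existing 'mlp__callbacks' key:

results = cross_val_score(pipeline, X, encoded_Y, cv=kfold,
                          fit_params={'mlp__epochs': 300, 'mlp__callbacks': calls})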

The parameter name in your function should be epochs instead of nb_epoch. Be very careful though: I trained my ANN the old-fashioned way, declaring the parameter as nb_epoch=number, and it worked (the IPython console only showed me some warnings), but when I plugged the same parameter name into the cross_val_score function it did not work.
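For reference, a minimal sketch of the two spellings on a plain fit() call (model, X_train, and y_train are placeholders, assuming Keras 2.x):

model.fit(X_train, y_train, nb_epoch=300, batch_size=16)  # legacy Keras 1 name; Keras 2 accepts it with a deprecation warning
model.fit(X_train, y_train, epochs=300, batch_size=16)    # Keras 2 name; use this spelling with KerasClassifier / cross_val_score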

I think that what sklearn calls an "Epoch" here may be one step of your cross-validation, so it does 300 epochs of training 10 times :-) Is that possible? Try with verbose=1.
