
Passing input_dim to KerasClassifier (sklearn wrapper/interface)

I originally tried the same approach and ran into the same error as this SO questioner. However, using the accepted (and only) answer there gave me a different error: "input_dim is not a legal parameter."

I then tried the solution from the original question ("add an input_dim keyword argument to the KerasClassifier constructor") and got the same error again. Am I doing something wrong, or is there now a different way to pass the first layer's input_dim through the sklearn wrapper KerasClassifier?

Minimal code example below:

from keras.models import Sequential
from keras.layers import Dense
from sklearn import datasets
from keras.wrappers.scikit_learn import KerasClassifier
import numpy as np


def create_model():
    # create model
    model = Sequential()
    model.add(Dense(12, input_dim=4, init='uniform', activation='relu'))
    model.add(Dense(6, init='uniform', activation='relu'))
    model.add(Dense(1, init='uniform', activation='sigmoid'))

    # Compile model
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

    return model

# Error thrown here:
model = KerasClassifier(build_fn=create_model, input_dim=5, nb_epoch=150, batch_size=10, verbose=0)

ValueError: input_dim is not a legal parameter

You need to declare input_dim as a parameter of create_model(). KerasClassifier only accepts keyword arguments whose names appear in the build_fn signature (or in the underlying Keras fit/predict methods), so an unrecognized input_dim raises the "not a legal parameter" error.

def create_model(input_dim):
    # create model
    model = Sequential()
    # model.add(Dense(12, input_dim=4, init='uniform', activation='relu'))
    model.add(Dense(12, input_dim=input_dim, init='uniform', activation='relu'))

Note that you don't actually have to use input_dim inside create_model just to make the error go away; declaring it as a parameter is enough.
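For reference, here is a complete end-to-end sketch under the same assumptions as the question (Keras 1-style keyword names init and nb_epoch; on Keras 2+ these become kernel_initializer and epochs). The training data is randomly generated purely to show that KerasClassifier forwards input_dim to create_model:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier


def create_model(input_dim):
    # input_dim is now part of the signature, so KerasClassifier accepts it
    model = Sequential()
    model.add(Dense(12, input_dim=input_dim, init='uniform', activation='relu'))
    model.add(Dense(6, init='uniform', activation='relu'))
    model.add(Dense(1, init='uniform', activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

# input_dim is matched against create_model's signature and passed through
model = KerasClassifier(build_fn=create_model, input_dim=4,
                        nb_epoch=150, batch_size=10, verbose=0)

# dummy binary-classification data with 4 features (illustration only)
X = np.random.rand(100, 4)
y = np.random.randint(0, 2, size=100)
model.fit(X, y)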
