
model.fit() on a Keras classification model with multiple inputs and a single output gives error: AttributeError: 'NoneType' object has no attribute 'fit'

I am constructing a Keras classification model with multiple inputs (three, actually) to predict a single output. Specifically, my three inputs are:

  1. Actors
  2. Plot Summary
  3. Relevant Movie Features

Output:

  1. Genre tags

All the above inputs and the single output are related to 10,000 IMDB Movies.

Even though the model is created successfully, when I try to fit it on my three different X_train arrays I get an AttributeError. I have one X_train/X_test pair for the actors, another X_train/X_test pair for the plot summary, and another for the movie features. My y_train and y_test are the same for all the inputs.
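For reference, Keras accepts multi-input training data either as a list of arrays in the same order as the model's inputs, or as a dict keyed by the input layer names. A minimal sketch using the variable and input names that appear in the code below:

    # List form: the order must match the order of inputs passed to keras.Model(...).
    model.fit([x_train_seq_actors, x_train_seq_plot, x_train_seq_features], y_train)

    # Dict form: keys must match the `name` given to each keras.Input(...).
    model.fit({'actors_input': x_train_seq_actors,
               'plot_input': x_train_seq_plot,
               'features_input': x_train_seq_features}, y_train)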

Python code (create the multiple-input Keras model)

def kera_multy_classification_model():

    sentenceLength_actors = 15
    vocab_size_frequent_words_actors = 20001

    sentenceLength_plot = 23
    vocab_size_frequent_words_plot = 17501

    sentenceLength_features = 69
    vocab_size_frequent_words_features = 20001

    model = keras.Sequential(name='Multy-Input Keras Classification model')

    actors = keras.Input(shape=(sentenceLength_actors,), name='actors_input')
    plot = keras.Input(shape=(sentenceLength_plot,), name='plot_input')
    features = keras.Input(shape=(sentenceLength_features,), name='features_input')

    emb1 = layers.Embedding(input_dim = vocab_size_frequent_words_actors + 1,
                            # based on keras documentation input_dim: int > 0. Size of the vocabulary, i.e. maximum integer index + 1.
                            output_dim = Keras_Configurations_model1.EMB_DIMENSIONS,
                            # int >= 0. Dimension of the dense embedding
                            embeddings_initializer = 'uniform', 
                            # Initializer for the embeddings matrix.
                            mask_zero = False,
                            input_length = sentenceLength_actors,
                            name="actors_embedding_layer")(actors)
    encoded_layer1 = layers.LSTM(100)(emb1)

    emb2 = layers.Embedding(input_dim = vocab_size_frequent_words_plot + 1,
                            output_dim = Keras_Configurations_model2.EMB_DIMENSIONS,
                            embeddings_initializer = 'uniform',
                            mask_zero = False,
                            input_length = sentenceLength_plot,
                            name="plot_embedding_layer")(plot)
    encoded_layer2 = layers.LSTM(100)(emb2)

    emb3 = layers.Embedding(input_dim = vocab_size_frequent_words_features + 1,
                            output_dim = Keras_Configurations_model3.EMB_DIMENSIONS,
                            embeddings_initializer = 'uniform',
                            mask_zero = False,
                            input_length = sentenceLength_features,
                            name="features_embedding_layer")(features)
    encoded_layer3 = layers.LSTM(100)(emb3)

    merged = layers.concatenate([encoded_layer1, encoded_layer2, encoded_layer3])

    layer_1 = layers.Dense(Keras_Configurations_model1.BATCH_SIZE, activation='relu')(merged)

    output_layer = layers.Dense(Keras_Configurations_model1.TARGET_LABELS, activation='softmax')(layer_1)

    model = keras.Model(inputs=[actors, plot, features], outputs=output_layer)

    print(model.output_shape)

    print(model.summary())

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['sparse_categorical_accuracy'])

Python code (fit the multiple-input Keras model on my inputs)

def fit_keras_multy_input(model, x_train_seq_actors, x_train_seq_plot, x_train_seq_features, x_test_seq_actors, x_test_seq_plot, x_test_seq_features, y_train, y_test):

    s = time()

    fit_model = model.fit([x_train_seq_actors, x_train_seq_plot, x_train_seq_features], y_train, 
                          epochs=Keras_Configurations_model1.NB_EPOCHS,
                          verbose = Keras_Configurations_model1.VERBOSE,
                          batch_size=Keras_Configurations_model1.BATCH_SIZE,
                          validation_data=([x_test_seq_actors, x_test_seq_plot, x_test_seq_features], y_test),
                          callbacks=callbacks)

    duration = time() - s
    print("\nTraining time finished. Duration {} secs".format(duration))

Model's Structure

(screenshot of the model structure / model.summary() output)

Error produced

(screenshot of the error traceback: AttributeError: 'NoneType' object has no attribute 'fit')

Note: each X_train and X_test is a sequence of numbers (text that has been tokenized).

Having done a little research, I believe the problem starts in the model.compile() function, although I am not sure what should be changed in my model's compile call to fix this.

Thank you in advance for any advice or help on this matter. Feel free to ask in the comments for any additional information I may have missed, to make this question more complete.

Your function kera_multy_classification_model() doesn't return anything, so after model = kera_multy_classification_model() you get model == None. None's type is NoneType, which really doesn't have a method called fit().

Just add return model at the end of kera_multy_classification_model().
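A minimal sketch of the corrected end of the function (everything above model.compile() stays as in the question):

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['sparse_categorical_accuracy'])

    return model  # without this line, the function implicitly returns None

    # The caller now receives a keras.Model instead of None:
    # model = kera_multy_classification_model()
    # ...and model.fit(...) in fit_keras_multy_input() can then be called on it as before.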
