I want to ask: is it possible to continue an LSTM model built with Keras with a neural network written from scratch? Here is what my code looks like if I use the full library instead:
from keras.layers import Input, Embedding, LSTM, GlobalMaxPool1D, Dense

visible = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
embedding = Embedding(len(word_index) + 1, EMBEDDING_DIM, weights=[embedding_matrix],
                      input_length=MAX_SEQUENCE_LENGTH, trainable=False,
                      name='embeddings')(visible)
lstm, states_h, states_c = LSTM(60, return_sequences=True, return_state=True,
                                kernel_initializer="random_normal")(embedding)
pooling = GlobalMaxPool1D()(lstm)
hidden = Dense(10, activation='relu')(pooling)
output = Dense(2, activation='softmax')(hidden)
From the code above, is it possible to replace the Dense() layers with my own neural network written from scratch?
Your variable hidden is not defined in your snippet, so I don't know exactly what you want to accomplish here, but you should be able to do something like this:
visible = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
embedding = Embedding(len(word_index) + 1, EMBEDDING_DIM, weights=[embedding_matrix],
                      input_length=MAX_SEQUENCE_LENGTH, trainable=False,
                      name='embeddings')(visible)
lstm, states_h, states_c = LSTM(60, return_sequences=True, return_state=True,
                                kernel_initializer="random_normal")(embedding)
pooling = GlobalMaxPool1D()(lstm)
output = your_model(pooling)
You just have to make sure the input shape your model expects corresponds to the shape of pooling, which here is (batch_size, 60) after the GlobalMaxPool1D.
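If your from-scratch network is plain NumPy, one way to check that the shapes line up is to run its forward pass on an array shaped like the pooled output. A minimal sketch (the names `forward`, `W1`, etc. are my own, not from your code), reproducing the Dense(10, relu) → Dense(2, softmax) head:

```python
import numpy as np

# The 60 must match the pooled LSTM output width from the model above.
rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, size=(60, 10))  # pooled width -> 10 hidden units
b1 = np.zeros(10)
W2 = rng.normal(0, 0.1, size=(10, 2))   # 10 hidden units -> 2 classes
b2 = np.zeros(2)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=-1, keepdims=True)

def forward(x):
    """From-scratch forward pass replacing the two Dense layers."""
    hidden = relu(x @ W1 + b1)
    return softmax(hidden @ W2 + b2)

pooled = rng.normal(size=(4, 60))  # stand-in for a GlobalMaxPool1D batch
probs = forward(pooled)
print(probs.shape)  # (4, 2)
```

Note that wiring this into the Keras graph still requires wrapping it as a layer (see the edit below); as plain NumPy it can only run on the pooled activations after prediction, outside the Keras computation graph.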
EDIT: I didn't see that you wanted to build the network from scratch. In that case you have to write a custom Keras layer, which is well explained here: https://keras.io/guides/making_new_layers_and_models_via_subclassing/
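For reference, here is a minimal sketch of such a subclassed layer, assuming TensorFlow 2.x. The class name `ScratchDense` is my own; it implements a dense layer from its raw weight operations rather than reusing keras.layers.Dense:

```python
import tensorflow as tf

class ScratchDense(tf.keras.layers.Layer):
    """A dense layer written from scratch via Layer subclassing."""

    def __init__(self, units, activation=None, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.activation = tf.keras.activations.get(activation)

    def build(self, input_shape):
        # Weights are created lazily, once the input width is known.
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer="random_normal", trainable=True)
        self.b = self.add_weight(shape=(self.units,),
                                 initializer="zeros", trainable=True)

    def call(self, inputs):
        # The "from scratch" part: y = activation(x @ W + b).
        return self.activation(tf.matmul(inputs, self.w) + self.b)

# Drop-in replacement for the two Dense layers at the end of your model:
# hidden = ScratchDense(10, activation='relu')(pooling)
# output = ScratchDense(2, activation='softmax')(hidden)
```

Because the weights are registered with add_weight, they are trainable, so the whole model (frozen embedding, LSTM, and your from-scratch head) still trains end to end with model.fit.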