Keras functional API: what should the Input layer be for the Embedding layer?
I am using the Keras functional API to build a neural network that takes a word embedding layer as input for a sentence classification task. My code breaks right at the point where the input connects to the embedding layer. Following the tutorial at https://medium.com/tensorflow/predicting-the-price-of-wine-with-the-keras-functional-api-and-tensorflow-a95d1c2c1b03, I have the following code:
max_seq_length = 100  # i.e., a sentence has at most 100 words
word_weight_matrix = ...  # shape (9825, 300): the vocabulary has 9825 words, each a 300-dimension vector
deep_inputs = Input(shape=(max_seq_length,))
embedding = Embedding(9825, 300, input_length=max_seq_length,
                      weights=word_weight_matrix, trainable=False)(deep_inputs)  # line A
hidden = Dense(targets, activation="softmax")(embedding)
model = Model(inputs=deep_inputs, outputs=hidden)
Line A then raises the following error:
ValueError: You called `set_weights(weights)` on layer "embedding_1" with a weight list of length 9825, but the layer was expecting 1 weights. Provided weights: [[-0.04057981 0.05743935 0.0109863 ..., 0.0072...
I really don't understand what this error means...
It seems the input layer is not defined correctly. Previously, when I used a Sequential model with exactly the same Embedding layer definition, everything worked fine. But when I switched to the functional API, I got this error.
Any help is much appreciated; thanks in advance.
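A likely cause (shown here with a NumPy stand-in, not the real embedding matrix): Keras treats the `weights` argument as a *list* of arrays, one per weight variable of the layer. Passing the raw 2-D matrix makes Keras iterate over its first axis, so it sees 9825 separate "weights" instead of the single (9825, 300) array an Embedding layer expects:

```python
import numpy as np

# Stand-in for the real pretrained matrix (assumed shape from the question)
word_weight_matrix = np.zeros((9825, 300))

# Iterating a 2-D array yields its rows, so its "length" is 9825 ...
print(len(word_weight_matrix))    # 9825 -- interpreted as 9825 weight arrays
# ... while wrapping it in a list yields one entry: one weight array.
print(len([word_weight_matrix]))  # 1
```

This matches the error text: "a weight list of length 9825, but the layer was expecting 1 weights".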
Try this updated code: you have to use len(vocabulary) + 1 in the Embedding layer, and weights=[word_weight_matrix]:
max_seq_length = 100  # i.e., a sentence has at most 100 words
word_weight_matrix = ...  # shape (9825, 300): the vocabulary has 9825 words, each a 300-dimension vector
deep_inputs = Input(shape=(max_seq_length,))
embedding = Embedding(9826, 300, input_length=max_seq_length,
                      weights=[word_weight_matrix], trainable=False)(deep_inputs)  # line A
hidden = Dense(targets, activation="softmax")(embedding)
model = Model(inputs=deep_inputs, outputs=hidden)
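One common reason for the len(vocabulary) + 1 convention (an assumption here, not stated in the answer) is that index 0 is reserved for padding. If you set input_dim to 9826, the weight matrix itself must also have 9826 rows; a minimal sketch of one way to do that, using a random stand-in matrix:

```python
import numpy as np

vocab_size, embed_dim = 9825, 300
# Stand-in for the real pretrained matrix
word_weight_matrix = np.random.rand(vocab_size, embed_dim)

# Prepend an all-zero row for the padding index 0, giving shape (9826, 300)
# to match Embedding(9826, 300). Real word indices then start at 1.
padded_matrix = np.vstack([np.zeros((1, embed_dim)), word_weight_matrix])
print(padded_matrix.shape)  # (9826, 300)
```

With this layout, sequences shorter than max_seq_length can be zero-padded without the padding positions picking up a real word vector.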