I tried to create a model in TensorFlow 2.x using the functional API, but got an LSTM layer incompatibility error.
The error reads:
Input 0 of layer lstm_28 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, None, 15, 12]
In the LSTM layer, the input tf.nn.embedding_lookup(embedding, neighbor) has shape (15, 12), and one None is for the batch size. How does it end up with shape [None, None, 15, 12]? How do I deal with this error? Below is the dummy model that I created.
def create_model(embedding, embedding_dim, samp_size):
    node = Input(shape=(None,), dtype=tf.int64)
    neighbor = Input(shape=(None, samp_size), dtype=tf.int64)
    label = Input(shape=(None,), dtype=tf.int64)
    cell = LSTMCell(embedding_dim)  # unused
    _, h, c = LSTM(embedding_dim, return_sequences=True, return_state=True)(
        tf.nn.embedding_lookup(embedding, neighbor))
    predict_info = tf.squeeze(Dense(1, activation='relu')(h))
    return h
node_size = 1000
embedding_dim = 12
sampling_size = 15
embedding = tf.random.uniform([node_size, embedding_dim])
model = create_model(embedding, embedding_dim, sampling_size)
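For reference, the rank arithmetic behind the error can be checked without Keras: tf.nn.embedding_lookup(params, ids) gathers rows like NumPy's take along axis 0, so its output shape is ids.shape + params.shape[1:]. A minimal sketch (NumPy stands in for the TF op; the batch size 4 and extra time dimension 7 are arbitrary values I chose for illustration):

```python
import numpy as np

# tf.nn.embedding_lookup(params, ids) gathers rows like np.take(params, ids, axis=0),
# so the result shape is ids.shape + params.shape[1:].
node_size, embedding_dim, samp_size = 1000, 12, 15
embedding = np.random.rand(node_size, embedding_dim)

# Input(shape=(None, samp_size)) is symbolic shape (batch, None, samp_size): already 3-D.
# A concrete batch of such ids (batch=4, extra axis=7, both arbitrary):
neighbor_ids = np.random.randint(0, node_size, size=(4, 7, samp_size))

looked_up = np.take(embedding, neighbor_ids, axis=0)
print(looked_up.shape)  # (4, 7, 15, 12) -- rank 4, matching [None, None, 15, 12]
```

So the None in Input(shape=(None, samp_size)) is an extra axis on top of the implicit batch axis, which is why the lookup produces a 4-D tensor.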
When using the Keras functional API, do not include None for the batch dimension; Keras adds the batch axis itself. For example, if your input has dimensions (batch_size, image_w, image_h, image_channels), declare it like this:
inp = tf.keras.Input(shape=(IMG_W, IMG_H, IMG_CH))
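Applied to the question's model, a minimal sketch of the fix (create_fixed_model is my name, not from the original; I drop the unused node/label inputs and wrap the graph in tf.keras.Model): declaring Input(shape=(samp_size,)) means each sample is a vector of samp_size ids, so after the lookup the LSTM sees (batch, samp_size, embedding_dim), i.e. ndim=3.

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, LSTM, Dense

def create_fixed_model(embedding, embedding_dim, samp_size):
    # No None for the batch axis: shape=(samp_size,) gives the
    # symbolic shape (batch, samp_size).
    neighbor = Input(shape=(samp_size,), dtype=tf.int64)
    # The lookup appends the embedding axis:
    # (batch, samp_size) -> (batch, samp_size, embedding_dim), ndim=3.
    x = tf.nn.embedding_lookup(embedding, neighbor)
    _, h, c = LSTM(embedding_dim, return_sequences=True, return_state=True)(x)
    return tf.keras.Model(inputs=neighbor, outputs=h)

node_size, embedding_dim, samp_size = 1000, 12, 15
embedding = tf.random.uniform([node_size, embedding_dim])
model = create_fixed_model(embedding, embedding_dim, samp_size)

ids = tf.random.uniform([4, samp_size], maxval=node_size, dtype=tf.int64)
print(model(ids).shape)  # final hidden state h for a batch of 4: (4, 12)
```

If you do need a variable number of neighbor samples per step, keep the extra None axis but feed the result through a layer that expects 4-D input (or reshape before the LSTM) instead.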