ValueError: Input 0 of layer sequential_33 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [64, 100]
I am following this guide to learn how to build a simple RNN. Unlike the guide, I just want my model to predict the next int in an ascending sequence (e.g. x = [1,2,3], y = [2,3,4]). But when attempting to train my model I receive this error message:
ValueError: Input 0 of layer sequential_33 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [64, 100]
As in the guide, my dataset has shape:
<BatchDataset shapes: ((64, 100), (64, 100)), types: (tf.int64, tf.int64)>
A little different from the guide, my model is defined as:
BATCH_SIZE = 64
n_neurons = 101
model = Sequential()
# shape [batch_size, timesteps, features]
model.add(Input(batch_input_shape = (BATCH_SIZE,100,1)))
model.add(LSTM(n_neurons ,return_sequences=True, stateful=True))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
print(model.summary())
with the summary being:
Model: "sequential_37"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm_40 (LSTM) (64, 100, 101) 41612
_________________________________________________________________
dense_28 (Dense) (64, 100, 1) 102
=================================================================
Total params: 41,714
Trainable params: 41,714
Non-trainable params: 0
_________________________________________________________________
None
Could you help me understand why I get this error, and how to fix it?
I have made sure that the dataset has the same dimensions as in the guide, and provided the Input layer with batch_input_shape=(BATCH_SIZE, 100, 1) because I learned that LSTMs need at least 3D data with shape [batch_size, timesteps, features]. So I am confused about where I'm still going wrong.
Any help would be appreciated!
You should feed shape (64, 100, 1) to the model instead of (64, 100). Just add a dimension to your data.
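A minimal sketch of adding that trailing "features" axis. NumPy is used here just to illustrate the shape change; the commented-out line shows the analogous tf.data fix, assuming your dataset yields (x, y) pairs as in the question:

```python
import numpy as np

# A toy batch shaped like the dataset in the question: (batch, timesteps)
x = np.arange(64 * 100, dtype=np.int64).reshape(64, 100)
print(x.shape)   # (64, 100) -> what the LSTM currently receives (ndim=2)

# Add a trailing features axis so the LSTM sees (batch, timesteps, features)
x3 = x[..., np.newaxis]          # equivalent to np.expand_dims(x, -1)
print(x3.shape)  # (64, 100, 1) -> matches batch_input_shape=(64, 100, 1)

# With a tf.data pipeline, the same reshape is applied inside map(), e.g.:
# dataset = dataset.map(lambda x, y: (tf.expand_dims(x, -1),
#                                     tf.expand_dims(y, -1)))
```

After this, the dataset shape ((64, 100, 1), (64, 100, 1)) lines up with the 3D input the LSTM layer expects.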