How to reshape a tensor for the LSTM layer of a recurrent neural network
I am trying to train an RNN. My X input has shape (5018, 481) and my y label has shape (5018,). I have converted both X and y to tensors as follows:
x_train_tensor = tf.convert_to_tensor(X, dtype=tf.float32)
y_train_tensor = tf.convert_to_tensor(y, dtype=tf.float32)
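As a quick sanity check (a hypothetical sketch with random NumPy data standing in for the real X and y), these are the shapes going into `convert_to_tensor`:

```python
import numpy as np

# Stand-in data with the same shapes as in the question:
# 5018 samples, each with 481 features, and one label per sample.
X = np.random.rand(5018, 481).astype(np.float32)
y = np.random.rand(5018).astype(np.float32)

print(X.shape)  # (5018, 481) -> 2 dims: (samples, features)
print(y.shape)  # (5018,)     -> 1 dim:  (samples,)
```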
Then I use the following Keras RNN model architecture:
model = keras.Sequential([
    keras.layers.Dense(100, activation='elu', input_shape=(481,)),
    keras.layers.LSTM(64, return_sequences=False, dropout=0.1, recurrent_dropout=0.1),
    keras.layers.Dense(25, activation='elu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(1, activation='elu')
])
opt = keras.optimizers.Adam(learning_rate=0.001)
model.compile(optimizer=opt, loss='mean_squared_error', metrics=['mse'])
model.fit(x_train_tensor, y_train_tensor, epochs=8)
I get the following error:
ValueError: Input 0 of layer lstm_1 is incompatible with the layer:
expected ndim=3, found ndim=2. Full shape received: [None, 100]
Does anyone have a solution?
The LSTM layer expects 3 dimensions. If you print the summary of the network without the LSTM layer:
model = keras.Sequential([
    keras.layers.Dense(100, activation='elu', input_shape=(481,)),
    keras.layers.Dense(25, activation='elu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(1, activation='elu')
])
opt = keras.optimizers.Adam(learning_rate=0.001)
model.compile(optimizer=opt, loss='mean_squared_error', metrics=['mse'])
model.summary()
you get:
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_19 (Dense)             (None, 100)               48200
_________________________________________________________________
dense_20 (Dense)             (None, 25)                2525
_________________________________________________________________
dropout_7 (Dropout)          (None, 25)                0
_________________________________________________________________
dense_21 (Dense)             (None, 1)                 26
=================================================================
Total params: 50,751
Trainable params: 50,751
Non-trainable params: 0
_________________________________________________________________
so the output of the first layer has 2 dimensions, (None, 100). The error message says the LSTM layer requires 3 dimensions, so expand by one dimension:
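Conceptually, this is the same as appending a trailing axis of size 1. A minimal NumPy sketch (with a random array standing in for a batch of the (None, 100) Dense activations):

```python
import numpy as np

# Stand-in for a batch of 32 Dense-layer outputs of width 100.
batch = np.random.rand(32, 100)
print(batch.shape)     # (32, 100)    -> 2 dims, rejected by the LSTM

# Add a trailing feature axis, like keras.layers.Reshape((100, 1)) does,
# so the LSTM sees 100 timesteps of 1 feature each.
expanded = np.expand_dims(batch, axis=-1)
print(expanded.shape)  # (32, 100, 1) -> 3 dims: (batch, timesteps, features)
```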
model = keras.Sequential([
    keras.layers.Dense(100, activation='elu', input_shape=(481,)),
    keras.layers.Reshape((100, 1)),
    keras.layers.LSTM(64, return_sequences=False, dropout=0.1, recurrent_dropout=0.1),
    keras.layers.Dense(25, activation='elu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(1, activation='elu')
])
opt = keras.optimizers.Adam(learning_rate=0.001)
model.compile(optimizer=opt, loss='mean_squared_error', metrics=['mse'])
model.summary()
and you'll get:
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_16 (Dense)             (None, 100)               48200
_________________________________________________________________
reshape_2 (Reshape)          (None, 100, 1)            0
_________________________________________________________________
lstm_4 (LSTM)                (None, 64)                16896
_________________________________________________________________
dense_17 (Dense)             (None, 25)                1625
_________________________________________________________________
dropout_6 (Dropout)          (None, 25)                0
_________________________________________________________________
dense_18 (Dense)             (None, 1)                 26
=================================================================
Total params: 66,747
Trainable params: 66,747
Non-trainable params: 0
_________________________________________________________________
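The parameter counts in that summary can be checked by hand using the standard formulas: a Dense layer has in × out weights plus out biases, and an LSTM has four gates, each with a kernel, a recurrent kernel, and a bias:

```python
# Dense(100) fed 481 input features: weights + biases.
dense_16 = 481 * 100 + 100          # 48200

# LSTM(64) fed sequences of 1 feature: 4 gates, each with a
# kernel (1 x 64), a recurrent kernel (64 x 64), and a bias (64),
# i.e. 4 * (units * (features + units) + units).
lstm_4 = 4 * (64 * (1 + 64) + 64)   # 16896

# Dense(25) fed the 64 LSTM units, then Dense(1) fed 25 units.
dense_17 = 64 * 25 + 25             # 1625
dense_18 = 25 * 1 + 1               # 26

total = dense_16 + lstm_4 + dense_17 + dense_18
print(total)                        # 66747, matching the summary
```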
Hope this helps.
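As an aside, another common option (not part of the answer above, just a sketch under the assumption that the 481 feature columns should be treated as timesteps) is to reshape the raw input to (481, 1) and let the LSTM come first, dropping the leading Dense layer:

```python
import tensorflow as tf
from tensorflow import keras

# Hypothetical alternative: feed the 481 features directly to the LSTM
# as 481 timesteps of 1 feature each.
model = keras.Sequential([
    keras.layers.Reshape((481, 1), input_shape=(481,)),
    keras.layers.LSTM(64, dropout=0.1, recurrent_dropout=0.1),
    keras.layers.Dense(25, activation='elu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(1, activation='elu'),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
              loss='mean_squared_error', metrics=['mse'])

# The model accepts the original (batch, 481) input directly.
out = model(tf.zeros((2, 481)))
print(out.shape)  # (2, 1)
```

Which variant trains better depends on whether the 481 columns actually have a sequential ordering; if they are unordered features, a plain Dense network may be the better fit.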