
Keras - CNN input shape incompatible

I am working on binary classification. My code runs fine with a Keras LSTM, but when I use a CNN I get an input shape incompatibility error.

Here is the ValueError I get:

ValueError: Error when checking target: expected dense_61 to have 3 dimensions, but got array with shape (24, 1)

Here is my CNN code using Keras:

from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Dense

model = Sequential()
# Reshape the flat (24, 30) input to (samples, timesteps, channels)
inputBatch = inputBatch.reshape(24, 30, 1)
model.add(Conv1D(64, 3, activation='relu', input_shape=(30, 1)))
model.add(Conv1D(64, 3, activation='relu'))
model.add(MaxPooling1D(pool_size=4, strides=None, padding='valid'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(inputBatch, ponlabel, batch_size=24, epochs=20,
          validation_data=(inputBatch, ponlabel))

I am working on binary classification; the output will be either positive or negative.

For reference, here is my LSTM code:

from keras.models import Sequential
from keras.layers import LSTM, Dense

inputBatch = inputBatch.reshape(24, 30, 1)
model = Sequential()
model.add(LSTM(50, input_shape=(30, 1)))
model.add(Dense(1, activation="relu"))
model.compile(loss='mean_absolute_error', optimizer='adam')
model.fit(inputBatch, ponlabel, batch_size=24, epochs=100, verbose=1)

inputBatch looks like the following. It works with the LSTM code but not with the CNN; this is the input I use to train both models:

[[    0.  1288.  1288.  2214. 11266.  6923.   420.     0.     0.  8123.
      0.  7619.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.     0.     0.     0.     0.     0. 11516.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  9929. 11501.  6573. 11266.  7566.  9963.  4420. 10936.  3657.
   7050.     0.   408. 11501.  9988.  9963.  8455.  2879.  9322.  2047.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 11956.  5222.     0.     0. 12106.  6481.     0.  7093. 13756.
  12152.     0.     0.     0.     0. 10173.     0.  5173. 13756.  9371.
      0.  9956.     0.     0.  9716.     0.     0.     0.     0.     0.]
 [    0.     0.   420.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 11501.  1916.  2073. 10936.  6312.     0. 10193. 10322.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  2879.  7852. 11501.  1934.   286. 11483.     0. 12004. 11118.
      0. 12007.  9917. 12111.  1520. 10364.     0.  8840.  4195.  2910.
  10773. 11386. 12117.  9321.     0.     0.     0.     0.     0.     0.]
 [    0.  7885.  7171.  1034. 11501.  3103.  5842.  4395. 11871.  3328.
   6719.  5407.  1087.  8935.  2937.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  8894.   450. 11516.  7353. 11501. 11502. 11499.     0.  1319.
  11693. 11501.  5735. 12111.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  1087.  9565.    23.     0.  3045.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  5015. 11501.  3306. 12111.  9307.  5050. 11501.  3306.     0.
   3306. 12111.  1981. 11516.   615. 11516.     0.  3925. 11956.  9371.
   9013.  4395. 12111.  5048.     0.  3925.     0.     0.     0.     0.]
 [    0.  1287.   420.  4070. 11087.  7410. 12186.  2387. 12111.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.   128.  2073. 10936.  6312.     0. 10193. 10322.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 10173.  9435.  1320.  9322. 12018.  1055.  8840.  6684. 12051.
   2879.     0. 12018.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  1570.  5466.  9322.    34. 11480.  1356. 11270.   420.  2153.
  12006.  5157.  8840.  1055. 11516.  7387.  2356.  2163.  2879.  5541.
   9443.  7441.  1295.  5473.     0.     0.     0.     0.     0.     0.]
 [    0.  5014.     0.     0.  3651.  1087.    63.  6153.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 10608. 10855.  9562.     0.     0.     0.  4202.     0.     0.
      0. 10818. 10818.  5842.     0.  9963.     0. 11516. 10464.  7491.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  5952.  6133.   450.  7520.  5842.  3412. 10400.  3412.  2149.
   4891.  2979.  3456.   505.  9929. 11501.  9322.  1836. 11501. 12111.
   3435. 11105. 11266.   420.  9322.    34.     0.     0.     0.     0.]
 [    0.  1570.  5466.  9322.    34. 11480.  1356. 11270.   420.  2153.
  12006.  5157.  8840.  1055. 11516.  7387.  2356.  2163.  2879.  5541.
   9443.  7441.  1295.  5473.     0.     0.     0.     0.     0.     0.]
 [    0.  7544.     0.  1709.   420. 10936.  5222.  5842. 10407.  6937.
  11329.  2937.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  7785.  8840.     0.   420.  8603. 12003.  2879.  1087.  2356.
   2390. 12111.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  8695.  8744.   420.  8840.  6697.  9267. 11516. 11203.  2260.
   8840.  7309.     0. 11100.  6041.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  9307. 12003.  2879.  6398.  9372.  4614.  5222.     0.     0.
   2879. 10364.  6923.  4709.  4860. 11871.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.     0.  2844.  1287.   420. 11501.   610. 11501.   596.     0.
  12111.  3690.  6343.  9963.     0.     0.  8840.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]]

The problem is the output shape. Because you are using a CNN, the output is 3D (samples, width, channels), and the Dense layer operates on the last dimension, giving you a 3D output. But you need a 2D output, so you need to add a Flatten layer:

from keras.layers import Flatten

model = Sequential()
model.add(Conv1D(64, 3, activation='relu', input_shape=(30, 1)))
model.add(Conv1D(64, 3, activation='relu'))
model.add(MaxPooling1D(pool_size=4, strides=None, padding='valid'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Dropout(0.5))
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))

You can compare the output shapes of this model and your original one by calling model.summary().
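As a minimal sketch of that comparison, you can also loop over the layers and print each output shape. The shapes annotated below are worked out from the kernel and pool sizes used above ('valid' padding, default strides); the layer names are the auto-generated ones Keras typically assigns and may differ in your run:

# After building the corrected model above:
for layer in model.layers:
    print(layer.name, layer.output_shape)

# Expected shapes with Flatten (None is the batch dimension):
#   conv1d_1         (None, 28, 64)
#   conv1d_2         (None, 26, 64)
#   max_pooling1d_1  (None, 6, 64)
#   conv1d_3         (None, 4, 128)
#   conv1d_4         (None, 2, 128)
#   dropout_1        (None, 2, 128)
#   flatten_1        (None, 256)
#   dense_1          (None, 1)

Without the Flatten layer, the final Dense layer would produce (None, 2, 1), a 3D output, which is exactly why the error message says the target was expected to have 3 dimensions.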

This kind of error can also appear when the input images are of different sizes.

Please add more information (I can't comment on this since I don't have enough rep), ideally including the full stack trace.

The input shape incompatibility comes from ponlabel. For the LSTM its shape is (24, 1). But the CNN uses binary_crossentropy for the loss, so it will have two target classes. This means that for the CNN, ponlabel must have shape (24, 2, 1).

You need to use MSE or categorical cross-entropy for the CNN's loss.
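As a minimal sketch of the categorical cross-entropy alternative (the label values and layer sizes here are illustrative, not from the original post): one-hot encode the labels with keras.utils.to_categorical, which gives them shape (24, 2), and pair that with a two-unit softmax output. Note that with a Flatten layer before the output, 2D one-hot labels of shape (24, 2) are what the model expects:

import numpy as np
from keras.utils import to_categorical
from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Flatten, Dense

# Hypothetical labels for illustration: 24 samples, each 0 (negative) or 1 (positive)
ponlabel = np.random.randint(0, 2, size=(24, 1))
onehot = to_categorical(ponlabel, num_classes=2)  # shape (24, 2)

model = Sequential()
model.add(Conv1D(64, 3, activation='relu', input_shape=(30, 1)))
model.add(MaxPooling1D(pool_size=4))
model.add(Flatten())
model.add(Dense(2, activation='softmax'))  # one output unit per class
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])
# model.fit(inputBatch, onehot, batch_size=24, epochs=20)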
