"ValueError: Shapes (None, 1) and (None, 32) are incompatible" when training image classification network in TensorFlow using categorical_crossentropy
I am trying to train a machine learning model to classify images, but I run into problems when I try to use the categorical_crossentropy loss function.
Here is the code I use to build the model.
import numpy as np
import os
import PIL
import PIL.Image
import tensorflow as tf
import pathlib
import glob
import matplotlib.pyplot as plt
from tensorflow.keras import layers
from tensorflow.keras import callbacks
from tensorflow import keras
from datetime import datetime
import tensorboard

if __name__ == "__main__":
    # This first section mostly follows the tutorial at https://www.tensorflow.org/tutorials/images/classification
    data_dir = "img_directories"
    image_count = len(list(glob.glob(f'{data_dir}/*/*.png')))
    print(image_count)

    batch_size = 128
    img_height = 100
    img_width = 100

    # Set up training data
    val_split = 0.2
    train_ds = tf.keras.preprocessing.image_dataset_from_directory(
        data_dir,
        validation_split=val_split,
        subset="training",
        seed=123,
        image_size=(img_height, img_width),
        batch_size=batch_size)

    # Set up testing data
    val_ds = tf.keras.preprocessing.image_dataset_from_directory(
        data_dir,
        validation_split=val_split,
        subset="validation",
        seed=123,
        image_size=(img_height, img_width),
        batch_size=batch_size,
        color_mode='rgb')

    class_names = train_ds.class_names
    print(class_names)
    num_classes = len(train_ds.class_names)

    # Normalize data
    normalization_layer = layers.experimental.preprocessing.Rescaling(1./255)
    normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
    image_batch, labels_batch = next(iter(normalized_ds))

    # Set up model
    model = tf.keras.Sequential()
    # model.add(layers.experimental.preprocessing.Rescaling((1./255), input_shape=(100, 100, 3)))
    model.add(layers.Conv2D(64, (3, 3), activation='relu', input_shape=(img_height, img_width, 3)))
    model.add(layers.MaxPooling2D(pool_size=(2, 2)))
    model.add(layers.Dropout(0.2))
    model.add(layers.Conv2D(64, (5, 5)))
    model.add(layers.MaxPooling2D(pool_size=(3, 3)))
    model.add(layers.Dense(64))
    model.add(layers.Flatten())
    model.add(layers.Dense(num_classes, activation='softmax'))

    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    model.summary()

    earlystopping = callbacks.EarlyStopping(monitor="val_loss",
                                            mode="min", patience=7,
                                            restore_best_weights=True)

    history = model.fit(
        normalized_ds,
        validation_data=val_ds,
        epochs=100,
        callbacks=[earlystopping]
    )
Setting the loss function to categorical_crossentropy gives me the following error:

ValueError: Shapes (None, 1) and (None, 32) are incompatible

where 32 is the number of classes in my dataset, so there seems to be a problem with my output layer.

However, it runs without issues when I use sparse_categorical_crossentropy instead.

Since I have relatively few classes, how can I get it to work with categorical_crossentropy?
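[Editor's note: the mismatch comes from the label format, not the output layer. image_dataset_from_directory defaults to label_mode="int", so each batch carries integer labels (which Keras compares as shape (None, 1)), while categorical_crossentropy expects one-hot vectors of shape (None, num_classes), here (None, 32); passing label_mode='categorical' to the loader is the documented way to get one-hot labels. The numpy-only sketch below (illustrative stand-in values, not the asker's data) shows the shape difference and that the two losses agree once labels are one-hot:]

```python
import numpy as np

num_classes = 32
batch = 4
rng = np.random.default_rng(0)

# Integer labels, as produced by image_dataset_from_directory's
# default label_mode="int" -- shape (batch,).
int_labels = rng.integers(0, num_classes, size=batch)

# One-hot labels, as categorical_crossentropy expects -- shape (batch, 32).
one_hot = np.eye(num_classes)[int_labels]

# Stand-in softmax output from the final Dense(num_classes) layer.
raw = rng.random((batch, num_classes))
probs = raw / raw.sum(axis=1, keepdims=True)

# categorical_crossentropy on one-hot labels...
cce = -np.sum(one_hot * np.log(probs), axis=1)
# ...matches sparse_categorical_crossentropy on the integer labels.
scce = -np.log(probs[np.arange(batch), int_labels])

print(np.allclose(cce, scce))  # True
```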
Edit:

I have tried something like the following, but I still get an error similar to the original one.
val_imgs, val_labels = next(iter(val_ds))
val_labels_one_hot = tf.one_hot(labels_batch, num_classes)

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()

earlystopping = callbacks.EarlyStopping(monitor="val_loss",
                                        mode="min", patience=7,
                                        restore_best_weights=True)

history = model.fit(
    train_ds,
    validation_data=[val_imgs, val_labels_one_hot],
    epochs=100,
    callbacks=[earlystopping]
)
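[Editor's note: the attempt above one-hot-encodes only a single eagerly drawn validation batch (and passes labels_batch, drawn from the training set, to tf.one_hot rather than val_labels), while model.fit still receives train_ds with integer labels, so the original shape error persists on the training side. One common pattern is to map the one-hot transform over every batch of both datasets, e.g. ds.map(lambda x, y: (x, tf.one_hot(y, num_classes))). A numpy-only sketch of that per-batch mapping, using a fabricated stand-in for the batched (images, labels) pairs:]

```python
import numpy as np

num_classes = 32

def to_one_hot(labels, depth):
    """Mimic tf.one_hot for a 1-D batch of integer labels."""
    return np.eye(depth, dtype=np.float32)[labels]

# Stand-in for one batch of (images, integer-labels) from the loader.
fake_batches = [(np.zeros((8, 100, 100, 3), dtype=np.float32),
                 np.array([0, 3, 31, 5, 1, 2, 7, 9]))]

# Apply the same transform to every batch -- analogous to calling
# ds.map(...) on BOTH train_ds and val_ds, not on one drawn batch.
mapped = [(x, to_one_hot(y, num_classes)) for x, y in fake_batches]

x0, y0 = mapped[0]
print(y0.shape)  # (8, 32)
```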