
Logits and labels dimension error (but no variable like that) in model.fit VGG19 TensorFlow code

I am using transfer learning to build a model with three categories. I do not know why I am getting an error about logits and labels. This is my code:

import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator

baseModel = tf.keras.applications.VGG19(input_shape=(128,128,3), include_top=False, weights='imagenet')
baseModel.trainable = False
labels = ['glass', 'paper', 'plastic']
trainGenerator = ImageDataGenerator(preprocessing_function=tf.keras.applications.vgg19.preprocess_input, rescale=(1/255.0)) \
    .flow_from_directory(directory=trainDir, target_size=(128,128), classes=['glass', 'paper', 'plastic'], batch_size=10)
testGenerator = ImageDataGenerator(preprocessing_function=tf.keras.applications.vgg19.preprocess_input, rescale=(1/255.0)) \
    .flow_from_directory(directory=testDir, target_size=(128,128), classes=['glass', 'paper', 'plastic'], batch_size=10)
validGenerator = ImageDataGenerator(preprocessing_function=tf.keras.applications.vgg19.preprocess_input, rescale=(1/255.0)) \
    .flow_from_directory(directory=validDir, target_size=(128,128), classes=['glass', 'paper', 'plastic'], batch_size=10)
images, label = next(trainGenerator)
model = Sequential()
model.add(Input(shape=(128,128,3)))
model.add(baseModel)
model.compile(optimizer=Adam(learning_rate=0.0001),
              loss='sparse_categorical_crossentropy',
              metrics=['sparse_categorical_accuracy'])
history = model.fit(trainGenerator,
                    epochs=20,
                    shuffle=True,
                    validation_data=validGenerator)

This is the error I am getting:

InvalidArgumentError:  logits and labels must have the same first dimension, got logits shape [160,3] and labels shape [30]
     [[node sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits
 (defined at C:\Users\ugouc\anaconda3\lib\site-packages\keras\backend.py:5114)
]] [Op:__inference_train_function_4832]

Errors may have originated from an input operation.
Input Source operations connected to node sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits:
In[0] sparse_categorical_crossentropy/Reshape_1 (defined at C:\Users\ugouc\anaconda3\lib\site-packages\keras\backend.py:5109)   
In[1] sparse_categorical_crossentropy/Reshape (defined at C:\Users\ugouc\anaconda3\lib\site-packages\keras\backend.py:3561)

When I try to add more layers (e.g. a Flatten layer and a Dense layer with relu), I get an error saying it cannot squeeze a dimension of 3 down to 1. Please help.

Remove this line:

model.add(Input(shape=(128,128,3)))

You have already specified the input shape in the VGG19 model code. In the VGG19 call, change to:

baseModel = tf.keras.applications.VGG19(input_shape=(128,128,3), include_top=False, weights='imagenet', pooling='max')

This makes the output of the base model a vector that can be used as input to a dense layer. Then, after baseModel, add this code:

from tensorflow.keras import regularizers
from tensorflow.keras.layers import BatchNormalization, Dense, Dropout
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adamax

x = baseModel.output
x = BatchNormalization(axis=-1, momentum=0.99, epsilon=0.001)(x)
x = Dense(256, kernel_regularizer=regularizers.l2(0.016),
          activity_regularizer=regularizers.l1(0.006),
          bias_regularizer=regularizers.l1(0.006), activation='relu')(x)
x = Dropout(rate=.4, seed=123)(x)
output = Dense(3, activation='softmax')(x)
model = Model(inputs=baseModel.input, outputs=output)
lr = .001  # start with this learning rate
model.compile(Adamax(learning_rate=lr), loss='categorical_crossentropy', metrics=['accuracy'])
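As a sanity check on the pooling='max' change: with a 128x128 input, VGG19's last convolutional block produces a 4x4x512 feature map, and pooling='max' collapses the two spatial axes into a single 512-vector per image. The same reduction can be sketched with NumPy (the array here is random data standing in for real features, so no TensorFlow install is needed to try it):

```python
import numpy as np

# Stand-in for VGG19's final feature map on a batch of 10 images at 128x128:
# shape (batch, height, width, channels) = (10, 4, 4, 512)
feature_map = np.random.rand(10, 4, 4, 512)

# pooling='max' applies global max pooling over the two spatial axes,
# leaving one 512-dimensional vector per image
pooled = feature_map.max(axis=(1, 2))
print(pooled.shape)  # (10, 512)
```

A vector of this shape can feed a Dense layer directly, which is why the Flatten layer is no longer needed.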

Also, in your test generator, set shuffle=False.
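One more note on the switch from sparse_categorical_crossentropy to categorical_crossentropy: flow_from_directory's default class_mode='categorical' yields one-hot labels of shape (batch, 3), which is what categorical_crossentropy expects, while sparse_categorical_crossentropy expects integer class indices of shape (batch,). That mismatch is consistent with the reported "logits and labels must have the same first dimension" error. A quick NumPy sketch of the two label formats (a hypothetical batch of three images):

```python
import numpy as np

# One-hot labels, as flow_from_directory's default class_mode='categorical'
# produces for classes ['glass', 'paper', 'plastic']
onehot = np.eye(3)[[0, 2, 1]]   # a glass, a plastic, and a paper image
print(onehot.shape)             # (3, 3)

# Integer labels, which sparse_categorical_crossentropy expects instead
sparse = onehot.argmax(axis=1)
print(sparse)                   # [0 2 1]
```

So either keep the generators as they are and use categorical_crossentropy, or pass class_mode='sparse' to flow_from_directory and keep the sparse loss; mixing the two formats is what triggers the error.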
