
ValueError in the training model output using NN

I am trying to do video classification. But when I use softmax with categorical_crossentropy, I get the error ValueError: Shapes (None, 2) and (None, 101) are incompatible

I saw another solution to this problem, so I changed softmax to sigmoid and categorical to binary cross-entropy. Now I get this error instead:

ValueError: logits and labels must have the same shape ((None, 101) vs (None, 2))

I am fairly new to computer vision and deep learning, so I cannot spot the mistake right away. Can someone help? I am using Google Colab and Python 3.

Here is the code:

# Imports assumed by the snippet (not shown in the original post)
from keras.applications.vgg16 import VGG16
from keras.models import Sequential
from keras.layers import Dense, Dropout

base_model = VGG16(weights='imagenet', include_top=False)

# Extract convolutional features for the train and test frames
X_train = base_model.predict(X_train)
X_train.shape   # (2828, 7, 7, 512)

X_test = base_model.predict(X_test)
X_test.shape    # (707, 7, 7, 512)

# Flatten the feature maps
X_train = X_train.reshape(2828, 7*7*512)
X_test = X_test.reshape(707, 7*7*512)
X_train.shape   # (2828, 25088)

# Scale by the training maximum (note: this shadows the built-in max)
max = X_train.max()
X_train = X_train/max
X_test = X_test/max

max             # 10.2

X_train.shape   # (2828, 25088)
y_train.shape   # (2828, 2)

model = Sequential()
model.add(Dense(1024, activation='relu', input_shape=(25088,)))
model.add(Dropout(0.5))
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(256, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(101, activation='sigmoid'))

from keras.callbacks import ModelCheckpoint
mcp_save = ModelCheckpoint('weight.hdf5', save_best_only=True, monitor='val_loss', mode='min')

model.compile(loss='binary_crossentropy', optimizer='Adam', metrics=['accuracy'])

model.fit(X_train, y_train, epochs=200, validation_data=(X_test, y_test), callbacks=[mcp_save], batch_size=128)

Error output:

Epoch 1/200
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-142-fab3445f94df> in <module>()
----> 1 model.fit(X_train, y_train, epochs=200, validation_data=(X_test, y_test), callbacks=[mcp_save], batch_size=128)

9 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/func_graph.py in wrapper(*args, **kwargs)
    975           except Exception as e:  # pylint:disable=broad-except
    976             if hasattr(e, "ag_error_metadata"):
--> 977               raise e.ag_error_metadata.to_exception(e)
    978             else:
    979               raise

ValueError: in user code:

    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py:805 train_function  *
        return step_function(self, iterator)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py:795 step_function  **
        outputs = model.distribute_strategy.run(run_step, args=(data,))
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/distribute/distribute_lib.py:1259 run
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/distribute/distribute_lib.py:2730 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/distribute/distribute_lib.py:3417 _call_for_each_replica
        return fn(*args, **kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py:788 run_step  **
        outputs = model.train_step(data)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py:756 train_step
        y, y_pred, sample_weight, regularization_losses=self.losses)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/compile_utils.py:203 __call__
        loss_value = loss_obj(y_t, y_p, sample_weight=sw)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/losses.py:152 __call__
        losses = call_fn(y_true, y_pred)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/losses.py:256 call  **
        return ag_fn(y_true, y_pred, **self._fn_kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/util/dispatch.py:201 wrapper
        return target(*args, **kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/losses.py:1608 binary_crossentropy
        K.binary_crossentropy(y_true, y_pred, from_logits=from_logits), axis=-1)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/util/dispatch.py:201 wrapper
        return target(*args, **kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/backend.py:4979 binary_crossentropy
        return nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/util/dispatch.py:201 wrapper
        return target(*args, **kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/nn_impl.py:174 sigmoid_cross_entropy_with_logits
        (logits.get_shape(), labels.get_shape()))

    ValueError: logits and labels must have the same shape ((None, 101) vs (None, 2))

Is your problem multi-class classification or binary classification?

If multi-class:

Use model.add(Dense(101, activation='softmax')) as the last layer; softmax is the activation function for multi-class classification. Also, change loss='binary_crossentropy' to loss='categorical_crossentropy'. Note that your y_train has shape (2828, 2), so the labels also need to be one-hot encoded over 101 classes for the shapes to match.
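A minimal sketch of the shape logic behind this fix, using NumPy only (the 101-class integer labels here are an assumption for illustration; your posted y_train is one-hot over only 2 classes):

```python
import numpy as np

num_classes = 101  # a UCF101-sized label space (assumption)

# Suppose the raw labels are integer class ids in [0, 101)
y_int = np.array([0, 5, 100, 42])

# One-hot encode them, as categorical_crossentropy expects
y_onehot = np.eye(num_classes)[y_int]          # shape (4, 101)

# A softmax layer with 101 units produces predictions of shape (batch, 101)
logits = np.random.randn(len(y_int), num_classes)
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# The loss is only well-defined when both shapes agree
assert y_onehot.shape == probs.shape           # (4, 101) == (4, 101)
```

Once the labels are one-hot encoded to width 101, the (None, 2) vs (None, 101) mismatch from the traceback disappears.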

If binary classification:

Use model.add(Dense(2, activation='softmax')) or model.add(Dense(1, activation='sigmoid')) as the last layer (both work fine).

The softmax activation outputs a probability per class, e.g. [0.3, 0.7]. Applying argmax to that output gives the predicted class: here it returns index 1, i.e. the sample is predicted to belong to the second class.
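The decoding step above can be checked directly with NumPy:

```python
import numpy as np

probs = np.array([0.3, 0.7])    # softmax output for one sample
pred = int(np.argmax(probs))    # index of the largest probability
print(pred)                     # 1, i.e. the second class
```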

The number of units in the last layer must match the output dimension of your labels.
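A quick sanity check along these lines, run before calling model.fit, catches the mismatch early. labels_match_output is a hypothetical helper, and the NumPy array stands in for the real y_train (in Keras the unit count would come from model.layers[-1].units):

```python
import numpy as np

def labels_match_output(y, units):
    """True when a 2-D label array's width equals the last layer's unit count."""
    return y.ndim == 2 and y.shape[1] == units

# The question's setup: 101 output units vs labels of shape (2828, 2)
print(labels_match_output(np.zeros((2828, 2)), 101))    # False -> the reported ValueError
print(labels_match_output(np.zeros((2828, 101)), 101))  # True  -> shapes agree
```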

Ask if you need clarification.
