
Binary semantic segmentation with Deeplabv3+ Keras (designed for multiclass semantic segmentation)

I am new to Keras, so sorry if the question is silly.

I found the deeplabv3+ model for multiclass semantic segmentation here: https://keras.io/examples/vision/deeplabv3_plus/. I need to adapt this code to another purpose, because I need to perform binary semantic segmentation on medical images. Is it correct to change from

NUM_CLASSES = 20

to NUM_CLASSES = 1?

If I put NUM_CLASSES = 2, I get an error about a mismatch between logits and labels.

About the loss function, the relevant line is loss = keras.losses.SparseCategoricalCrossentropy(from_logits=True)

I thought of changing it to

loss = keras.losses.BinaryCrossentropy(from_logits=True)

but the loss becomes negative. Should I add something else?
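One likely cause of the negative loss (an assumption about the data, not something stated in the post): BinaryCrossentropy expects labels in [0, 1], and mask pixels stored as 255 drive the per-pixel loss below zero. A minimal sketch of the math:

```python
import math

def bce(y, p):
    """Per-pixel binary cross-entropy: -(y*log(p) + (1-y)*log(1-p))."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# A valid label in {0, 1} always gives a non-negative loss:
print(bce(1.0, 0.9))    # ≈ 0.105

# A label of 255 (an un-normalized mask pixel) makes the loss negative:
print(bce(255.0, 0.9))  # large negative value
```

So before suspecting the loss function itself, it is worth checking that the ground-truth masks are scaled to {0, 1}.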

Thank you!

Edit: the deeplabv3+ for multiclass semantic segmentation uses keras.activations.linear(x) in the last layer. For my purpose, should I use softmax instead of keras.activations.linear(x) with BinaryCrossentropy and set from_logits=False?
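For a single-channel binary head, sigmoid (not softmax) is the matching activation, and the two variants are equivalent: keep the linear last layer with from_logits=True, or apply sigmoid and set from_logits=False. A small sketch in plain Python (with a hypothetical raw output value) showing that both formulations compute the same loss:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce_from_probs(y, p):
    # from_logits=False: p is the sigmoid output of the last layer
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def bce_from_logits(y, z):
    # from_logits=True: z is the raw (linear) output of the last layer
    return bce_from_probs(y, sigmoid(z))

z = 1.7  # hypothetical raw value from the final Conv2D layer
y = 1.0  # ground-truth label for this pixel
assert abs(bce_from_probs(y, sigmoid(z)) - bce_from_logits(y, z)) < 1e-9
```

In Keras, from_logits=True is generally the numerically safer choice, since the sigmoid and the log are then fused inside the loss.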

I solved the problem; if anyone needs the answer: "... for binary segmentation, it's preferable to keep NUM_CLASS = 1 since you're trying to predict a binary mask that represents a single class against the background. If you wish to predict a one-hot-encoded segmentation mask (in the current context, setting NUM_CLASS = 2), only then use softmax activation along with sparse categorical cross-entropy loss, otherwise use sigmoid activation along with binary cross-entropy."

https://github.com/keras-team/keras-io/issues/648

In the mask files, do you use the color [255, 255, 255] for the mask, or [1, 1, 1]?
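If the masks are stored with foreground as 255 (an assumption; the comment above asks exactly this), they need to be normalized to {0, 1} before training with BinaryCrossentropy. A minimal sketch using NumPy:

```python
import numpy as np

# Hypothetical mask loaded from a PNG where foreground pixels are 255
mask = np.array([[0, 255],
                 [255, 0]], dtype=np.uint8)

# Threshold to {0, 1} so the labels are valid binary targets
binary_mask = (mask > 127).astype(np.float32)
print(binary_mask)  # [[0. 1.], [1. 0.]]
```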

