
Keras: Use categorical_crossentropy without one-hot encoded array of targets

I have a Keras model that I'm using for a multi-class classification problem. I'm doing this:

model.compile(
    loss='categorical_crossentropy',
    optimizer='adam',
    metrics=['accuracy'],
)

I currently have ~100 features and there are ~2000 possible classes. One-hot encoding the class is leading to memory issues.
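For a sense of scale, here is a rough sketch of that blow-up (the sample count is an assumption; the question does not state it):

import numpy as np

num_samples = 1_000_000   # hypothetical dataset size -- not stated in the question
num_classes = 2000        # ~2000 possible classes, as in the question

# Integer-encoded targets: one int32 per sample
y_int = np.random.randint(0, num_classes, size=num_samples).astype(np.int32)
print(y_int.nbytes / 1e6, "MB")   # ~4 MB

# A one-hot target matrix would be (num_samples, num_classes) of float32:
onehot_bytes = num_samples * num_classes * np.dtype(np.float32).itemsize
print(onehot_bytes / 1e9, "GB")   # ~8 GB -- the memory blow-up described above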

Is it possible to use categorical_crossentropy with this Keras model while not one-hot encoding the class labels? E.g. instead of having a target look like:

[0, 0, 0, 1, 0, 0, ...]

It would just be:

3

I looked at the source for categorical_crossentropy in Keras and it assumes two tensors of the same shape. Is there a way to get around this and use the approach I described?

Thanks!

If your targets are one-hot encoded, use categorical_crossentropy. Examples of one-hot encodings:

[1,0,0]
[0,1,0]
[0,0,1]

However, if your targets are integers, use sparse_categorical_crossentropy (see the sketch after the examples). Examples of integer encodings:

1
2
3
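A minimal sketch of what this change looks like with the compile call from the question (the layer sizes and random data are placeholders, not the original poster's model):

import numpy as np
from tensorflow import keras

num_features = 100   # ~100 features, as described in the question
num_classes = 2000   # ~2000 possible classes

model = keras.Sequential([
    keras.Input(shape=(num_features,)),
    keras.layers.Dense(256, activation='relu'),
    keras.layers.Dense(num_classes, activation='softmax'),
])

model.compile(
    loss='sparse_categorical_crossentropy',  # integer targets, no one-hot encoding needed
    optimizer='adam',
    metrics=['accuracy'],
)

# Targets are plain class indices (e.g. 3), one integer per sample.
x = np.random.rand(32, num_features).astype('float32')
y = np.random.randint(0, num_classes, size=(32,))
model.fit(x, y, epochs=1, verbose=0)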

Could you post the rest of your code? By my understanding, when using categorical crossentropy as the loss function, the last layer should use a softmax activation function, yielding for each output neuron the probability of the input belonging to that neuron's class, rather than directly the one-hot vector. The categorical crossentropy is then calculated as

loss = -∑_i y_i log(p_i)

where the p_i are these probabilities and the y_i are the entries of the one-hot target. By just outputting the class you wouldn't have access to these probabilities and thus wouldn't be able to compute the categorical crossentropy.
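For concreteness, a small NumPy sketch of that per-sample computation (the probability values are made up):

import numpy as np

# Softmax output for one sample over four classes (made-up probabilities)
p = np.array([0.1, 0.2, 0.6, 0.1])

true_class = 2                          # integer label
y_onehot = np.array([0., 0., 1., 0.])   # the same label, one-hot encoded

# categorical_crossentropy with a one-hot target: -sum_i y_i * log(p_i)
cce = -np.sum(y_onehot * np.log(p))

# sparse_categorical_crossentropy with an integer target: -log(p[true_class])
scce = -np.log(p[true_class])

print(cce, scce)   # both ~0.51: the losses agree, only the target format differs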

