
Custom Keras metric functions (Recall, Precision, F1 Score) prevent loading of an H5 model

I am using the following custom metrics for Keras:

from keras import backend as K

def mcor(y_true, y_pred):
    # Matthews correlation coefficient
    y_pred_pos = K.round(K.clip(y_pred, 0, 1))
    y_pred_neg = 1 - y_pred_pos
    y_pos = K.round(K.clip(y_true, 0, 1))
    y_neg = 1 - y_pos
    tp = K.sum(y_pos * y_pred_pos)
    tn = K.sum(y_neg * y_pred_neg)
    fp = K.sum(y_neg * y_pred_pos)
    fn = K.sum(y_pos * y_pred_neg)
    numerator = (tp * tn - fp * fn)
    denominator = K.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return numerator / (denominator + K.epsilon())


def precision(y_true, y_pred):
    """Precision metric.

    Only computes a batch-wise average of precision.

    Computes the precision, a metric for multi-label classification of
    how many selected items are relevant.
    """
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    precision = true_positives / (predicted_positives + K.epsilon())
    return precision


def recall(y_true, y_pred):
    """Recall metric.

    Only computes a batch-wise average of recall.

    Computes the recall, a metric for multi-label classification of
    how many relevant items are selected.
    """
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
    recall = true_positives / (possible_positives + K.epsilon())
    return recall


def f1(y_true, y_pred):
    def recall(y_true, y_pred):
        """Recall metric.

        Only computes a batch-wise average of recall.

        Computes the recall, a metric for multi-label classification of
        how many relevant items are selected.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall = true_positives / (possible_positives + K.epsilon())
        return recall

    def precision(y_true, y_pred):
        """Precision metric.

        Only computes a batch-wise average of precision.

        Computes the precision, a metric for multi-label classification of
        how many selected items are relevant.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        precision = true_positives / (predicted_positives + K.epsilon())
        return precision
    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    return 2*((precision*recall)/(precision+recall+K.epsilon()))

Here is the compile statement:

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy', precision, recall, f1])

The Keras model is automatically saved as the best model via ModelCheckpoint. The class labels are one-hot encoded.
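For context, a minimal sketch of what such a checkpoint setup typically looks like (the monitored quantity 'val_f1' and the fit arguments are assumptions, not taken from the question; the file name reuses the one loaded below):

from keras.callbacks import ModelCheckpoint

# Keep only the best model seen so far; monitoring the custom 'val_f1'
# metric is an assumption -- any logged metric name works here.
checkpoint = ModelCheckpoint('Asset_3_best_model.h5',
                             monitor='val_f1',
                             mode='max',
                             save_best_only=True,
                             verbose=1)

model.fit(X_train, y_train,                  # placeholder training data
          validation_data=(X_val, y_val),    # placeholder validation data
          epochs=10,
          callbacks=[checkpoint])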

However, when the saved model is loaded back with:

# load model
from keras.models import load_model

custom_obj = {'accuracy':accuracy, 'Loss':Loss, 'precision':precision, 'recall':recall, 'f1':f1}
model = load_model('Asset_3_best_model.h5', custom_objects=custom_obj)

The custom objects listed here are the custom Keras functions defined earlier.

When the saved model is loaded back into memory, I see the following error:

ValueError: ('Could not interpret metric function identifier:', 0.8701059222221375)

I have tried many different custom functions, but I cannot find a solution that lets me reload my saved model. This is a multi-class time-series problem, and I would like to know whether there is a simpler way to handle this metric calculation.

I was also struggling to find a way to compute the F1 score for my binary classification problem. I came across a TensorFlow tutorial that worked for me: https://www.tensorflow.org/tutorials/structured_data/imbalanced_data

It is not a custom implementation, though; it uses the built-in metric classes directly.

METRICS = [
      keras.metrics.TruePositives(name='tp'),
      keras.metrics.FalsePositives(name='fp'),
      keras.metrics.TrueNegatives(name='tn'),
      keras.metrics.FalseNegatives(name='fn'), 
      keras.metrics.BinaryAccuracy(name='accuracy'),
      keras.metrics.Precision(name='precision'),
      keras.metrics.Recall(name='recall'),
      keras.metrics.AUC(name='auc'),
]

After that, you have to pass them as an argument when compiling:

model.compile(...,metrics=METRICS)
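For completeness, a hypothetical full compile call in the spirit of that tutorial (the optimizer, learning rate, and loss are assumptions; the answer itself only specifies metrics=METRICS):

model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),  # assumed optimizer
              loss=keras.losses.BinaryCrossentropy(),               # assumed loss for binary labels
              metrics=METRICS)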

I commented out tp, fp, tn, and fn for my code and got the output below:

Train on 2207 samples, validate on 552 samples
Epoch 1/6
 - 7s - loss: 1.2502 - accuracy: 0.6357 - precision: 0.4252 - recall: 0.0688 - auc: 0.5138 - val_loss: 0.6229 - val_accuracy: 0.6667 - val_precision: 0.8000 - val_recall: 0.0214 - val_auc: 0.6800
Epoch 2/6
 - 7s - loss: 0.6451 - accuracy: 0.6461 - precision: 0.7500 - recall: 0.0076 - auc: 0.5735 - val_loss: 0.6368 - val_accuracy: 0.6685 - val_precision: 0.8333 - val_recall: 0.0267 - val_auc: 0.7144
...

Check whether this solves your problem. Let me know if I missed anything.
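As a side note on the original loading problem, a minimal sketch, assuming the model was compiled with the built-in metric classes above: built-in keras.metrics objects are serializable, so the saved H5 model can usually be reloaded without a custom_objects dictionary, and compile=False is a fallback if restoring the stored compile configuration still fails:

from keras.models import load_model

# Built-in metrics (Precision, Recall, AUC, ...) deserialize on their own,
# so no custom_objects dictionary is needed here.
model = load_model('Asset_3_best_model.h5')

# Fallback: skip the stored compile configuration and recompile manually.
model = load_model('Asset_3_best_model.h5', compile=False)
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=METRICS)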
