
How TensorFlow graph regularization (NSL) affects triplet semi-hard loss (TFA)

I want to train a deep neural network model with a binary target using nsl.keras.GraphRegularization, as described in this tutorial. My model has a triplet semi-hard loss on an intermediate dense layer that should not be "graph regularized".

From the nsl.keras.GraphRegularization definition on GitHub:

Incorporates graph regularization into the loss of base_model.

Graph regularization is done on the logits layer and only during training.

Does this mean that the intermediate triplet semi-hard loss will not be affected by this regularization?

Yes, that's right. Graph regularization is applied only to the outputs of base_model. If your base_model uses a triplet semi-hard loss in another layer, that loss should remain unaffected. If that's not the case, please file a bug at https://github.com/tensorflow/neural-structured-learning/issues.
