
C++ TensorFlow SoftmaxCrossEntropyWithLogits returns (loss, gradients) — how to access the loss?

I am trying to implement a simple neural network in C++ TensorFlow.

I am unable to access the loss returned by the SoftmaxCrossEntropyWithLogits op ( https://www.tensorflow.org/api_docs/cc/class/tensorflow/ops/softmax-cross-entropy-with-logits ).

Please check the "Returns" section in the above link. I want to access only the "loss" output, but I cannot work out how to write this in C++.

Thanks

It doesn't return anything here; `loss` is a public attribute of the op class, so you can access it from the class instance.

// Construct the op instance with your logits (features) and labels.
// The constructor's signature is:
//   SoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope& scope,
//                                 ::tensorflow::Input features,
//                                 ::tensorflow::Input labels)
auto softmax_loss = tensorflow::ops::SoftmaxCrossEntropyWithLogits(scope, features, labels);
// Access only the loss output (a ::tensorflow::Output member):
auto loss = softmax_loss.loss;
