
Why doesn't my own expression give the same result as TensorFlow's built-in method?

I'm learning logistic regression and want to track the value of the cross-entropy loss while minimizing it via gradient descent. However, when I use TensorFlow's sigmoid_cross_entropy_with_logits function, I get a different result from the one my own expression produces.

Here is an example:

import numpy as np
import tensorflow as tf

pred = np.array([[0.2], [0.3], [0.4]])
test_y = np.array([[0.5], [0.6], [0.7]])

# TensorFlow's built-in loss
print(tf.nn.sigmoid_cross_entropy_with_logits(logits=pred, labels=test_y))
# Hand-written binary cross entropy
print(-test_y * tf.math.log(pred) - (1 - test_y) * tf.math.log(1 - pred))

The output:

tf.Tensor(
[[0.69813887]
 [0.67435524]
 [0.63301525]], shape=(3, 1), dtype=float64)
tf.Tensor(
[[0.91629073]
 [0.86505366]
 [0.7946512 ]], shape=(3, 1), dtype=float64)

Can anyone explain what's wrong here? I checked the TensorFlow documentation for this function, and it seems like it should do exactly the same thing as my expression.

You forgot to apply the sigmoid to your predictions pred before computing the cross-entropy loss. sigmoid_cross_entropy_with_logits expects raw logits and applies the sigmoid internally, so your hand-written expression must do the same:

-test_y * tf.math.log(tf.math.sigmoid(pred)) - (1-test_y) * tf.math.log(1-tf.math.sigmoid(pred))

The output now matches:

tf.Tensor(
[[0.69813887]
 [0.67435524]
 [0.63301525]], shape=(3, 1), dtype=float64)
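For reference, sigmoid_cross_entropy_with_logits does not literally compute a sigmoid followed by two logs; the TensorFlow documentation gives the algebraically equivalent, numerically stable form max(x, 0) - x*z + log(1 + exp(-|x|)) for logits x and labels z. A minimal sketch reproducing it with the same pred and test_y as above:

import numpy as np
import tensorflow as tf

pred = np.array([[0.2], [0.3], [0.4]])    # logits x
test_y = np.array([[0.5], [0.6], [0.7]])  # labels z

# Numerically stable form from the TF docs: max(x, 0) - x*z + log(1 + exp(-|x|)).
# Unlike the naive sigmoid + log version, it avoids overflow in exp() for large |x|.
stable = tf.maximum(pred, 0.0) - pred * test_y + tf.math.log1p(tf.exp(-tf.abs(pred)))
print(stable)  # same values as tf.nn.sigmoid_cross_entropy_with_logits above

This also explains why passing probabilities instead of logits gives the wrong answer: the function treats whatever it receives as raw scores and runs them through the sigmoid again.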
