
Tensorflow subtract returns wrong values

Absolute TensorFlow beginner here. For an assignment I am trying to construct two random tensors and subtract them, but I seem to be having trouble understanding how exactly the subtraction works.

import tensorflow as tf

sess = tf.Session()
x = tf.random_normal([5], seed=123456)
y = tf.random_normal([5], seed=987654)
print(sess.run(x), sess.run(y))

I get the following outputs:

[ 0.38614973  2.97522092 -0.85282576 -0.57114178 -0.43243945]
[-0.43865281  0.08617876 -2.17495966 -0.24574816 -1.94319296]

But when I try

print(sess.run(x-y))

I get

[-1.88653958 -0.03917438  0.87480474  0.40511152  0.52793759]

Now if I run

print(sess.run(tf.subtract(x,y)))

I again get values that don't match the x and y printed above (and that differ from the result of x - y):

[-1.97681355  1.10086703  1.41172433  1.55840468  0.04344697]

I hope somebody can help me out here. Thanks in advance!

This behaviour actually has to do with how the seed of your random normal works, and with how the session evaluates your nodes.

TensorFlow uses the seed of your random normal nodes when it creates them, not when it runs them:

>>> sess = tf.InteractiveSession()
>>> x = tf.random_normal([5], seed=123456)
>>> sess.run(x)
array([ 0.38614976,  2.97522116, -0.85282576, -0.57114178, -0.43243945], dtype=float32)
>>> sess.run(x)
array([-1.41140664, -0.50017333,  1.59816611,  0.07829454, -0.36143178], dtype=float32)

You can see that the values change when x is run a second time. Running sess.run(x - y) will actually run x (i.e. generate random numbers), then y (i.e. generate other random numbers), then x - y. Since you're not reinitializing the random generator with the seed before running tf.subtract(x, y), you get different results again.
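
If you want x, y, and their difference to agree with each other, fetch them in a single sess.run call: within one run each op is evaluated exactly once, so the same random draw is used everywhere. A minimal sketch, assuming TF 1.x graph mode:

import tensorflow as tf

x = tf.random_normal([5], seed=123456)
y = tf.random_normal([5], seed=987654)

with tf.Session() as sess:
    # One call evaluates x and y once each; the subtraction reuses those values.
    x_val, y_val, diff_val = sess.run([x, y, x - y])
    print(x_val)
    print(y_val)
    print(diff_val)  # equals x_val - y_val element-wise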

This problem occurs when you execute x - y multiple times, since each time x and y will be assigned different values. The reason is that when you write something like x = tf.random_normal([5], seed=123456), there isn't any actual computation yet. TensorFlow is just constructing an operation node in the static computation graph; the real computation only happens when you call sess.run().

So, think of x = tf.random_normal([5], seed=123456) as a random number generator. The first time you call sess.run(), the generator starts from the seed 123456. But by the second sess.run() call the state of the random number generator has already changed, so the values will be different.
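
To make the "no computation until sess.run()" point concrete, here is a small sketch (TF 1.x): printing the tensor object only shows a symbolic graph node, and the actual numbers appear only when the session runs it.

import tensorflow as tf

x = tf.random_normal([5], seed=123456)

# x is just a graph node here; no random numbers exist yet.
# This prints something like: Tensor("random_normal:0", shape=(5,), dtype=float32)
print(x)

with tf.Session() as sess:
    # Only now does the random_normal op actually execute.
    print(sess.run(x))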

You can verify this by running the following code:

import tensorflow as tf

x = tf.random_normal([5], seed=123456)

with tf.Session() as sess:
    print(sess.run(x))
    print(sess.run(x))
    print(sess.run(x))

The output will be

[ 0.38614973,  2.97522092, -0.85282576, -0.57114178, -0.43243945]
[-1.41140664, -0.50017339,  1.59816611,  0.07829454, -0.36143178]
[-1.10523391, -0.15264226,  1.79153454,  0.42320547,  0.26876169]
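
If the goal is for x and y to keep the same values across several sess.run() calls (so that x - y agrees with the values you printed first), one option, sketched here for TF 1.x, is to store the random draw in a tf.Variable. The random op then runs only once, during initialization, and every later sess.run() returns that same draw.

import tensorflow as tf

# The random ops run once, when the variables are initialized;
# afterwards the variables always hold that same draw.
x = tf.Variable(tf.random_normal([5], seed=123456))
y = tf.Variable(tf.random_normal([5], seed=987654))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(x))
    print(sess.run(x))      # identical to the previous line
    print(sess.run(x - y))  # consistent with the x and y printed above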
