Are new weights generated for each iteration/run in TensorFlow?

import tensorflow as tf

# x, y, error, inp and out are assumed to be defined earlier
# (placeholders, the loss tensor and the training data); summary
# ops are assumed to have been added to the graph as well.

def trainx(lr):
    # Build a gradient-descent training op for the given learning rate.
    return tf.train.GradientDescentOptimizer(lr).minimize(error)

with tf.Session() as sess:
    merge = tf.summary.merge_all()

    for lr in [0.01, 0.02, 0.03, 0.04]:  # learning rates to compare
        # Build the train op once per run, not once per training step.
        train = trainx(lr)
        # Re-initialize the variables for this run; with the default
        # random initializer this re-samples the starting weights.
        tf.global_variables_initializer().run()

        writer = tf.summary.FileWriter('4004/' + str(lr), sess.graph)

        for step in range(100):
            error_sum = sess.run(merge, {x: inp, y: out})
            writer.add_summary(error_sum, step)
            sess.run(train, {x: inp, y: out})

I'm only sharing part of the code to keep things simple. Please look at the picture below:

[Image: TensorBoard plot of the loss curves for the four learning rates]

As you can see, the red line starts at about 0.370 and the blue one at about 0.310. Does this mean the initial weights are not the same for all runs in TensorFlow? If they were, all lines would start from the same point, since gradient descent is only applied after the error/loss is computed. It looks like new weights are generated for each run, and that is not what I am looking for. How could I fix this? I would appreciate any help.
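For context, here is a minimal sketch (independent of the code above, not the asker's model) showing that in TensorFlow 1.x a variable with a random initializer gets a freshly sampled value every time the initializer op is run:

import tensorflow as tf

# A variable whose initial value is drawn from a random op.
w = tf.Variable(tf.random_normal([1]))

with tf.Session() as sess:
    for run in range(3):
        # Every call re-executes the random op behind the
        # initializer, so w starts from a different value each time.
        sess.run(tf.global_variables_initializer())
        print(run, sess.run(w))

Each pass prints a different value for w, which matches the different starting losses in the plot above.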

This is because the default initializer for the weights is random. You can make the runs reproducible by setting the random seed: https://www.tensorflow.org/api_docs/python/tf/set_random_seed

tf.set_random_seed(666)
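Note that the graph-level seed makes the random sequence repeatable across sessions, but within a single session each re-run of the initializer draws the next values from that sequence. One way to make every learning-rate run start from identical weights is to reset the graph and open a fresh session per run. A minimal sketch, assuming a toy linear model in place of the asker's network (inp, out and the model below are stand-ins, not the original code):

import tensorflow as tf
import numpy as np

# Toy data standing in for the asker's inp/out.
inp = np.array([[0.0], [1.0], [2.0], [3.0]], dtype=np.float32)
out = 2.0 * inp + 1.0

for lr in [0.01, 0.02, 0.03, 0.04]:
    # A fresh graph per run, so the seeded random sequence restarts
    # and every run begins from identical initial weights.
    tf.reset_default_graph()
    tf.set_random_seed(666)

    x = tf.placeholder(tf.float32, [None, 1])
    y = tf.placeholder(tf.float32, [None, 1])
    w = tf.Variable(tf.random_normal([1, 1]))
    b = tf.Variable(tf.zeros([1]))
    error = tf.reduce_mean(tf.square(tf.matmul(x, w) + b - y))

    train = tf.train.GradientDescentOptimizer(lr).minimize(error)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Same initial weights for every learning rate, thanks to
        # the fixed seed and the per-run graph reset.
        print(lr, 'initial w:', sess.run(w))
        for step in range(100):
            sess.run(train, {x: inp, y: out})

With this pattern the printed initial w is identical for all four learning rates, so all curves in TensorBoard start from the same loss.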
