
How does one read TensorBoard histograms for a 1D example in TensorFlow?

I made the simplest possible 1D example for TensorBoard (tracking the minimum of a quadratic), but the plots I get make no sense to me and I can't tell why. Is it my own implementation, or is TensorBoard buggy?

Here are the plots:

The histogram:

[histogram screenshot]

Normally I think of a histogram as a bar chart encoding a probability distribution (or frequency counts). I assume the y-axis shows the values and the x-axis the counts? Since my number of steps is 120, that seemed like a reasonable guess.

And the scalar plot:

[scalar plot screenshot]

為什么會有一條奇怪的線穿過我的地塊?

The code that produced it (you should be able to copy, paste, and run it):

## run this script to collect summaries: python playground.py --logdir=/tmp/playground_tmp
## then launch the board: tensorboard --logdir=/tmp/playground_tmp
## and view it in a browser at: http://localhost:6006/

import tensorflow as tf

# x variable
x = tf.Variable(10.0,name='x')
# b placeholder (simulates the "data" part of the training)
b = tf.placeholder(tf.float32)
# make model (1/2)(x-b)^2
xx_b = 0.5*tf.pow(x-b,2)
y=xx_b

learning_rate = 1.0
# get optimizer
opt = tf.train.GradientDescentOptimizer(learning_rate)
# gradient variable list = [ (gradient,variable) ]
gv = opt.compute_gradients(y,[x])
# transformed gradient variable list = [ (T(gradient),variable) ]
decay = 0.9 # decay the gradient for the sake of the example
# apply transformed gradients
tgv = [ (decay*g, v) for (g,v) in gv] #list [(grad,var)]
apply_transform_op = opt.apply_gradients(tgv)

# track value of x
x_scalar_summary = tf.scalar_summary("x", x)
x_histogram_summary = tf.histogram_summary('x_his', x)
with tf.Session() as sess:
    merged = tf.merge_all_summaries()
    tensorboard_data_dump = '/tmp/playground_tmp'
    writer = tf.train.SummaryWriter(tensorboard_data_dump, sess.graph)

    sess.run(tf.initialize_all_variables())
    epochs = 120
    for i in range(epochs):
        b_val = 1.0 #fake data (in SGD it would be different on every epoch)

        # applies the gradients
        [summary_str_apply_transform,_] = sess.run([merged,apply_transform_op], feed_dict={b: b_val})
        writer.add_summary(summary_str_apply_transform, i)
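
For reference, the values the summaries should record can be worked out by hand from the constants above (x starts at 10, b is 1, learning rate 1.0, gradient decay 0.9): each step applies x ← x − 0.9·(x − 1), so x should fall geometrically from 10 toward 1. A minimal plain-Python sketch of that trajectory (just a re-derivation for checking, not part of the TensorFlow run):

# re-derive the x values the scalar/histogram summaries should record,
# using the same constants as the script above
x_val, b_val, lr, decay = 10.0, 1.0, 1.0, 0.9
for step in range(5):
    grad = x_val - b_val                # d/dx of 0.5*(x - b)^2
    x_val = x_val - lr * decay * grad   # same update apply_gradients performs
    print(step, x_val)
# x shrinks toward 1: roughly 1.9, 1.09, 1.009, 1.0009, 1.00009, ...
# i.e. a single value per step, so each histogram contains only one sample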

I also ran into the same problem of multiple lines showing up in the TensorBoard plots (even when I tried your code, the board service printed the duplicate-graph warning below and only showed a single curve, which is better than what I got):

WARNING:tensorflow:Found more than one graph event per run. Overwriting the graph with the newest event. 

The fix is the same as @Olivier Moindrot's: delete the old logs. The board sometimes also caches results, so you may need to restart the TensorBoard service as well.

As the MNIST example shows, the way to make sure we display the latest summaries is to log to a fresh folder:

if tf.gfile.Exists(FLAGS.summaries_dir):
    tf.gfile.DeleteRecursively(FLAGS.summaries_dir)
tf.gfile.MakeDirs(FLAGS.summaries_dir)
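
For that snippet to run on its own, FLAGS.summaries_dir needs to be defined somewhere; in the MNIST tutorial it comes from tf.app.flags, roughly like this (the default path below is only an illustrative value):

import tensorflow as tf

flags = tf.app.flags
# example default; any fresh directory works
flags.DEFINE_string('summaries_dir', '/tmp/mnist_logs',
                    'Directory for writing summary event files')
FLAGS = flags.FLAGS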

Link to the full source code for TF version r0.10: https://github.com/tensorflow/tensorflow/blob/r0.10/tensorflow/examples/tutorials/mnist/mnist_with_summaries.py



 