
Issue with TensorFlow-TensorBoard

I'm following this tutorial. I've copied the code verbatim, so I don't see where the mistake is. The errors appear when I add these lines to create a log file for TensorBoard:

logs_path = '/tensor_board'
writer = tf.summary.FileWriter(logs_path, graph=tf.get_default_graph())
# RUN
sess.run(init, writer)

When I run the entire script, Python returns this error:

Traceback (most recent call last):
  File "tf_number_recon.py", line 39, in <module>
    sess.run(init, writer)
  File "C:\Python35\lib\site-packages\tensorflow\python\client\session.py", line 766, in run
    run_metadata_ptr)
  File "C:\Python35\lib\site-packages\tensorflow\python\client\session.py", line 913, in _run
    feed_dict = nest.flatten_dict_items(feed_dict)
  File "C:\Python35\lib\site-packages\tensorflow\python\util\nest.py", line 171, in flatten_dict_items
    raise TypeError("input must be a dictionary")
TypeError: input must be a dictionary

I don't see why it doesn't work as expected. Could anyone help me?

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

# importing the dataset used to train the Neural Network
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# importing Tensorflow
import tensorflow as tf                                                          

import argparse
import sys

# Declaring some important variables
x = tf.placeholder(tf.float32, [None, 784])                                 # x is the input: flattened 28x28 MNIST images
W = tf.Variable(tf.zeros([784, 10]))                                        # W will produce 10 evidence vectors, one for each digit 0-9
b = tf.Variable(tf.zeros([10]))                                             # b is the bias for each of the 10 classes
y = tf.nn.softmax(tf.matmul(x, W) + b)                                      # y will be the output; here we define the model
y_ = tf.placeholder(tf.float32, [None, 10])                                 # y_ holds the correct labels

# Cross entropy: measures how far our prediction is from reality, so the neural network can be improved (it doesn't measure how well it does, but rather how badly it does)
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))

# During training we ask for the cross entropy to be minimized
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

# initializing the variables
init = tf.global_variables_initializer()

# Run a session and initialize the operations
sess = tf.Session()
# Tensor Board
logs_path = '/tensor_board'
writer = tf.summary.FileWriter(logs_path, graph=tf.get_default_graph())
# RUN
sess.run(init, writer)

# Loop for training
for i in range(1000):
  batch_xs, batch_ys = mnist.train.next_batch(100)
  sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

# Evaluate the model
correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
eficacia = sess.run(accuracy, feed_dict={x: mnist.test.images, y_: mnist.test.labels})
print(eficacia)

Making these changes, the code runs without any problems. The second positional argument of sess.run() is feed_dict, which must be a dictionary, so passing the FileWriter there raises the TypeError; the writer should not be passed to run() at all. Using a relative log directory also avoids trying to write to the root of the drive:

logs_path = './tensor_board' 

sess.run(init)
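
For reference, here is a minimal sketch of how the corrected TensorBoard section fits into the script, assuming the graph, init, train_step and mnist are defined exactly as in the question:

logs_path = './tensor_board'   # relative path, so no permission issue at the drive root

sess = tf.Session()
writer = tf.summary.FileWriter(logs_path, graph=tf.get_default_graph())

# run() only takes the op to execute (plus an optional feed_dict);
# the FileWriter is never passed to it
sess.run(init)

for i in range(1000):
  batch_xs, batch_ys = mnist.train.next_batch(100)
  sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

writer.close()   # flush the event file so TensorBoard can read it

You can then start TensorBoard with tensorboard --logdir=./tensor_board and open the URL it prints to inspect the graph.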
