How to get the trained weights created by a model

I implemented a simple logistic regression. Before running the training algorithm, I created a variable for my weights, initializing them all to 0...

W = tf.Variable(tf.zeros([784, 10]))
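
(For context, the input placeholders and bias that the model and training loop below rely on would look roughly like this; the exact form is an assumption reconstructed from the shapes above, not code shown in the question:)

# Sketch reconstructed from the question's shapes, not the asker's exact code:
# x and y are the image/label placeholders fed in the training loop,
# and b is the bias used by the softmax model.
x = tf.placeholder(tf.float32, [None, 784])
y = tf.placeholder(tf.float32, [None, 10])
b = tf.Variable(tf.zeros([10]))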

After initializing all my variables, I run the training loop for the logistic regression (which I've tested and it runs correctly)...

for epoch in range(training_epochs):
    avg_cost = 0
    total_batch = int(mnist.train.num_examples/batch_size)
    # loop over all batches
    for i in range(total_batch):
        batch_xs, batch_ys = mnist.train.next_batch(batch_size)
        _, c = sess.run([optimizer, cost], feed_dict={x: batch_xs, y: batch_ys})

        # compute average loss
        avg_cost += c / total_batch
    # display logs per epoch step
    if (epoch + 1) % display_step == 0:
        print("Epoch:", '%04d' % (epoch + 1), "cost=", "{:.9f}".format(avg_cost))

My issue is that I need to extract the trained weights from the model. The model itself is defined as follows...

pred = tf.nn.softmax(tf.matmul(x, W) + b)  # Softmax
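
(For completeness, the cost and optimizer fetched in the training loop above are not shown in the question; presumably they are the standard cross-entropy loss and gradient-descent step, with learning_rate as an assumed hyperparameter:)

# Assumed definitions of the ops fetched as [optimizer, cost] in the loop above;
# learning_rate is a hypothetical hyperparameter not shown in the question.
cost = tf.reduce_mean(-tf.reduce_sum(y * tf.log(pred), reduction_indices=1))
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)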

I tried extracting them the following way...

var = [v for v in tf.trainable_variables() if v.name == "Variable:0"][0]
print(sess.run(var[0]))

I thought that the trained weights would be available through tf.trainable_variables(); however, when I run the print statement, I get an array of zeroes.

What I want is the full set of trained weights, but for some reason I am getting arrays of zeroes instead of the actual weights of the classifier.

The variable W should refer to the trained weights. Please try simply doing: sess.run(W)
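
As a minimal sketch (assuming TensorFlow 1.x and that sess, W, and b from the question are still in scope), fetching the trained values right after the training loop looks like this:

# Run this in the same session used for training, after the loop has finished;
# fetching variables returns their current values as NumPy arrays.
w_trained, b_trained = sess.run([W, b])

print(w_trained.shape)         # (784, 10)
print(b_trained.shape)         # (10,)
print((w_trained != 0).any())  # True once training has updated the weights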

Much simpler: just evaluate the weight variables with the run function and you will get NumPy arrays with their values (W and b are variables; x is a placeholder and would need a feed_dict, so it is left out here):

sess.run([W, b])
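
If you prefer to keep the lookup-by-name approach from the question, a minimal sketch (again assuming TensorFlow 1.x with the training session still open) is to fetch the whole variable rather than indexing into it:

# Print the trainable variables to confirm their auto-generated names
# ("Variable:0" for W and "Variable_1:0" for b, if created in that order).
for v in tf.trainable_variables():
    print(v.name, v.shape)

# Fetch the entire (784, 10) matrix; sess.run(var[0]) only evaluates its first row.
w_var = [v for v in tf.trainable_variables() if v.name == "Variable:0"][0]
w_trained = sess.run(w_var)
print(w_trained.shape)  # (784, 10)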
