How can I get the predicted class labels in MNIST TensorFlow in Python?
After training the MNIST TensorFlow model, I test it as shown below. I don't know how to extract the list of predicted labels; evaluation only gives me accuracy and loss values. Can anyone help me solve this?
# Evaluate the model and print results
eval_input_fn = tf.compat.v1.estimator.inputs.numpy_input_fn(
    x={"x": eval_data},
    y=eval_labels,
    num_epochs=1,
    shuffle=False)
eval_results = mnist_classifier.evaluate(input_fn=eval_input_fn)
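With a `tf.estimator.Estimator`, predicted labels come from a separate `predict()` call, not from `evaluate()`. `predict()` returns a generator of per-example dicts whose keys depend on your `model_fn`; assuming it returns a predictions dict with a `"classes"` key (as in the official CNN MNIST tutorial), the extraction pattern is sketched below with the generator mocked so the snippet runs standalone:

```python
import numpy as np

# Real-world usage (assumes your model_fn emits a "classes" key):
#
#   pred_input_fn = tf.compat.v1.estimator.inputs.numpy_input_fn(
#       x={"x": eval_data}, num_epochs=1, shuffle=False)
#   pred_results = mnist_classifier.predict(input_fn=pred_input_fn)
#   predicted_labels = [p["classes"] for p in pred_results]
#
# Mocked stand-in for the predict() generator, so this runs without
# a trained model: yields one dict per test example.
def mock_predict():
    for probs in ([0.1, 0.7, 0.2], [0.8, 0.1, 0.1]):
        probs = np.asarray(probs)
        yield {"classes": int(np.argmax(probs)), "probabilities": probs}

# Drain the generator into a plain list of predicted class labels.
predicted_labels = [p["classes"] for p in mock_predict()]
print(predicted_labels)  # [1, 0]
```

The same list comprehension works unchanged on the real generator; only the key name (`"classes"` here) has to match whatever your `model_fn` puts in its predictions dict.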
In the Fashion-MNIST dataset the labels are encoded as the numbers below:
0 = T-shirt/top
1 = Trouser
2 = Pullover
3 = Dress
4 = Coat
5 = Sandal
6 = Shirt
7 = Sneaker
8 = Bag
9 = Ankle boot
So when you call model.predict(test_image), it returns an array of length 10, where each index holds the model's confidence that the image corresponds to one of the articles of clothing, in exactly the order of the numbers above. If the array's highest value is at index 3, the model predicted that the item is a Dress. To get the index of the highest value, call:
np.argmax(model.predict(test_image)[0])
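The argmax step also generalizes to a whole batch: assuming model.predict returns one row of 10 confidence scores per test image, np.argmax with axis=1 yields one label per image. A minimal sketch with mocked scores:

```python
import numpy as np

# Mocked model output: two images, 10 confidence scores each.
predictions = np.array([
    [0.0, 0.0, 0.0, 0.9, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0],  # peak at index 3
    [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0],  # peak at index 9
])

# axis=1 takes the argmax across each row (each image).
labels = np.argmax(predictions, axis=1)
print(labels)  # [3 9]
```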
To map the index back to a real value, that is, to tell whether the predicted clothing is a dress or a boot, you can store the clothing names in an array such as:
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
And in the end you can call:
class_names[np.argmax(model.predict(test_image)[0])]
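Putting the two steps together, here is a self-contained sketch where model.predict is replaced by a mocked score array (one image's 10 confidences), so the lookup can be run as-is:

```python
import numpy as np

class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']

# Mocked stand-in for model.predict(test_image): a batch of one image.
mock_prediction = np.array([[0.01, 0.01, 0.02, 0.85, 0.02,
                             0.02, 0.03, 0.02, 0.01, 0.01]])

# Index of the highest confidence, mapped to its human-readable name.
label = class_names[np.argmax(mock_prediction[0])]
print(label)  # Dress
```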