InvalidArgumentError in Tensorflow
I'm trying to create a neural network using TensorFlow tools.
sizeOfRow = len(data[0])
x = tensorFlow.placeholder("float", shape=[None, sizeOfRow])
y = tensorFlow.placeholder("float")

def neuralNetworkTrain(x):
    prediction = neuralNetworkModel(x)
    # using softmax function, normalize values to range (0, 1)
    cost = tensorFlow.reduce_mean(tensorFlow.nn.softmax_cross_entropy_with_logits(prediction, y))
This is the part of the network where I get the error:
InvalidArgumentError (see above for traceback): logits and labels must be same size: logits_size=[500,2] labels_size=[1,500]
[[Node: SoftmaxCrossEntropyWithLogits = SoftmaxCrossEntropyWithLogits[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/cpu:0"](Reshape, Reshape_1)]]
Does anyone know what's wrong?
Edit: I also get an error from this code:
for temp in range(int(len(data) / batchSize)):
    ex, ey = takeNextBatch(i)  # takes 500 examples
    i += 1
    # TO-DO : fix bug here
    temp, cos = sess.run([optimizer, cost], feed_dict={x: ex, y: ey})
namely TypeError: unhashable type: 'list'
Well, the error is quite self-describing.
logits and labels must be same size: logits_size=[500,2] labels_size=[1,500]
So, first, your labels should be transposed to have size [500, 1], and second, softmax_cross_entropy_with_logits expects the labels to be presented in the form of a probability distribution (e.g. [[0.1, 0.9], [1.0, 0.0]]).
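For example, if the labels arrive as a [1, 500] row of integer class ids (as the labels_size=[1,500] in the error suggests), they can be flattened and one-hot encoded so their shape matches the [500, 2] logits. A minimal NumPy sketch (the array names here are illustrative, not from the question's code):

```python
import numpy as np

# Illustrative stand-in for the [1, 500] integer labels from the error message
labels = np.random.randint(0, 2, size=(1, 500))

# Flatten to shape (500,), then one-hot encode to shape (500, 2)
# so it matches logits_size=[500, 2]
flat = labels.reshape(-1)
one_hot = np.eye(2)[flat]

print(one_hot.shape)  # (500, 2)
```

In TensorFlow itself this is what tf.one_hot does, with the feed for y then carrying the one-hot matrix instead of the raw id list.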
If you know your classes are exclusive (which is probably the case), you should switch to using sparse_softmax_cross_entropy_with_logits.
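As a sketch of what the sparse variant computes: with exclusive classes you pass integer class ids directly, with no one-hot step. The NumPy below only mimics the math of the op for illustration (it is not TensorFlow's implementation, and the names are mine):

```python
import numpy as np

def softmax(z):
    # Subtract the row max before exponentiating, for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def sparse_softmax_cross_entropy(logits, labels):
    # labels: integer class ids, shape (N,); logits: shape (N, C)
    probs = softmax(logits)
    # Pick out the predicted probability of each example's true class
    return -np.log(probs[np.arange(len(labels)), labels])

logits = np.array([[2.0, 0.5], [0.1, 3.0]])
labels = np.array([0, 1])  # plain class ids, no one-hot needed
loss = sparse_softmax_cross_entropy(logits, labels).mean()
```

This is numerically the same as one-hot encoding the labels and using the dense cross-entropy, just without materializing the one-hot matrix.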