
Tensorflow. Difference between [batch_size, 1] & [batch_size]

In the TensorFlow tutorial for word embeddings one finds:

# Placeholders for inputs
train_inputs = tf.placeholder(tf.int32, shape=[batch_size])
train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])

What is the difference between these two placeholders? Aren't they both an int32 column vector of size batch_size?

Thanks.

I found the answer with a little debugging.

[batch_size] = [ 0, 2, ...]
[batch_size, 1] = [ [0], [2], ...]
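
To see the difference concretely, here is a minimal sketch in the same TF 1.x style as the tutorial's placeholder code (batch_size = 4 is an assumed value, only for the demo):

import tensorflow as tf

batch_size = 4  # assumed value, only for illustration

train_inputs = tf.placeholder(tf.int32, shape=[batch_size])     # rank-1 tensor
train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])  # rank-2 tensor

print(train_inputs.shape)  # (4,)   -> a flat vector of 4 word ids
print(train_labels.shape)  # (4, 1) -> 4 rows, each holding one id

So [batch_size] is a 1-D tensor, while [batch_size, 1] is a 2-D tensor with a single column.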

Though I still don't know why the second form is used.

train_inputs is a row vector (a rank-1 tensor of shape [batch_size]), while train_labels is a column vector (a rank-2 tensor of shape [batch_size, 1]).
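
The reason the labels need the extra dimension is that the word2vec tutorial feeds train_labels into tf.nn.nce_loss, which expects labels of shape [batch_size, num_true] (rank 2), while tf.nn.embedding_lookup accepts the rank-1 train_inputs directly. Below is a hedged sketch modeled on that setup; the vocabulary, embedding, and sampling sizes are assumptions, not values from the original post:

import tensorflow as tf

vocabulary_size = 1000  # assumed
embedding_size = 64     # assumed
batch_size = 4          # assumed
num_sampled = 8         # assumed

train_inputs = tf.placeholder(tf.int32, shape=[batch_size])
train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])

embeddings = tf.Variable(
    tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))
# embedding_lookup is happy with the rank-1 train_inputs
embed = tf.nn.embedding_lookup(embeddings, train_inputs)

nce_weights = tf.Variable(
    tf.truncated_normal([vocabulary_size, embedding_size],
                        stddev=1.0 / embedding_size ** 0.5))
nce_biases = tf.Variable(tf.zeros([vocabulary_size]))

# nce_loss requires labels of shape [batch_size, num_true] (here num_true=1),
# which is why train_labels is declared as [batch_size, 1]
loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights,
                   biases=nce_biases,
                   labels=train_labels,
                   inputs=embed,
                   num_sampled=num_sampled,
                   num_classes=vocabulary_size))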
