
What is the difference between x.shape and tf.shape() in tensorflow 2.0?

I wonder when I need to use tf.shape() and when x.shape. I'm currently using tensorflow 2.0 rc0.

The following is an example:

#!/usr/bin/python3

import tensorflow as tf

a = tf.zeros((4, 3, 1)) 
print (tf.shape(a).numpy())
print (a.shape)

The result of the above code is as follows:

[4 3 1]
(4, 3, 1)

tf.shape(a).numpy() returns a numpy array whereas a.shape returns a tuple, but I cannot easily find out which one is better and which one should be preferred.

Could anyone please give some advice on this?

Calling .numpy() on any Tensor or TensorFlow op will return a numpy.ndarray.

Example:

a = tf.constant([1,2,3])
print(a.numpy())
print(tf.shape(a).numpy())
print(type(tf.shape(a)))

[1 2 3]
[3]

<class 'tensorflow.python.framework.ops.EagerTensor'>

But Tensor.shape (an attribute, not a call) has type TensorShape, which prints and behaves like a tuple.

print(type(a.shape)) 

<class 'tensorflow.python.framework.tensor_shape.TensorShape'>
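
For example, a TensorShape can be indexed and converted much like a tuple; a minimal sketch, continuing with the a = tf.constant([1,2,3]) from above:

print(a.shape[0])        # 3   -> dimensions can be indexed like a tuple
print(a.shape.as_list()) # [3] -> explicit conversion to a Python list
print(tuple(a.shape))    # (3,) -> TensorShape is iterable, so tuple() works too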

Even NumPy arrays have a shape attribute that returns a tuple with the length of each dimension of the array.

import numpy as np

data = np.array([11, 22, 33, 44, 55])
print(data.shape)

(5,)

The ideal way to use a tensor's shape in any of your operations is tf.shape(a), without having to convert it with .numpy() or use Tensor.shape.
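
For instance, here is a minimal sketch (the shapes are only illustrative) of feeding tf.shape() straight into another op, flattening everything except the first dimension without ever calling .numpy():

a = tf.zeros((4, 3, 2))
flat = tf.reshape(a, [tf.shape(a)[0], -1])  # the dynamic shape is used directly inside the op
print(flat.shape)                           # (4, 6)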

Hope this answers your question, Happy Learning!

I think one important difference is also that if you try to access a tensor dimension that is not known in advance (e.g. None) using

tensor.shape

you will fail when the graph of the network is being built.

However, using

tf.shape(tensor)

will work, as it returns the size at execution time. This can be useful, for example, if someone provides you with batches of unknown size, which easily happens if your data is not divisible by your batch_size and you need to work with the dimension of the batch.
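
A minimal sketch of that situation, assuming a tf.function with an unspecified (None) batch dimension; the function name and shapes are only illustrative:

import tensorflow as tf

@tf.function(input_signature=[tf.TensorSpec(shape=[None, 3, 2], dtype=tf.float32)])
def process(batch):
    print("static batch size while tracing:", batch.shape[0])    # None -> unknown while the graph is built
    tf.print("dynamic batch size at run time:", tf.shape(batch)[0])
    # the dynamic size can still be fed into other ops, e.g. to flatten each example
    return tf.reshape(batch, [tf.shape(batch)[0], -1])

process(tf.zeros((5, 3, 2)))   # prints the dynamic batch size 5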
