
In Tensorflow, what is the difference between a Variable and a Tensor?

The Tensorflow documentation states that a Variable can be used any place a Tensor can be used, and the two seem fairly interchangeable. For example, if v is a Variable, then x = 1.0 + v becomes a Tensor.
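This behavior is easy to check directly. A minimal sketch (assuming TensorFlow 2.x with eager execution; the variable names are illustrative):

```python
import tensorflow as tf

v = tf.Variable(1.0)   # a mutable Variable
x = 1.0 + v            # arithmetic on a Variable yields an ordinary Tensor

# x is a Tensor, not a Variable, even though v participated in the expression.
print(isinstance(x, tf.Tensor))     # True
print(isinstance(x, tf.Variable))   # False
```

So a Variable is silently converted to a Tensor wherever an op expects one, which is why the two feel interchangeable in expressions.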

What is the difference between the two, and when would I use one over the other?

It's true that a Variable can be used any place a Tensor can, but the key differences between the two are that a Variable maintains its state across multiple calls to run(), and a Variable's value can be updated during training by backpropagation (it can also be saved, restored, etc., as described in the documentation).
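The session-based run() calls belong to TensorFlow 1.x, but the same persistence is visible in TF 2.x across ordinary function calls. A minimal sketch (the names are illustrative):

```python
import tensorflow as tf

counter = tf.Variable(0)

def increment():
    # assign_add mutates the Variable in place; a plain Tensor is
    # immutable and could not be updated like this.
    counter.assign_add(1)

increment()
increment()
print(int(counter))  # the state persisted across both calls: 2
```

A Tensor produced inside increment() would be recomputed from scratch each call; only the Variable carries state from one call to the next.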

These differences mean that you should think of a Variable as representing your model's trainable parameters (for example, the weights and biases of a neural network), while you can think of a Tensor as representing the data being fed into your model and the intermediate representations of that data as it passes through your model.
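The division of roles can be sketched with a single manual training step (assuming TF 2.x; the toy linear model, data, and learning rate are made up for illustration):

```python
import tensorflow as tf

# Trainable parameters live in Variables...
w = tf.Variable(2.0)
b = tf.Variable(0.5)

# ...while the input data and targets are plain Tensors.
x = tf.constant([1.0, 2.0, 3.0])
y_true = tf.constant([3.0, 5.0, 7.0])

with tf.GradientTape() as tape:
    y_pred = w * x + b  # intermediate results are Tensors too
    loss = tf.reduce_mean(tf.square(y_pred - y_true))

# Gradients are taken with respect to the Variables, and a manual
# SGD step updates them in place; the Tensors are left untouched.
dw, db = tape.gradient(loss, [w, b])
w.assign_sub(0.1 * dw)
b.assign_sub(0.1 * db)
```

Optimizers such as tf.keras.optimizers.SGD automate exactly this pattern: they only ever update Variables, never Tensors.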
