
Getting the autograd counter of a tensor in PyTorch

I am using PyTorch to train a network. While going through the autograd documentation, I found it mentioned that autograd keeps a counter for each tensor to track its "version". How can I get this counter for any tensor in the graph?

Reason why I need it:

I have encountered the following autograd error:

[torch.cuda.FloatTensor [x, y, z]], which is output 0 of torch::autograd::CopySlices, is at version 7; expected version 6 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!

This is not new to me, and I have been successful in handling it before. This time around, I cannot see why the tensor would be at version 7 instead of 6. To answer this, I want to know the version at any given point in the run.
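For reference, here is a minimal sketch (my own hypothetical reproduction, not the actual training code) that triggers this class of error: an op whose backward pass needs its output (exp here) has that output modified in place before backward() runs.

import torch

a = torch.ones(3, requires_grad=True)
b = a.exp()         # exp saves its output for the backward pass
b[0] = 0.0          # in-place slice write (CopySlices) bumps b's version counter
b.sum().backward()  # RuntimeError: ... is at version 1; expected version 0 instead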

Thanks.

It can be obtained through the tensor_name._version attribute.

As an example of how to use it, the following minimal working example is provided.

import torch

a = torch.zeros(10, 5)
print(a._version)  # prints 0
a[:, 1] = 1
print(a._version)  # prints 1  
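A point worth noting (my own addition, not part of the original answer): only in-place operations bump the counter; an out-of-place operation returns a fresh tensor whose counter starts at 0 again.

import torch

t = torch.zeros(10, 5)
t.add_(1)          # in-place op: bumps the counter of t
print(t._version)  # prints 1
u = t + 1          # out-of-place op: u is a brand-new tensor
print(u._version)  # prints 0
print(t._version)  # still prints 1; t itself was not modified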

