
How to swap values in a PyTorch tensor without in-place operations (preserving the gradient)

I have a tensor called state of shape torch.Size([N, 2**n, 2**n]), and I want to apply the following operations:

state[[0,1]] = state[[1,0]]
state[0] = -1*state[0]

Both of these are in-place operations. Are there some out-of-place operations that I can substitute them with? These lines are inside a for-loop, so it would be a bit difficult to just create new variables.
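For context, a minimal sketch of why the in-place version fails (the sizes here are made up for illustration): if state is a leaf tensor with requires_grad=True, PyTorch rejects the in-place assignment outright, and even on non-leaf tensors an in-place write can clobber values that autograd saved for the backward pass.

import torch

state = torch.randn(2, 2, 2, requires_grad=True)  # hypothetical N=2, n=1

# Raises RuntimeError: a leaf Variable that requires grad is being
# used in an in-place operation (exact wording varies by PyTorch version).
state[[0, 1]] = state[[1, 0]]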

I managed to figure it out!

Replace:

state[[0,1]] = state[[1,0]] # in-place operation

with:

state = state[[1,0]] # out-of-place operation
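This works because advanced (fancy) indexing on the right-hand side returns a new tensor rather than a view, so reassigning state creates a fresh node in the autograd graph instead of mutating an existing one. One caveat worth noting (my addition, not part of the original answer): state[[1,0]] keeps only the first two slices, so if N > 2 the index list must cover every row, e.g.:

state = state[[1, 0] + list(range(2, state.shape[0]))]  # swap rows 0 and 1, keep the rest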

And for the second line, we replace:

state[0] = -1*state[0] # in-place operation

with:

# out-of-place operations
temp = torch.ones_like(state)  # mask with the same shape, dtype, and device as state
temp[0] = -1*temp[0]           # in-place on temp is fine: temp carries no gradient history
state = state*temp
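As a side note (my sketch, not part of the original answer), the same negation can also be written out-of-place with torch.cat, which avoids allocating a full mask tensor:

state = torch.cat([-1*state[:1], state[1:]], dim=0)  # negate slice 0, keep the rest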

This seems to be doing the job!
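To confirm that gradients actually survive, here is a minimal end-to-end sketch (the loop count and sizes are arbitrary, chosen only for illustration):

import torch

N, n = 2, 1
x = torch.randn(N, 2**n, 2**n, requires_grad=True)
state = x

for _ in range(3):
    state = state[[1, 0]]          # out-of-place swap
    temp = torch.ones_like(state)
    temp[0] = -1*temp[0]
    state = state*temp             # out-of-place negation

state.sum().backward()
print(x.grad)  # not None: gradients flow back through every iteration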
