
pytorch autograd obstructs script from terminating

Whenever I call autograd's backward, my script never terminates. backward is not blocking per se: all lines after it are still executed, the script just does not terminate. It appears that some sort of worker thread hangs in the background, but I was not able to find any information about it.

I originally encountered the problem while training neural networks, but I eventually reduced it to a very short example with the same behavior:

import torch

x = torch.randn(3, requires_grad=True)
y = x * 2
print(y)

gradients = torch.tensor([0.1, 1.0, 0.0001], dtype=torch.float)
y.backward(gradients)
print(x.grad)

print("all done")

When I remove the backward line, the script finishes as expected. Otherwise I see a process called python in the task manager; if I kill it by hand, the script execution also terminates.

I installed pytorch on Windows 7 using conda (conda create --name grad_test pytorch -c pytorch), in the most recent stable version (python 3.7, pytorch 1.2.0).

The problem still exists; it seems to be a Windows 7 specific issue.
