PyTorch - modifications of autograd variables

In my PyTorch program, I have a matrix that is continuously updated at runtime.

I wonder how to perform this update. I tried using something like this:

matrix[0, index] = hidden[0]

Both matrix and hidden are autograd Variables. When I run the line above, I get the following error message:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
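Here is a minimal standalone example that reproduces the same error with the current tensor API (where Variable is merged into Tensor); the shapes and operations are placeholders, not my real code:

import torch

# exp() saves its output for the backward pass, so modifying that output
# in place invalidates the recorded graph.
x = torch.randn(3, requires_grad=True)
y = torch.exp(x)    # backward of exp reuses y itself
y[0] = 0.0          # in-place write bumps y's version counter
y.sum().backward()  # raises the RuntimeError quoted above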

I wonder how to get around this and perform the update without using in-place operations.

Slicing the matrix and creating a new one with torch.cat would probably work, but that doesn't seem like a very nice solution.
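For illustration, this is roughly what I mean by the torch.cat workaround (shapes are hypothetical; here matrix is (1, N) and hidden holds the new values):

import torch

# Build a fresh tensor from the pieces around position `index`
# instead of writing into `matrix` in place.
matrix = torch.randn(1, 5, requires_grad=True)
hidden = torch.randn(5, requires_grad=True)
index = 2

row = torch.cat([matrix[0, :index],        # columns before the update
                 hidden[0].unsqueeze(0),   # new value as a 1-element tensor
                 matrix[0, index + 1:]])   # columns after the update
matrix = row.unsqueeze(0)  # rebind the name; nothing is modified in place

Each update allocates a new tensor and extends the graph, which is why this feels clunky for a matrix that is updated on every step.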

Is there a better way of doing this?

Thanks in advance!

Posting a piece of code might help, but have you tried using a dataset? It lets you run through your data sequentially and efficiently.

http://pytorch.org/docs/master/data.html#torch.utils.data.TensorDataset
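For example (the input and target tensors here are placeholders):

import torch
from torch.utils.data import TensorDataset, DataLoader

# Wrap aligned tensors in a TensorDataset and iterate over them in batches.
inputs = torch.randn(100, 10)          # 100 samples, 10 features each
targets = torch.randint(0, 2, (100,))  # binary labels

dataset = TensorDataset(inputs, targets)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

for batch_inputs, batch_targets in loader:
    ...  # run the model on each batch here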
