
PyTorch: copy a neuron in a layer

I am using PyTorch 0.3.0. I'm trying to selectively copy a neuron and its weights within the same layer, then replace the original neuron with another set of weights. Here's my attempt at that:

reshaped_data2 = data2.unsqueeze(0)                                  # (X,) -> (1, X)
new_layer_data = torch.cat([new_layer.data, reshaped_data2], dim=0)  # append as a new row
new_layer_data[i] = data1                                            # overwrite the original row
new_layer.data.copy_(new_layer_data)                                 # write back into the layer

First I unsqueeze data2 to turn it from a 1-D tensor of size X into a 1*X tensor. Then I concatenate my layer's tensor with the reshaped data2 along dimension 0. I then replace the original data2 located at index i with data1. Finally, I copy all of that back into my layer.
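For reference, a minimal self-contained setup that reproduces the situation (the 10x128 shape, the index i, and the contents of data1 and data2 are assumed placeholders):

import torch

new_layer = torch.nn.Parameter(torch.randn(10, 128))  # assumed 10x128 weight
data1 = torch.randn(128)                               # replacement weights (placeholder)
data2 = torch.randn(128)                               # neuron to duplicate (placeholder)
i = 3                                                  # index of the neuron (placeholder)

reshaped_data2 = data2.unsqueeze(0)                              # shape (1, 128)
new_layer_data = torch.cat([new_layer.data, reshaped_data2], 0)  # shape (11, 128)
new_layer_data[i] = data1

# copy_ needs the source to match the destination's size, so copying an
# 11x128 tensor into the existing 10x128 parameter raises the error below.
new_layer.data.copy_(new_layer_data)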

The error I get is:

RuntimeError: inconsistent tensor size, expected tensor [10 x 128] and src [11 x 128] to have the same number of elements, but got 1280 and 1408 elements respectively at /Users/soumith/code/builder/wheel/pytorch-src/torch/lib/TH/generic/THTensorCopy.c:86

If I do a simple assignment instead of the copy, I get:

RuntimeError: The expanded size of the tensor (11) must match the existing size (10) at non-singleton dimension 1. at /Users/soumith/code/builder/wheel/pytorch-src/torch/lib/TH/generic/THTensor.c:309

I understand the error, but what is the right way to go about this?

You're trying to replace a 10x128 tensor with an 11x128 tensor, which the model doesn't allow. Is new_layer initialised with the size (11, 128)? If not, try creating your new layer with your desired size (11, 128) and then copy/assign your new_layer_data.

The solution here is to create a new layer (or model) with the correct size and pass in the desired weights as its initial values; no way to dynamically expand the existing parameter in place was found.
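A minimal sketch of that approach, assuming the layer is an nn.Linear(128, 10) and that i, data1, and data2 are placeholders as in the question:

import torch
import torch.nn as nn

in_features, out_features = 128, 10
old_layer = nn.Linear(in_features, out_features)

i = 3                                      # index of the neuron to copy/replace (placeholder)
data1 = torch.randn(in_features)           # replacement weights for neuron i (placeholder)
data2 = old_layer.weight.data[i].clone()   # the neuron being duplicated

# Build the enlarged weight matrix: append the copied neuron as a new row,
# then overwrite the original row with the replacement weights.
new_weight = torch.cat([old_layer.weight.data, data2.unsqueeze(0)], dim=0)
new_weight[i] = data1

# Create a fresh layer with the larger output size and copy the weights in;
# the bias also gains one entry, duplicated from neuron i.
new_layer = nn.Linear(in_features, out_features + 1)
new_layer.weight.data.copy_(new_weight)
new_layer.bias.data.copy_(torch.cat([old_layer.bias.data, old_layer.bias.data[i:i+1]], dim=0))

The new layer can then replace the old one in the model; every other row keeps its original values.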


 