
Convolution in PyTorch with non-trainable pre-defined kernel

I would like to introduce a custom layer to my neural network. The mathematical operation should be a discrete 2D cross correlation (or convolution) with a non-trainable kernel. The values in the kernel depend on three things: kernel shape, strides and padding. I intend to multiply the output element-wise with a weight matrix.

PyTorch already provides a class for discrete 2D cross correlation, 'Conv2d'; however, it initializes a random kernel and trains the entries of that kernel. If possible, I would like a class similar to 'Conv2d' that does what I need, so that my GPU is used as effectively as possible. I tried implementing this on my own, but couldn't figure out how to obtain the correct shapes for the input array. 'Conv2d' only takes 'in_channels', if I understood correctly.

If I understand correctly, you want a Conv2d layer with a pre-defined kernel that is not learnable.

In that case, you can use the conv2d function like this:

import torch.nn.functional as F

# your_kernel is the pre-defined, non-trainable kernel tensor
output_tensor = F.conv2d(input_tensor, your_kernel, ...)

The parameter your_kernel is your pre-defined kernel (the weight matrix); you also need to pass the other parameters of the function, such as padding and stride.
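
For example, applying a fixed 3x3 averaging kernel might look like the sketch below (the tensor names and sizes are only illustrative, not taken from the question):

import torch
import torch.nn.functional as F

input_tensor = torch.randn(1, 1, 28, 28)          # (batch, in_channels, height, width)
fixed_kernel = torch.full((1, 1, 3, 3), 1.0 / 9)  # (out_channels, in_channels, kH, kW)

# stride and padding are passed explicitly; padding=1 keeps the spatial size here
output_tensor = F.conv2d(input_tensor, fixed_kernel, stride=1, padding=1)
print(output_tensor.shape)  # torch.Size([1, 1, 28, 28])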

Then set the kernel's requires_grad attribute to False, and exclude it from the optimizer, if you don't want it to be learnable.
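
If you prefer a reusable layer, here is a minimal sketch (the class name FixedConv2d and the out_shape argument are my own, not something from the question): the fixed kernel is registered as a buffer, so it moves to the GPU together with the model but is never returned by model.parameters(), while the element-wise weight mentioned in the question is the only nn.Parameter and therefore the only thing the optimizer updates.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FixedConv2d(nn.Module):
    def __init__(self, kernel, out_shape, stride=1, padding=0):
        super().__init__()
        # Buffer: moves with .to(device)/.cuda(), but is not a trainable parameter
        self.register_buffer("kernel", kernel)
        # Trainable element-wise weight applied to the convolution output
        self.weight = nn.Parameter(torch.ones(out_shape))
        self.stride = stride
        self.padding = padding

    def forward(self, x):
        out = F.conv2d(x, self.kernel, stride=self.stride, padding=self.padding)
        return out * self.weight

# Illustrative usage; out_shape must match (or broadcast to) the conv output shape
layer = FixedConv2d(torch.full((1, 1, 3, 3), 1.0 / 9), out_shape=(1, 28, 28), padding=1)
out = layer(torch.randn(1, 1, 28, 28))

Because buffers are not returned by model.parameters(), passing model.parameters() to the optimizer automatically excludes the kernel.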

As for the shape issue, maybe you want to check this out.
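
For reference, F.conv2d expects the input as a 4D tensor of shape (batch, in_channels, height, width) and the kernel as (out_channels, in_channels / groups, kernel_height, kernel_width), which may be the source of the shape problems you ran into with Conv2d.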
