Pad multiple torch tensors over the last dim

I have multiple torch tensors with the following shapes:

x1 = torch.Size([1, 512, 177])
x2 = torch.Size([1, 512, 250])
x3 = torch.Size([1, 512, 313])

How can I pad all these tensors with zeros over the last dimension, so they all end up with the same shape, e.g. ([1, 512, 350])?

What I tried was converting them into NumPy arrays and using this code:

if len(x1) < 350:
    ff = np.pad(f, [(0, self.max_len - f.shape[0]), ], mode='constant')
    f = ff

But unfortunately, this doesn't affect the last dim, and the shapes still aren't equal. Any help would be appreciated. Thanks.
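
For reference, two things work against the NumPy attempt above: len(x1) on a 3-D array returns the size of the first dimension (here 1), so the condition doesn't test the last dim, and np.pad broadcasts a single (before, after) pair to every axis, so the padding is not limited to the last one. A minimal sketch of a pad_width that touches only the last axis, assuming a target length of 350 and a dummy array standing in for the real tensor:

import numpy as np

max_len = 350                    # assumed target length for the last dimension
f = np.zeros((1, 512, 177))      # dummy array with one of the shapes from the question

if f.shape[-1] < max_len:
    # one (before, after) pair per axis; only the last axis gets padding
    f = np.pad(f, [(0, 0), (0, 0), (0, max_len - f.shape[-1])], mode='constant')

print(f.shape)                   # (1, 512, 350)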

You can simply do:

import torch.nn.functional as F

x = F.pad(x, (0, self.max_len - x.size(2)), "constant", 0)
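F.pad reads the pad tuple from the last dimension backwards, so a two-element tuple (left, right) pads only the last dim. A self-contained sketch applying this to all three tensors, using dummy tensors and an assumed max_len of 350:

import torch
import torch.nn.functional as F

# Dummy tensors with the shapes from the question
x1 = torch.randn(1, 512, 177)
x2 = torch.randn(1, 512, 250)
x3 = torch.randn(1, 512, 313)

max_len = 350  # assumed target length for the last dimension

# (0, n) appends n zeros to the end of the last dimension only
padded = [F.pad(x, (0, max_len - x.size(2)), "constant", 0) for x in (x1, x2, x3)]

for p in padded:
    print(p.shape)  # torch.Size([1, 512, 350]) for each tensor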
