
Concatenating two tensors with different dimensions in Pytorch

Is it possible to concatenate two tensors with different dimensions without using a for loop?

For example, Tensor 1 has dimensions (15, 200, 2048) and Tensor 2 has dimensions (1, 200, 2048). Is it possible to concatenate the 2nd tensor with the 1st tensor along all 15 indices of the 1st dimension of the 1st tensor (i.e. broadcast the 2nd tensor along the 1st dimension of Tensor 1 while concatenating along the 3rd dimension)? The resulting tensor should have dimensions (15, 200, 4096).

Is it possible to accomplish this without a for loop?

You could do the broadcasting manually (using Tensor.expand()) before the concatenation (using torch.cat()):

import torch

a = torch.randn(15, 200, 2048)
b = torch.randn(1, 200, 2048)

# expand b's first dim to match a's; -1 keeps the remaining dims unchanged
repeat_vals = [a.shape[0] // b.shape[0]] + [-1] * (len(b.shape) - 1)
# or directly repeat_vals = (15, -1, -1) if shapes are known and fixed...
res = torch.cat((a, b.expand(*repeat_vals)), dim=-1)
print(res.shape)
# torch.Size([15, 200, 4096])
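As an aside, the same result can be obtained with Tensor.repeat(), which materializes the broadcast copies in memory, whereas expand() only returns a view of the original data. A minimal sketch comparing the two (variable names are illustrative):

```python
import torch

a = torch.randn(15, 200, 2048)
b = torch.randn(1, 200, 2048)

# expand() creates a view (no extra memory); repeat() copies the data.
res_expand = torch.cat((a, b.expand(a.shape[0], -1, -1)), dim=-1)
res_repeat = torch.cat((a, b.repeat(a.shape[0], 1, 1)), dim=-1)

print(res_expand.shape)            # torch.Size([15, 200, 4096])
print(torch.equal(res_expand, res_repeat))  # True
```

Since torch.cat() copies its inputs anyway, expand() is usually preferable here: it avoids an intermediate allocation the size of the repeated tensor.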

