
Numpy/Tensorflow: Multiplying each depth-wise vector of a 3D tensor by a 2D matrix

I have a 4x4x256 tensor and a 128x256 matrix. I need to multiply each 256-d depth-wise vector of the tensor by the matrix, such that I get a 4x4x128 tensor as a result.

Working in Numpy, it's not clear to me how to do this. In their current shapes, it doesn't look like any variant of np.dot can do this. Manipulating the shapes to take advantage of broadcasting rules doesn't seem to provide any help either. np.tensordot and np.einsum may be useful, but looking at the documentation is going right over my head.

Is there an efficient way to do this?

You can use np.einsum to do this operation. An example with concrete values:

import numpy as np

a = np.arange(4096.).reshape(4, 4, 256)    # the 4x4x256 tensor
b = np.arange(32768.).reshape(128, 256)    # the 128x256 matrix
c = np.einsum('ijk,lk->ijl', a, b)         # contract over the shared 256-d axis
print(c.shape)                             # (4, 4, 128)

Here, the subscripts argument is: ijk,lk->ijl
From your requirement, i=4, j=4, k=256, l=128.
The comma separates the subscripts of the two operands, and the subscripts state that the multiplication should be performed over the last subscript of each operand (the subscript k, which is common to both).
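
To make the contraction over k concrete, here is a minimal check (not part of the original answer) that one output element equals the dot product of the corresponding depth-wise vector of a with a row of b:

# c[i, j, l] = sum over k of a[i, j, k] * b[l, k]
print(np.allclose(c[2, 3, 5], np.dot(a[2, 3, :], b[5, :])))   # True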

The subscript after the -> states that the resulting tensor should have the shape (i,j,l). Depending on the operation you are performing, you might retain this output subscript or change it, for example to jil, to reorder the result's axes, but the rest of the subscripts remain the same.
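
For reference, a quick sketch (using the same a, b, and c as above; not from the original answer) showing that changing the output subscripts only reorders the axes, and that np.tensordot or the @ operator, which the question asked about, give the same result:

# 'jil' swaps the first two output axes relative to 'ijl'
c_jil = np.einsum('ijk,lk->jil', a, b)
print(np.allclose(c_jil, c.transpose(1, 0, 2)))                # True

# equivalent formulations: contract a's last axis with b's last axis
print(np.allclose(np.tensordot(a, b, axes=([2], [1])), c))     # True
print(np.allclose(a @ b.T, c))                                 # True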
