Optimized projection of a matrix orthogonally to a vector with NumPy
I need to make all the other columns of a matrix A orthogonal to one of its columns, j.
I use the following algorithm:
# Orthogonalize with selected column
for i in remaining_cols:
A[:,i] = A[:,i] - A[:,j] * np.dot(A[:,i], A[:,j]) / np.sum(A[:,j]**2)
The idea comes from the QR decomposition via the Gram-Schmidt process. But this code is not optimized, and it is unstable because of the Gram-Schmidt process.
Does NumPy provide any method to compute the orthogonal projection of those vectors?
I heard that Householder reflectors are used in numpy.linalg.qr. This would allow me to compute an orthogonal matrix Q so that

Q * A[:,j] = [0 ... 0 1 0 ... 0]
                      ^
                      j-th coordinate
I would only have to ignore row j and multiply back with Qᵀ.
Is there a method to obtain the Householder matrix with NumPy? I mean, without coding the algorithm by hand.
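For context, NumPy does not expose the individual reflectors (numpy.linalg.qr only returns the assembled Q and R), but a single Householder matrix takes only a few lines to build. The sketch below is my own illustration; the `householder` helper is hypothetical and not part of NumPy's API:

```python
import numpy as np

def householder(x, j=0):
    """Return (H, alpha) with H orthogonal and H @ x = alpha * e_j.

    Hypothetical helper for illustration, not a NumPy function.
    """
    # Choose alpha with sign opposite to x[j] to avoid cancellation
    alpha = -np.copysign(np.linalg.norm(x), x[j])
    v = x.astype(float).copy()
    v[j] -= alpha                                   # v = x - alpha * e_j
    H = np.eye(len(x)) - 2.0 * np.outer(v, v) / (v @ v)
    return H, alpha
```

Applying H to A maps column j onto a multiple of the j-th basis vector, which is exactly the building block a Householder-based QR uses.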
IIUC, here could be a vectorized way:
import numpy as np

np.random.seed(10)
B = np.random.rand(3,3)
col = 0
remaining_cols = [1,2]
#your method
A = B.copy()
for i in remaining_cols:
A[:,i] = A[:,i] - A[:,col] * np.dot(A[:,i], A[:,col]) / np.sum(A[:,col]**2)
print (A)
[[ 0.77132064 -0.32778252 0.18786796]
[ 0.74880388 0.16014712 -0.2079702 ]
[ 0.19806286 0.67103261 0.05464156]]
# vectorized method
A = B.copy()
A[:,remaining_cols] -= (A[:,col][:,None] * np.sum(A[:,remaining_cols]* A[:,col][:,None], axis=0)
/ np.sum(A[:,col]**2))
print (A) #same result
[[ 0.77132064 -0.32778252 0.18786796]
[ 0.74880388 0.16014712 -0.2079702 ]
[ 0.19806286 0.67103261 0.05464156]]
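As a quick sanity check (my addition, not part of the original answer), the projected columns should have a dot product of zero with column `col`:

```python
import numpy as np

np.random.seed(10)
B = np.random.rand(3, 3)
col, remaining_cols = 0, [1, 2]

A = B.copy()
# Project every remaining column orthogonally to column `col` at once
A[:, remaining_cols] -= (A[:, col][:, None]
    * np.sum(A[:, remaining_cols] * A[:, col][:, None], axis=0)
    / np.sum(A[:, col] ** 2))

print(A[:, remaining_cols].T @ A[:, col])  # both entries ~ 0
```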