

Computing rotation and translation matrix with 3D and 2D point correspondences

I have a set of 3D points and the corresponding 2D points taken from a different position.
The 2D points lie on a 360° panorama, so I can convert them to spherical coordinates -> (r, theta, phi), with no information about r.

But r is just the distance of the transformed 3D point:

[R|t] * xyz = xyz'
r = ||xyz'|| = sqrt(x'^2 + y'^2 + z'^2)

Then, with the 3D points also expressed in spherical coordinates, I can search for R and t with this equation system:

x' = sin(theta) * cos(phi) * r
y' = sin(theta) * sin(phi) * r
z' = cos(theta) * r
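
For reference, the pixel-to-angle conversion is straightforward; below is a minimal sketch assuming an equirectangular panorama of width w and height h (the same mapping that appears in the solver code further down; pixel_to_direction is just an illustrative helper name):

import numpy as np

def pixel_to_direction(u, v, w, h):
    # equirectangular mapping: u covers the full longitude, v the full latitude
    phi = (u / w) * 2 * np.pi        # azimuth / longitude
    theta = (v / h) * np.pi          # polar angle / latitude
    # unit viewing direction of the pixel; the range r stays unknown
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])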

I get good results in tests with t = [0, 0, 0.5] and no rotation. But as soon as there is a rotation, the results are bad.

Is this the correct approach for my problem?

How can I use solvePnP() without a camera matrix (it is a panorama without distortion)?

I am using opt.least_squares to calculate R and t.

I solved it with two different methods.

One is meant for small rotations and solves directly for R and t (12 parameters); the other can handle even large rotations by solving for Euler angles and t (6 parameters).

I call opt.least_squares() twice with different initial values and keep the solution with the better reprojection error (see the selection sketch after the code).

f.eul2rot is just a conversion between Euler angles and the rotation matrix.
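
The exact Euler convention does not matter much here; one possible implementation of that conversion (a sketch assuming rotations about x, y and z applied in that order, i.e. R = Rz * Ry * Rx) could look like this:

import numpy as np

def eul2rot(E):
    # E = [gamma, beta, alpha]: rotations about x, y, z (assumed convention)
    gamma, beta, alpha = E
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(gamma), -np.sin(gamma)],
                   [0, np.sin(gamma),  np.cos(gamma)]])
    Ry = np.array([[ np.cos(beta), 0, np.sin(beta)],
                   [0, 1, 0],
                   [-np.sin(beta), 0, np.cos(beta)]])
    Rz = np.array([[np.cos(alpha), -np.sin(alpha), 0],
                   [np.sin(alpha),  np.cos(alpha), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def rot2eul(R):
    # inverse of eul2rot for the same convention (away from gimbal lock)
    beta = -np.arcsin(R[2, 0])
    gamma = np.arctan2(R[2, 1], R[2, 2])
    alpha = np.arctan2(R[1, 0], R[0, 0])
    return np.array([gamma, beta, alpha])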

import numpy as np
import scipy.optimize as opt

# xyz (N x 3 laser points), uv (N x 2 panorama pixels), the panorama size w, h
# and the module f (eul2rot / rot2eul) are assumed to be defined globally.
def sphere_eq(p):
    xyz_points = xyz
    uv_points = uv
    #r11,r12,r13,r21,r22,r23,r31,r32,r33,tx,ty,tz = p
    if len(p) == 12:
        r11, r12, r13, r21, r22, r23, r31, r32, r33, tx, ty, tz = p
        R = np.array([[r11, r12, r13],
                      [r21, r22, r23],
                      [r31, r32, r33]])
    else:
        gamma, beta, alpha,tx,ty,tz = p
        E = [gamma, beta, alpha]
        R = f.eul2rot(E)
    pi = np.pi
    eq_grad = ()
    for i in range(len(xyz_points)):
        # Point with origin LASER, in Cartesian coordinates
        xyz_laser = np.array([xyz_points[i,0],xyz_points[i,1],xyz_points[i,2]])

        # Transformation - ROTATION MATRIX and Translation Vector
        t = np.array([[tx, ty, tz]])

        # Point with origin CAMERA, in spherical coordinates
        uv_camera = np.array(uv_points[i])
        long_camera = ((uv_camera[0]) / w) * 2 * pi
        lat_camera = ((uv_camera[1]) / h) * pi

        xyz_camera = (R.dot(xyz_laser) + t)[0]
        # r is the distance of the transformed point [R|t]*xyz, as described above
        r = np.linalg.norm(xyz_camera)

        x_eq = (xyz_camera[0] - (np.sin(lat_camera) * np.cos(long_camera) * r),)
        y_eq = (xyz_camera[1] - (np.sin(lat_camera) * np.sin(long_camera) * r),)
        z_eq = (xyz_camera[2] - (np.cos(lat_camera) *                       r),)
        eq_grad = eq_grad + x_eq + y_eq + z_eq

    return eq_grad

# Two initial guesses: identity rotation flattened plus zero translation (12 parameters),
# and zero Euler angles plus zero translation (6 parameters)
x = np.zeros(12)
x[0], x[4], x[8] = 1, 1, 1
initial_guess = [x, np.zeros(6)]

for p, x0 in enumerate(initial_guess):
    x = opt.least_squares(sphere_eq, x0, '3-point', method='trf')
    if len(x0) == 6:
        E = np.resize(x.x[:3], 3)
        R = f.eul2rot(E)
        t = np.resize(x.x[3:], (3, 1))
    else:
        R = np.resize(x.x[:9], (3, 3))
        E = f.rot2eul(R)
        t = np.resize(x.x[9:], (3, 1))
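
To choose between the two runs I compare their reprojection errors, as described above; a minimal sketch of that selection step (res.fun is the residual vector returned by scipy's least_squares, and the slicing matches the parameter layouts used above):

results = [opt.least_squares(sphere_eq, x0, '3-point', method='trf')
           for x0 in initial_guess]

# keep the run with the smaller residual norm, i.e. the better reprojection error
best = min(results, key=lambda res: np.linalg.norm(res.fun))
if len(best.x) == 6:
    R = f.eul2rot(best.x[:3])
    t = np.resize(best.x[3:], (3, 1))
else:
    R = np.resize(best.x[:9], (3, 3))
    t = np.resize(best.x[9:], (3, 1))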
