
Getting real depth from disparity map

I want to get the real distance of an object from a stereo camera. I am using the OpenCV example code given in the Learning OpenCV O'Reilly book. After getting the disparity map, I want to use the formula:

distance = focal_length * baseline_distance / disparity

The problems are:

  1. I am getting negative disparity values. How do I convert these values so that they can be used in the actual depth calculation?

  2. In the formula above, the focal length and baseline distance are in mm (returned by the reprojection matrix), whereas the disparity is in pixels, so the result would be in mm²/pixel. How do I convert the disparity value from pixels to mm?

You can use OpenCV's stereo correspondence functions, such as Stereo Block Matching or Semi-Global Block Matching. These will give you a disparity map for the entire image, which can be transformed to 3D points using the Q matrix (cv::reprojectImageTo3D).

There are two problems here:

  1. The units. Assuming that your cameras are calibrated, you have the focal length in pixels, so the depth computed from the disparity is in mm or any other metric unit. See my answer to the same question here.

  2. Disparity cannot be negative if the 3D point is in front of both cameras and the cameras are parallel to each other. It may happen if the optical axes converge or the cameras are uncalibrated. See this question for details.
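One common OpenCV-specific source of negative values is worth noting: StereoBM/StereoSGBM return a 16-bit fixed-point map holding disparity * 16, and pixels with no match are set to (minDisparity - 1) * 16, which is negative when minDisparity is 0. A sketch of handling this, with an assumed focal length of 700 px and baseline of 60 mm:

```python
import numpy as np

# Hypothetical raw StereoBM/StereoSGBM output: disparity * 16, with
# invalid (unmatched) pixels marked as (minDisparity - 1) * 16 = -16.
disp16 = np.array([[-16, 160, 320],
                   [480, -16, 640]], dtype=np.int16)

disparity = disp16.astype(np.float32) / 16.0  # back to pixel units
valid = disparity > 0                         # mask out invalid pixels

focal_px = 700.0    # focal length in pixels (assumed, from calibration)
baseline_mm = 60.0  # baseline in mm (assumed)

depth_mm = np.full(disparity.shape, np.nan, dtype=np.float32)
depth_mm[valid] = focal_px * baseline_mm / disparity[valid]
```

Only the masked (positive) disparities are fed into the depth formula; the invalid pixels stay NaN instead of producing nonsense negative depths.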

You need to perform a camera calibration step first, to get the proper parameters, known as the camera matrix.

Once you have these values, you can compute the disparity map from the corrected images (the ones obtained using the camera calibration results, a process known as remapping or undistortion) and then, to obtain the real depth (in mm or meters), you can finally do:

depth = baseline * focal / disparity
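A worked numeric example of this formula, with the focal length in pixels so that the pixel units cancel (all values assumed for illustration):

```python
focal_px = 700.0      # focal length from the camera matrix, in pixels
baseline_mm = 60.0    # distance between the two camera centres, in mm
disparity_px = 14.0   # disparity of a matched point, in pixels

depth_mm = baseline_mm * focal_px / disparity_px
print(depth_mm)  # 3000.0 mm, i.e. the point is 3 m away
```

Note how the mm unit of the result comes from the baseline alone: (mm * px) / px = mm, which also answers the units question above.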

The simple formula for converting disparities to distances is written for the case of parallel cameras, with the disparities expressed in metric units (mm, in your case).

In practice, using OpenCV, and assuming that your stereo rig is calibrated, you will perform a dense triangulation using a routine like reprojectImageTo3D.
