
OpenCV Stereo Matching Essential Matrix weird values

I have a stereo setup using OpenCV and two webcams. I computed the essential and fundamental matrices, the intrinsic and extrinsic parameters, etc., using the BM correspondence algorithm. Now I want to find the matching point in the right image for a pixel in the left image. To do this I defined the following function, which is incomplete, since my primary aim is to calculate real-world distance.

void StereoVision::findEpipolarLineForXY(int x, int y, int lr)
{
    if (calibrationDone)
    {
        // Use a double array: wrapping a CvPoint3D32f (32-bit floats) as
        // CV_64FC1 reinterprets the bytes and produces garbage values.
        double p1[3] = { (double)x, (double)y, 1.0 };
        qDebug("%f,_,_,%f", p1[0], p1[1]);   // the values are doubles, so use %f, not %d

        CvMat pt1 = cvMat(3, 1, CV_64FC1, p1);
        qDebug("-");
        CvMat e = _E;
        qDebug("pt1:");
        PrintMat(&pt1);
        qDebug("e:");
        PrintMat(&e);

        //CvMat* corLine;
        //CvMat* pt2 = e * pt1;

        CvMat *pt2 = cvCreateMat(e.rows, pt1.cols, CV_64FC1);
        qDebug("pt2:");
        PrintMat(pt2);
        qDebug("--%d--->%d", pt2->rows, pt2->cols);

        cvMatMul(&e, &pt1, pt2);

        // pt2->data is a union; print the data pointer with %p, not %d
        qDebug("--%d--->%p", pt2->cols, (void*)pt2->data.ptr);
        //const CvMat* f = &_F;
        qDebug("---");
        //cvComputeCorrespondEpilines(&pt1, lr, f, corLine);
        qDebug("----");
        //qDebug("%d,,,%d", corLine->height, corLine->rows);

        cvReleaseMat(&pt2);   // release the result matrix to avoid a leak
    }
}


void StereoVision::PrintMat(CvMat *A)
{
    for (int i = 0; i < A->rows; i++)
    {
        // Scope the QDebug so it flushes at the end of each row.
        // Never call the destructor explicitly (dbg.~QDebug()): the object
        // is destroyed again at end of scope, which is undefined behavior.
        QDebug dbg = qDebug();
        switch (CV_MAT_DEPTH(A->type))
        {
        case CV_32F:
        case CV_64F:
            for (int j = 0; j < A->cols; j++)
                dbg << cvGetReal2D(A, i, j);        // QDebug is a stream: it
                                                    // prints "%8.3f" literally,
                                                    // so stream values directly
            break;
        case CV_8U:
        case CV_16U:
            for (int j = 0; j < A->cols; j++)
                dbg << (int)cvGetReal2D(A, i, j);
            break;
        default:
            break;
        }
    }
    qDebug("");
}

I want to know why the essential matrix comes out so badly. All the output is below:

350, , ,317

0, , ,1081466880

-

pt1:

%8.3f 350

%8.3f 317

%8.3f 1

e:

%8.3f 0 %8.3f inf %8.3f 0

%8.3f 0 %8.3f 0 %8.3f 0

%8.3f 0 %8.3f 0 %8.3f 0

pt2:

%8.3f -inf

%8.3f -inf

%8.3f -inf

--3--->1

--1--->44201616



Also, I'd like to know if I'm on the right path to finding the 3D distance of the pixel in real-world coordinates.

You should look up stereo ranging.

If you have the disparity value, i.e. the horizontal pixel distance between the two projections of a point in the two frames, you can compute the real-world depth of that point (with respect to the camera baseline):

focal_length_pixels = focal_length_mm * sensor_pixels_per_mm;
distance_mm = baseline_mm * focal_length_pixels / disparity_pixels;

disparity_pixels - horizontal pixel distance between the two frames (for that point). E.g., if the point in the left image is (100, 150) and in the second image is (125, 160), disparity_pixels = 25.

You can get focal_length_mm from your camera specifications.

focal_length_pixels = distance_mm * disparity_pixels / baseline_mm;
sensor_pixels_per_mm = focal_length_pixels / focal_length_mm;

Keep an object at a known distance x mm from the camera baseline and measure disparity_pixels as shown above. You already know baseline_mm, so this gives you focal_length_pixels and sensor_pixels_per_mm.
