
How to find the real-life distance between a camera and a pixel point (x and y coordinates) in an image captured by that camera?

I first tried to implement the logic assuming the camera is placed on the ground when it captures the image: I calculated the pixel distance between the bottom-most pixel of the image along the y-axis and the (px, py) coordinates, then translated that distance into a real-life distance by multiplying it by the real-life distance covered by one pixel of the image. Now my task is to recalculate the distance when the camera is lifted to some height above the ground (at a 90 degree angle to the ground). How can I recalculate the distance between the new camera position (i.e., at some height) and a point in the image taken by the camera at that new position?

// required headers for sqrt/pow and cout
#include <cmath>
#include <iostream>
using namespace std;

// row of the image along the y-axis up to which the ground is visible
const int groundY = 50;
// real-life distance from the origin (where the camera is placed, i.e. the bottom-most
// point of the image along the y-axis) to the point where the ground is visible in the image
const int realLifeGroundDist = 200;
const int cameraHeight = 40;

void geoLocation(int ***a, int x, int y, int z, int px, int py){

    // real-life distance covered by one pixel (cast to double to avoid integer division)
    double onePixelDistance = static_cast<double>(realLifeGroundDist) / groundY;
    // double distBtwPixels = sqrt(pow(px - ((x/2)-1), 2) + pow(py - (y-1), 2));

    // distance formula to calculate the pixel distance between the origin and the (px, py) pixel
    double distBtwPixels = sqrt(pow(px - (x-1), 2) + pow(py - ((y/2)-1), 2));

    // translating the pixel distance into a real-life distance
    double h = distBtwPixels * onePixelDistance;

    // with the camera lifted above the ground to cameraHeight at an angle of 90 deg,
    // the distance between that point and the camera is the hypotenuse of a right triangle
    double realLifeDist = sqrt(pow(h, 2) + pow(cameraHeight, 2));

    cout << "Distance between camera and (px, py) coordinates in image = " << realLifeDist << endl;
}
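
For reference, a call to this function might look like the following (appended after the definitions above). The image dimensions and pixel coordinates are made-up values for illustration, and the image buffer and z parameter are passed only to satisfy the original signature, since geoLocation never reads them:

int main() {
    int ***frame = nullptr;   // image data, not read inside geoLocation
    int x = 640, y = 480;     // hypothetical image dimensions passed as the x and y parameters
    geoLocation(frame, x, y, 0, 320, 100);  // query the pixel at (px, py) = (320, 100)
    return 0;
}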

According to my logic, realLifeDist holds the distance between the camera at its new location (i.e., at some height above the ground, at a 90 degree angle to the ground) and a point in the image it captures. Is my logic correct? If not, how can I calculate it?

The short answer is: as long as you do not know the distance of your measured points from the camera, you can't.

Imagine the image as a plane between the focus point of the camera and the objects you want to capture. When light rays travel from an object (the red points) to the focus point, they intersect the image plane. That intersection point is the location of the object in the image.

As you can see, objects that are further away from the camera appear closer together on the image plane. In this sketch, the close points have the same pixel distance as the far-away points. So unless you know the distance between the camera and each point you want to measure, you cannot calculate the distance between those points from a single image.

[Sketch: light rays from near and far object points (red) passing through the camera's focus point and intersecting the image plane]
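
To make this concrete, here is a minimal pinhole-projection sketch (the focal length, principal point, and point coordinates are assumed values, not taken from the question). Two points at different depths that lie on the same viewing ray project to the same pixel column, which is why pixel coordinates alone cannot tell you how far a point is from the camera:

#include <iostream>

int main() {
    // assumed camera intrinsics, in pixels
    const double f  = 800.0;   // focal length
    const double cx = 320.0;   // principal point (x)

    // a near point and a far point lying on the same viewing ray (camera coordinates, metres)
    double nearX = 1.0, nearZ = 4.0;   // 1 m to the side, 4 m in front of the camera
    double farX  = 2.5, farZ  = 10.0;  // 2.5 m to the side, 10 m in front of the camera

    // pinhole projection: u = f * X / Z + cx
    double uNear = f * nearX / nearZ + cx;
    double uFar  = f * farX  / farZ  + cx;

    // both evaluate to u = 520, so the image alone cannot distinguish the two depths
    std::cout << "near point projects to u = " << uNear << "\n";
    std::cout << "far point projects to u = " << uFar << "\n";
    return 0;
}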
