
Projecting a Tango 3D point to the screen (Google Project Tango)

Project Tango provides a point cloud. How can I get the position, in pixels, of a 3D point given in meters?

I tried using the projection matrix, but the values I get are very small (0.5, 1.3, and so on) rather than pixel values like 1234324.

I have included the code I tried:

    //Get the current projection matrix (not the rotation matrix)
    Matrix4 projMatrix =  mRenderer.getCurrentCamera().getProjectionMatrix();



    //Get all the points in the pointcloud and store them as 3D points
    FloatBuffer pointsBuffer =  mPointCloudManager.updateAndGetLatestPointCloudRenderBuffer().floatBuffer;
    Vector3[] points3D = new Vector3[pointsBuffer.capacity()/3];

    int j =0;
    for (int i = 0; i <= pointsBuffer.capacity() - 3; i = i + 3) { // include the last (x, y, z) triple

        points3D[j]= new Vector3(
                pointsBuffer.get(i),
                pointsBuffer.get(i+1),
                pointsBuffer.get(i+2));
        //Log.v("Points3d", "J: "+ j + " X: " +points3D[j].x + "\tY: "+ points3D[j].y +"\tZ: "+ points3D[j].z );
        j++;
    }


    //Get the projection of the points on the screen.
    Vector3[] points2D = new Vector3[points3D.length];
    for(int i = 0; i < points3D.length; i++)
    {
        Log.v("Points", "X: " +points3D[i].x + "\tY: "+ points3D[i].y +"\tZ: "+ points3D[i].z );
        points2D[i] = points3D[i].multiply(projMatrix);
        Log.v("Points", "pX: " +points2D[i].x + "\tpY: "+ points2D[i].y +"\tpZ: "+ points2D[i].z );
    }
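A likely reason the first attempt yields small values: a rendering projection matrix maps camera-space points into clip space, and after the perspective divide the coordinates are in normalized device coordinates (NDC), spanning [-1, 1], not pixels. A minimal sketch of the NDC-to-pixel conversion, assuming a hypothetical 1920x1080 viewport (the class name and constants are illustrative, not part of the Tango sample):

```java
// Sketch: convert NDC (x, y in [-1, 1]) to pixel coordinates.
// WIDTH/HEIGHT are assumptions; use the actual surface dimensions.
public class NdcToPixel {
    static final int WIDTH = 1920, HEIGHT = 1080;

    static double[] ndcToPixel(double ndcX, double ndcY) {
        double px = (ndcX * 0.5 + 0.5) * WIDTH;
        // NDC y points up, screen y points down, so flip it.
        double py = (1.0 - (ndcY * 0.5 + 0.5)) * HEIGHT;
        return new double[] { px, py };
    }

    public static void main(String[] args) {
        double[] center = ndcToPixel(0.0, 0.0);  // screen center
        System.out.println(center[0] + " " + center[1]); // 960.0 540.0
        double[] topLeft = ndcToPixel(-1.0, 1.0); // top-left corner
        System.out.println(topLeft[0] + " " + topLeft[1]); // 0.0 0.0
    }
}
```

Note this only applies if the multiply actually performs the perspective divide; otherwise the result is still in clip space and must be divided by w first.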

The sample I am using is the Point Cloud Java example, which can be found here: https://github.com/googlesamples/tango-examples-java


Update

    TangoCameraIntrinsics ccIntrinsics = mTango.getCameraIntrinsics(TangoCameraIntrinsics.TANGO_CAMERA_COLOR);
    double fx = ccIntrinsics.fx;
    double fy = ccIntrinsics.fy;
    double cx = ccIntrinsics.cx;
    double cy = ccIntrinsics.cy;

    double[][] projMatrix = new double[][] {
            {fx, 0 , -cx},
            {0,  fy, -cy},
            {0,  0,    1}
    };

然后計算投影點

    for(int i = 0; i < points3D.length; i++)
    {

        double[][] point = new double[][] {
                {points3D[i].x},
                {points3D[i].y},
                {points3D[i].z}
        };

        double [][] point2d = CustomMatrix.multiplyByMatrix(projMatrix, point);

        points2D[i] = new Vector2(0,0);
        if(point2d[2][0]!=0)
        {
            Log.v("temp point", "pX: " +point2d[0][0]/point2d[2][0]+" pY: " +point2d[1][0]/point2d[2][0] );
            points2D[i] = new Vector2(point2d[0][0]/point2d[2][0],point2d[1][0]/point2d[2][0]);
        }

    }
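`CustomMatrix.multiplyByMatrix` above appears to be the asker's own helper; a plausible stand-alone implementation of the 3x3-by-3x1 multiplication it performs (the class name and the sample intrinsics in `main` are illustrative):

```java
// Sketch: generic matrix multiplication covering the K * point case above.
public class MatMul {
    static double[][] multiplyByMatrix(double[][] a, double[][] b) {
        int n = a.length, m = b[0].length, k = b.length;
        double[][] out = new double[n][m];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < m; j++)
                for (int p = 0; p < k; p++)
                    out[i][j] += a[i][p] * b[p][j];
        return out;
    }

    public static void main(String[] args) {
        // Made-up intrinsics and a camera-frame point at 2 m depth.
        double[][] K = { {500, 0, 320}, {0, 500, 240}, {0, 0, 1} };
        double[][] p = { {1.0}, {0.5}, {2.0} };
        double[][] r = multiplyByMatrix(K, p);
        // Homogeneous result, before dividing by r[2][0]:
        System.out.println(r[0][0] + " " + r[1][0] + " " + r[2][0]);
    }
}
```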

But I think the result is still not what is expected; for example, I get results like:

pX: -175.58042313027244 pY: -92.573740812066

which does not look right to me.


Update: Using the color camera intrinsics as suggested gives better results, but the points are still negative: pX: -1127.8086915171814 pY: -652.5887102192332

Is it OK to just multiply them by -1?

You have to multiply the 3D point by the intrinsic matrix of the RGB camera to get pixel coordinates. The 3D points are in the depth camera's frame. You can get the pixel coordinates with the standard pinhole model:

    x = (X / Z) * fx + cx
    y = (Y / Z) * fy + cy

x and y are the pixel coordinates. Construct K from the intrinsics parameters.
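A self-contained sketch of that projection in Java, with made-up intrinsics standing in for the `TangoCameraIntrinsics` values (FX, FY, CX, CY are assumptions, not real device parameters):

```java
// Sketch of the pinhole projection described in the answer.
public class PinholeProjection {
    // Hypothetical intrinsics for a 1920x1080 color camera.
    static final double FX = 1042.0, FY = 1042.0, CX = 960.0, CY = 540.0;

    // Project a camera-frame 3D point (meters) to pixel coordinates:
    // u = fx * X / Z + cx,  v = fy * Y / Z + cy
    static double[] project(double x, double y, double z) {
        return new double[] { FX * x / z + CX, FY * y / z + CY };
    }

    public static void main(String[] args) {
        // A point on the optical axis lands at the principal point.
        double[] center = project(0.0, 0.0, 2.0);
        System.out.println(center[0] + " " + center[1]); // 960.0 540.0
        // A point 1 m to the right of the axis at 2 m depth.
        double[] right = project(1.0, 0.0, 2.0);
        System.out.println(right[0] + " " + right[1]); // 1481.0 540.0
    }
}
```

Note that cx and cy enter with a positive sign; the negated -cx, -cy in the question's K is what drives the projected coordinates negative.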
