
How to set a point on an object via point-cloud/depth perception?

Problem: when the button is clicked, my app fails to set a point on/near a point of the latest point-cloud data (on the object).

Click this link to see the Android app I have developed so far; I built my app from the Tango point-to-point measurement sample app.

What this app does...

  • Displays camera data through a SurfaceView.
  • The red button at the top right closes the app.
  • Pressing the green button at the middle right should set a point-cloud point on the object.
  • The drop-down box at the top left lets you change the unit of length.
  • The target in the middle is where the point should be set after the green button is pressed.

As a hint, my error originates in the getDepthAtCenter method (at the bottom of the second method below), because "No Depth Point." is posted to Android Studio's LogCat while the app is running.

This is the setPoint method (invoked when the green button is clicked):

public void setPoint(View button) {
        MotionEvent buttonClick;
        final Button msrTarget = (Button) findViewById(R.id.target);

        button.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                //(x,y,...) coordinates for the middle of the target (uncertain if this is right)
                float x = msrTarget.getX() / msrTarget.getWidth();
                float y = msrTarget.getY() / msrTarget.getHeight();

                // z, you need to deduce how to calculate the distance from the tablet to the object under measurement
                // First,
                try {
                    // Place point near the clicked point using the latest point cloud data.
                    // Synchronize against concurrent access to the RGB timestamp in the OpenGL thread
                    // and a possible service disconnection due to an onPause event.
                    MeasuredPoint newMeasuredPoint;
                    synchronized (this) {
                        //z depth
                        newMeasuredPoint = getDepthAtCenter(x, y);

                    }
                    if (newMeasuredPoint != null) {
                        // Update a line endpoint to the new location.
                        // This update is made thread-safe by the renderer.
                        updateLine(newMeasuredPoint);
                        Log.w(TAG, "Point was Updated.");
                    } else {
                        Log.w(TAG, "Point was Null.");
                    }

                } catch (TangoException t) {
                    Toast.makeText(getApplicationContext(),
                            R.string.failed_measurement,
                            Toast.LENGTH_SHORT).show();
                    Log.e(TAG, getString(R.string.failed_measurement), t);
                } catch (SecurityException t) {
                    Toast.makeText(getApplicationContext(),
                            R.string.failed_permissions,
                            Toast.LENGTH_SHORT).show();
                    Log.e(TAG, getString(R.string.failed_permissions), t);
                }
            }
        });
    }
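The asker's own comment flags the (x, y) computation above as uncertain. The depth-interpolation calls expect normalized (u, v) coordinates in [0, 1] over the camera image, so dividing the target view's position by its own width/height is unlikely to land on the target's center. A minimal sketch of the normalization, assuming hypothetical `surfaceWidth`/`surfaceHeight` values for the camera preview's pixel size (these names are not from the original code):

```java
// Sketch: normalize the center of the target view to (u, v) in [0, 1].
// surfaceWidth/surfaceHeight are assumed to be the camera preview's size
// in pixels; targetX/targetY are the view's top-left corner.
public class UvNormalization {
    public static float[] centerToUv(float targetX, float targetY,
                                     float targetWidth, float targetHeight,
                                     float surfaceWidth, float surfaceHeight) {
        // Use the view's center, not its top-left corner.
        float u = (targetX + targetWidth / 2f) / surfaceWidth;
        float v = (targetY + targetHeight / 2f) / surfaceHeight;
        return new float[]{u, v};
    }
}
```

With a 120×120 target at (540, 900) on a 1200×1920 preview, this yields (0.5, 0.5), i.e. the screen center.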

This is the getDepthAtCenter method:

// Using the Tango Support Library with point-cloud data to calculate the depth
    // of the point. It returns Vector3 in OpenGL world space.
    private MeasuredPoint getDepthAtCenter(float x, float y) {
        TangoPointCloudData pointCloud = pointCloudManager.getLatestPointCloud();

        // There is no point cloud
        if (pointCloud == null) {
            Log.w(TAG, "No Point Cloud.");
            return null;
        }

        double rgbTimestamp;
        TangoImageBuffer imageBuffer = mCurrentImageBuffer;
        if (bilateralCheckbox.isChecked()) {
            rgbTimestamp = imageBuffer.timestamp; // CPU.
        } else {
            rgbTimestamp = mRgbTimestampGlThread; // GPU.
        }

        TangoPoseData depthlTcolorPose = TangoSupport.getPoseAtTime(
                rgbTimestamp,
                TangoPoseData.COORDINATE_FRAME_CAMERA_DEPTH,
                TangoPoseData.COORDINATE_FRAME_CAMERA_COLOR,
                TangoSupport.ENGINE_TANGO,
                TangoSupport.ENGINE_TANGO,
                TangoSupport.ROTATION_IGNORED);
        if (depthlTcolorPose.statusCode != TangoPoseData.POSE_VALID) {
            Log.w(TAG, "Could not get color camera transform at time " + rgbTimestamp);
            return null;
        }

        float[] depthPoint;
        //if the bilateral checkbox is checked, get the depth at point bilateral
        if (bilateralCheckbox.isChecked()) {
            depthPoint = TangoDepthInterpolation.getDepthAtPointBilateral(
                    pointCloud,
                    new double[]{0.0, 0.0, 0.0},
                    new double[]{0.0, 0.0, 0.0, 1.0},
                    imageBuffer,
                    x, y,
                    displayRotation,
                    depthlTcolorPose.translation,
                    depthlTcolorPose.rotation);
        } else {
            //Otherwise, get the nearest neighbour as the point-cloud point
            depthPoint = TangoDepthInterpolation.getDepthAtPointNearestNeighbor(
                    pointCloud,
                    new double[]{0.0, 0.0, 0.0},
                    new double[]{0.0, 0.0, 0.0, 1.0},
                    x, y,
                    displayRotation,
                    depthlTcolorPose.translation,
                    depthlTcolorPose.rotation);
        }

        // There is no depth point
        if (depthPoint == null) {
            Log.w(TAG, "No Depth Point.");
            return null;
        }

        return new MeasuredPoint(rgbTimestamp, depthPoint);
    }
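Once getDepthAtCenter succeeds, depthPoint is a float[3] position of the measured point, so the "deduce the distance from the tablet to the object" comment in setPoint reduces to a Euclidean norm. A hedged sketch, assuming the point is expressed in meters in the camera frame and with hypothetical conversion factors for the unit drop-down:

```java
// Sketch: distance from the camera to the measured point, given the
// float[3] returned by TangoDepthInterpolation (assumed to be in meters).
public class DepthDistance {
    public static double distanceMeters(float[] p) {
        // Straight-line distance from the camera origin to the point.
        return Math.sqrt(p[0] * p[0] + p[1] * p[1] + p[2] * p[2]);
    }

    // Hypothetical conversion for the length-unit spinner ("m", "cm", "in").
    public static double convert(double meters, String unit) {
        switch (unit) {
            case "cm": return meters * 100.0;
            case "in": return meters / 0.0254;
            default:   return meters;
        }
    }
}
```

For example, a point at (0, 3, 4) meters is 5 meters away, or 500 cm via the converter.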

This is the TangoDepthInterpolation class...

package com.google.tango.depthinterpolation;

import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoInvalidException;
import com.google.atap.tangoservice.TangoPointCloudData;
import com.google.atap.tangoservice.TangoPoseData;
import com.google.atap.tangoservice.experimental.TangoImageBuffer;
import com.google.tango.depthinterpolation.TangoDepthInterpolationJNIInterface.ByteDepthBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class TangoDepthInterpolation {
    private static final String TAG = TangoDepthInterpolation.class.getSimpleName();

    public TangoDepthInterpolation() {
    }

    public static float[] getDepthAtPointNearestNeighbor(
            TangoPointCloudData pointCloud,
            double[] pointCloudTranslation, double[] pointCloudOrientation,
            float u, float v, int displayRotation,
            double[] colorCameraTranslation, double[] colorCameraOrientation)
            throws TangoInvalidException {
        float[] colorCameraPoint = new float[3];
        int result = TangoDepthInterpolationJNIInterface.getDepthAtPointNearestNeighbor(
                pointCloud, pointCloudTranslation, pointCloudOrientation,
                u, v, displayRotation,
                colorCameraTranslation, colorCameraOrientation, colorCameraPoint);
        if(result == -1) {
            return null;
        } else {
            if(result != 0) {
                Tango.throwTangoExceptionIfNeeded(result);
            }

            return colorCameraPoint;
        }
    }

    public static float[] getDepthAtPointBilateral(
            TangoPointCloudData pointCloud,
            double[] pointCloudTranslation, double[] pointCloudOrientation,
            TangoImageBuffer imageBuffer,
            float u, float v, int displayRotation,
            double[] colorCameraTranslation, double[] colorCameraOrientation)
            throws TangoInvalidException {
        float[] colorCameraPoint = new float[3];
        int result = TangoDepthInterpolationJNIInterface.getDepthAtPointBilateral(
                pointCloud, pointCloudTranslation, pointCloudOrientation,
                imageBuffer, u, v, displayRotation,
                colorCameraTranslation, colorCameraOrientation, colorCameraPoint);
        if(result == -1) {
            return null;
        } else {
            if(result != 0) {
                Tango.throwTangoExceptionIfNeeded(result);
            }

            return colorCameraPoint;
        }
    }
}

I solved my own problem by adding this depth-perception configuration in my main class:

try {
    mConfig = new TangoConfig();
    mConfig = mTango.getConfig(TangoConfig.CONFIG_TYPE_CURRENT);
    // Enables depth sensing; without this, the service delivers no
    // point clouds and getLatestPointCloud()/depth lookups return null.
    mConfig.putBoolean(TangoConfig.KEY_BOOLEAN_DEPTH, true);
} catch (TangoErrorException e) {
    // handle exception
}
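For the depth config to reach getDepthAtCenter, the point-cloud callback also has to feed the TangoPointCloudManager that getLatestPointCloud() reads from. A device-bound sketch of that wiring, modeled on the Tango measurement samples (cannot run off-device; the pointCloudManager field name is assumed to match the code above):

```java
// Sketch: forward point clouds from the Tango service into the manager
// that getDepthAtCenter() later queries.
ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
mTango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
    @Override
    public void onPointCloudAvailable(TangoPointCloudData pointCloud) {
        // Without this call, getLatestPointCloud() stays null even
        // when KEY_BOOLEAN_DEPTH is enabled.
        pointCloudManager.updatePointCloud(pointCloud);
    }

    @Override public void onPoseAvailable(TangoPoseData pose) {}
    @Override public void onXyzIjAvailable(TangoXyzIjData xyzIj) {}
    @Override public void onFrameAvailable(int cameraId) {}
    @Override public void onTangoEvent(TangoEvent event) {}
});
```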
