
How to use multi-threading on HoloLens

We're trying to use multi-threading on the HoloLens, but we're failing because we do not know how to split the work and what to run on the additional threads.

Currently our app runs so many operations on the main thread that when we start our "live stream" (working with WebCamTexture), the holograms no longer appear.

So to start with, we want to ask how we could use threads to run parts of our photo-capture code (seen below) more efficiently, so we can understand which parts to move onto different threads.

We're using Unity 2018.4.10f1

using UnityEngine;
using System.Linq;
using UnityEngine.XR.WSA.WebCam;

public partial class PhotocaptureFrame : MonoBehaviour
{

    public PhotoCapture photoCaptureObject = null;
    public GameObject quad;

    public static PhotocaptureFrame Instance { get; set; }

    private Texture2D imageTexture;
    private CameraParameters c;
    private Resolution cameraResolution;

    private void Start()
    {
        cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();

        PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject)
        {
            photoCaptureObject = captureObject;

            c.hologramOpacity = 0.0f;
            c.cameraResolutionWidth = cameraResolution.width;
            c.cameraResolutionHeight = cameraResolution.height;
            c.pixelFormat = CapturePixelFormat.BGRA32;

            captureObject.StartPhotoModeAsync(c, delegate (PhotoCapture.PhotoCaptureResult result)
            {
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            });

        });

        Instance = this;
    }

    public void MakePhoto()
    {
        PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject)
        {
            photoCaptureObject = captureObject;

            captureObject.StartPhotoModeAsync(c, delegate (PhotoCapture.PhotoCaptureResult result)
            {
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            });
        });
    }

    public void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
    {
        // Create a Texture2D at the capture resolution cached in Start()
        Texture2D targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

        photoCaptureFrame.UploadImageDataToTexture(targetTexture);  // Copy the raw image data into the target texture

        imageTexture = targetTexture;   // Keep a reference so the texture is not lost

        quad.GetComponent<Renderer>().material.mainTexture = imageTexture;  // Apply the texture to the quad's material

        // Clean up
        photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoModeEnd);
    }

    public void OnStoppedPhotoModeEnd(PhotoCapture.PhotoCaptureResult result)
    {
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
        Debug.Log("Photo object disposed.");

    }
}

What we expect as an answer is to know which chunks we can run on different threads, and how to start threads on the HoloLens.

Thanks a lot; any help is appreciated.

Although this does not directly answer your question, I can really recommend this plugin: HololensCameraStream, which internally uses the MediaCapture class. Frames are acquired asynchronously, meaning the video capture itself essentially runs on its own thread. Additionally, performance is better (you can get more frames) compared to your approach with PhotoCapture. You will receive each frame in a callback function.

As this callback does not run on the Unity main thread, you will need to use some kind of job system to call your Unity functions from it. I use this handy script: https://github.com/PimDeWitte/UnityMainThreadDispatcher
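The pattern looks roughly like the sketch below: heavy per-frame work stays on a background thread, and only the final texture update is marshalled back to the main thread. This is a hypothetical illustration, not the plugin's actual sample code; `OnFrameArrived`, `ProcessPixels`, and the frame size are placeholders, and it assumes the `UnityMainThreadDispatcher.Instance().Enqueue(Action)` overload from the linked repository.

```csharp
using System.Threading.Tasks;
using UnityEngine;

public class FrameProcessor : MonoBehaviour
{
    public GameObject quad;
    private Texture2D targetTexture;

    void Start()
    {
        // Frame size is illustrative; match it to your capture resolution.
        targetTexture = new Texture2D(1280, 720, TextureFormat.BGRA32, false);
    }

    // Hypothetical callback, invoked by the capture library on a background thread.
    public void OnFrameArrived(byte[] frameData)
    {
        Task.Run(() =>
        {
            // CPU-heavy work stays off the main thread.
            byte[] processed = ProcessPixels(frameData);

            // Texture APIs are main-thread only, so enqueue just the upload.
            UnityMainThreadDispatcher.Instance().Enqueue(() =>
            {
                targetTexture.LoadRawTextureData(processed);
                targetTexture.Apply();
                quad.GetComponent<Renderer>().material.mainTexture = targetTexture;
            });
        });
    }

    private byte[] ProcessPixels(byte[] data)
    {
        // Placeholder for whatever per-frame processing you need.
        return data;
    }
}
```

The key design point is that the dispatcher only receives the cheap final step (uploading bytes to the texture), so the main thread is never blocked by the processing itself.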

If this helps and you have further questions, I can add more details on how to set this up.

I recommend that you use the MediaCapture class asynchronously for video-stream capture. As a working example, the HoloLensCamera class in Microsoft SpectatorView uses the MediaCapture class to access the video stream from the HoloLens camera. On line 861 it declares an instance of the MediaCapture class and asynchronously obtains video frames from the camera in the code that follows.
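As a rough sketch of what that asynchronous pattern looks like (this is not SpectatorView's actual code, just the standard UWP `MediaFrameReader` approach, and it requires building for UWP with the webcam capability enabled):

```csharp
using System.Linq;
using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.Capture.Frames;

public class CameraFrameReader
{
    private MediaCapture mediaCapture;
    private MediaFrameReader frameReader;

    public async Task StartAsync()
    {
        mediaCapture = new MediaCapture();
        await mediaCapture.InitializeAsync(new MediaCaptureInitializationSettings
        {
            StreamingCaptureMode = StreamingCaptureMode.Video,
            MemoryPreference = MediaCaptureMemoryPreference.Cpu
        });

        // Pick the first color (RGB) frame source the device exposes.
        MediaFrameSource source = mediaCapture.FrameSources.Values
            .First(s => s.Info.SourceKind == MediaFrameSourceKind.Color);

        frameReader = await mediaCapture.CreateFrameReaderAsync(source);
        frameReader.FrameArrived += OnFrameArrived; // fires on a background thread
        await frameReader.StartAsync();
    }

    private void OnFrameArrived(MediaFrameReader sender, MediaFrameArrivedEventArgs args)
    {
        using (MediaFrameReference frame = sender.TryAcquireLatestFrame())
        {
            if (frame == null) return;
            // frame.VideoMediaFrame holds the pixel data; hand it off to the
            // Unity main thread before touching any Texture2D.
        }
    }
}
```

Because `FrameArrived` is raised off the main thread, capture and decoding never block rendering; only the hand-off to Unity has to be synchronized.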

        Texture2D targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

        ...

        quad.GetComponent<Renderer>().material.mainTexture = imageTexture; 

You can only create and assign textures on the main thread in Unity, so these calls cannot simply be moved to a background thread. The correct solution is to investigate WebCamTexture. You probably also ran into a permissions issue.
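For reference, a minimal WebCamTexture setup needs no manual threading at all, since Unity streams the camera into the texture internally (this assumes the webcam capability is granted in the app manifest; `quad` is the same display quad as in the question):

```csharp
using UnityEngine;

public class WebcamPreview : MonoBehaviour
{
    public GameObject quad;
    private WebCamTexture webcamTexture;

    void Start()
    {
        // WebCamTexture updates itself each frame on the main thread;
        // Unity does the camera work internally, so no extra threads are needed.
        webcamTexture = new WebCamTexture();
        quad.GetComponent<Renderer>().material.mainTexture = webcamTexture;
        webcamTexture.Play();
    }
}
```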
