
ARCore – How to place a Real World object in front of a 3d model?

I have created an Android AR app using the ARCore Sceneform plugin. I am able to place 3D models on the highlighted plane. But even if I place a real object in front of the camera (or just wave my hands in front of it), it is always rendered behind the 3D model.

How can I place real-world objects in front of the 3D objects in Android?

Can we use any other plugin to achieve this?

ARCore 1.18

ARCore 1.18 now supports occlusion via the brand-new Depth API. Check the list of Depth API supported devices to find out whether your Android smartphone supports occlusion capabilities or not.
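As a minimal Kotlin sketch, this is roughly how you could check for Depth API support and turn it on for an ARCore session; the helper name enableDepthIfSupported is illustrative only, not part of the ARCore API:

import com.google.ar.core.Config
import com.google.ar.core.Session

// Hypothetical helper: enables the Depth API (ARCore 1.18+) when the device
// supports it, otherwise leaves depth disabled.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        // AUTOMATIC uses depth-from-motion, plus a ToF sensor when one is present.
        config.depthMode = Config.DepthMode.AUTOMATIC
    } else {
        config.depthMode = Config.DepthMode.DISABLED
    }
    session.configure(config)
}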


The Google documentation says the following about the Depth API:

The Depth API uses a depth-from-motion algorithm to create depth maps, which you can obtain using the acquireDepthImage() method. This algorithm takes multiple device images from different angles and compares them to estimate the distance to every pixel as a user moves their phone. If the device has an active depth sensor, such as a time-of-flight (ToF) sensor, that data is automatically included in the processed depth. This enhances the existing depth map and enables depth even when the camera is not moving. It also provides better depth on surfaces with few or no features, such as white walls, or in dynamic scenes with moving people or objects.
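As a hedged sketch following the pattern in Google's Depth API developer guide, here is how you could read one depth value in millimetres from the image returned by acquireDepthImage(); the helper name depthAtCenterMillimetres is illustrative only:

import android.media.Image
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Hypothetical helper: returns the depth (in millimetres) at the centre of the
// latest depth image, or null if depth data is not available yet.
fun depthAtCenterMillimetres(frame: Frame): Int? {
    val depthImage: Image = try {
        frame.acquireDepthImage()
    } catch (e: NotYetAvailableException) {
        return null  // depth maps need a few frames of motion before they exist
    }
    try {
        // The depth image has a single plane storing each pixel as a 16-bit
        // unsigned integer distance in millimetres.
        val plane = depthImage.planes[0]
        val x = depthImage.width / 2
        val y = depthImage.height / 2
        val byteIndex = x * plane.pixelStride + y * plane.rowStride
        val buffer = plane.buffer.order(ByteOrder.nativeOrder())
        return buffer.getShort(byteIndex).toInt() and 0xFFFF
    } finally {
        depthImage.close()
    }
}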


ARCore 1.17

But you can't do any object occlusion operations in the previous ARCore 1.17. It doesn't support depth-channel compositing or depth-channel operations, apart from the camera defocus feature. So all virtual models in your AR app are simply placed over the real-world video using the Over compositing operation with the following formula:

Argb * Aa + Brgb * (1.0 - Aa)

//  where Argb is the RGB of the foreground image
//  Aa is the alpha channel of the foreground image
//  Brgb is the RGB of the background video
//  and 1.0 is a normalised value (in the range 0.0...1.0)

//  (Argb * Aa) is a premultiplied RGBA image
//  (1.0 - Aa) is the inverse of the foreground alpha
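To make the formula concrete, here is a tiny illustrative Kotlin function (the Rgb class and the over() name exist only for this example) that blends one foreground pixel of a virtual model over one background pixel of the camera video:

// Normalised RGB triple in the range 0.0..1.0 (illustrative type, not an ARCore one).
data class Rgb(val r: Float, val g: Float, val b: Float)

// "Over" compositing: Argb * Aa + Brgb * (1.0 - Aa), applied per channel.
fun over(fg: Rgb, fgAlpha: Float, bg: Rgb) = Rgb(
    r = fg.r * fgAlpha + bg.r * (1.0f - fgAlpha),
    g = fg.g * fgAlpha + bg.g * (1.0f - fgAlpha),
    b = fg.b * fgAlpha + bg.b * (1.0f - fgAlpha)
)

// Example: a half-transparent red model pixel over a grey video pixel
// gives Rgb(0.75, 0.25, 0.25) -- the virtual model always ends up in front.
// over(Rgb(1f, 0f, 0f), 0.5f, Rgb(0.5f, 0.5f, 0.5f))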

However, you can get an occlusion management feature (the Virtual Buttons sample) if you use PTC Vuforia 9.0 in Unity for Android devices, or if you use ARKit 3.5 or ARKit 4.0 for iOS devices. ARKit 4.0 gives you the widest set of tools for working with the depth channel, including its new Depth API. There are People Occlusion, Face Occlusion and Object Occlusion features.

In this post you can read about the main principles of occlusion in ARKit.
