Raycasting selection works fine in my project for static meshes. For animated meshes, however, the ray doesn't see the mesh's movement and only responds to its original, non-animated position.
With an animated model, the raycast only ever hits the pose from the first frame (I draw a small red dot where the ray detects the model). The bone matrices are computed on the CPU, but the new vertex positions are computed on the GPU, so the CPU only has access to the original pose. That is why raycasting does not work properly for skinned meshes.
My idea is to update the mesh's vertex positions on the CPU when updating the animation, or to use GPU calculations to get the hit location, but I don't know how to do either. I'm looking forward to your suggestions. Thank you.
Currently, raycasting in three.js supports morph targets (for THREE.Geometry only) by replicating the vertex shader computations on the CPU. So yes, in theory, you could add the same functionality to support raycasting against skinned meshes, for both THREE.Geometry and THREE.BufferGeometry. However, a more efficient approach would be to use "GPU picking".
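For context, "replicating the vertex shader computations on the CPU" means applying the same linear blend skinning the GPU performs: each vertex is transformed by a weighted blend of up to four bone matrices before intersection testing. Here is a minimal sketch in plain JavaScript with hand-rolled 4x4 math rather than three.js classes; the bone matrices and weights are illustrative stand-ins (in three.js the per-bone matrix would combine `bone.matrixWorld`, the skeleton's `boneInverses`, and the mesh's bind matrix):

```javascript
// Linear blend skinning for a single vertex, mirroring the GPU skinning math.
// Matrices are column-major 16-element arrays, as in three.js.

// Apply a 4x4 matrix to a point [x, y, z] (implicit w = 1).
function applyMatrix4(m, v) {
  const [x, y, z] = v;
  return [
    m[0] * x + m[4] * y + m[8]  * z + m[12],
    m[1] * x + m[5] * y + m[9]  * z + m[13],
    m[2] * x + m[6] * y + m[10] * z + m[14],
  ];
}

// Blend the bone-transformed positions by the vertex's skin weights,
// using its four (skinIndex, skinWeight) influences.
function skinVertex(position, skinIndices, skinWeights, boneMatrices) {
  const out = [0, 0, 0];
  for (let i = 0; i < 4; i++) {
    const w = skinWeights[i];
    if (w === 0) continue;
    const p = applyMatrix4(boneMatrices[skinIndices[i]], position);
    out[0] += p[0] * w;
    out[1] += p[1] * w;
    out[2] += p[2] * w;
  }
  return out;
}

// Illustrative data: two bones, one translating +2 in x, one identity.
const identity    = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];
const translateX2 = [1,0,0,0, 0,1,0,0, 0,0,1,0, 2,0,0,1];
const skinned = skinVertex([1, 0, 0], [0, 1, 0, 0], [0.5, 0.5, 0, 0],
                           [translateX2, identity]);
// skinned is [2, 0, 0]: halfway between the two bone transforms.
```

Running this per vertex before each raycast is exactly why the CPU approach is expensive; every animated vertex must be re-skinned whenever the pose changes.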
You can find an example of GPU picking in this three.js example. In that example, the objects are not animated, but the concept is the same.
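The core idea of GPU picking is to render each object with a unique flat color into an off-screen render target, read back the pixel under the cursor, and decode it to an object ID. Because the picking pass runs the same skinning shaders as the visible render, it picks the animated pose for free. The render-target setup is in the linked example; the ID-to-color round trip itself is just byte packing, sketched here (function names are my own):

```javascript
// Pack an object ID into an RGB triple (24 bits: ~16.7M distinct objects).
function idToColor(id) {
  return [(id >> 16) & 0xff, (id >> 8) & 0xff, id & 0xff];
}

// Decode the RGB bytes read back from the picking render target.
function colorToId(r, g, b) {
  return (r << 16) | (g << 8) | b;
}
```

In three.js, the readback side would use `renderer.readRenderTargetPixels(...)` to obtain a `Uint8Array` of `[r, g, b, a]` for the pixel under the mouse, then `colorToId(pixel[0], pixel[1], pixel[2])` to recover the object's ID.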
three.js r.98