Raycasting - Convert touch on screen to point on sphere in OpenGL ES (Rajawali3D)
I have a sphere created using the Rajawali3D OpenGL ES library, with the camera placed inside the sphere at (0, 0, 0). The user can rotate this sphere on swipe.
I want to get the 3D co-ordinates of the spot the user touches on the sphere.
Currently I am using the Unproject method to get points on the near and far planes, calculate the ray direction, and find the intersection point with the sphere. Here is the code:
mNearPos4 = new double[4];
mFarPos4 = new double[4];
mNearPos = new Vector3();
mFarPos = new Vector3();
mNewPos = new Vector3();

// near plane
GLU.gluUnProject(x, getViewportHeight() - y, 0,
        mViewMatrix.getDoubleValues(), 0,
        mProjectionMatrix.getDoubleValues(), 0, mViewport, 0,
        mNearPos4, 0);
// far plane
GLU.gluUnProject(x, getViewportHeight() - y, 1.0f,
        mViewMatrix.getDoubleValues(), 0,
        mProjectionMatrix.getDoubleValues(), 0, mViewport, 0,
        mFarPos4, 0);

// convert homogeneous 4D to 3D by dividing by w
mNearPos.setAll(mNearPos4[0] / mNearPos4[3],
        mNearPos4[1] / mNearPos4[3], mNearPos4[2] / mNearPos4[3]);
mFarPos.setAll(mFarPos4[0] / mFarPos4[3],
        mFarPos4[1] / mFarPos4[3], mFarPos4[2] / mFarPos4[3]);

Vector3 dir = new Vector3(mFarPos.x - mNearPos.x,
        mFarPos.y - mNearPos.y, mFarPos.z - mNearPos.z);
dir.normalize();

// compute the intersection with the sphere centered at (0, 0, 0)
double a = dir.x * dir.x + dir.y * dir.y + dir.z * dir.z;
double b = 2 * (dir.x * mNearPos.x + dir.y * mNearPos.y + dir.z * mNearPos.z);
double c = mNearPos.x * mNearPos.x + mNearPos.y * mNearPos.y
        + mNearPos.z * mNearPos.z - radSquare;
double D = b * b - 4 * a * c;
// the ray origin is inside the sphere, so take the root in front of the camera
double t = (-b + Math.sqrt(D)) / (2 * a);

// mNewPos is used as the position of the point
mNewPos.setAll(mNearPos.x + dir.x * t,
        mNearPos.y + dir.y * t, mNearPos.z + dir.z * t);
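For reference, the quadratic step above can be sketched as a self-contained helper that does not depend on Rajawali types (the class and method names here are illustrative, not part of any library). Since the camera sits at (0, 0, 0) inside the sphere, the ray origin is inside it, the discriminant is positive, and the hit in front of the camera is the larger root:

```java
// Standalone sketch of the ray-sphere intersection used above (plain Java).
public class RaySphere {
    /**
     * Intersects a ray (origin o, normalized direction d) with a sphere
     * centered at the world origin with radius r. Assumes o is inside the
     * sphere, so the forward hit is the larger quadratic root.
     * Returns the 3D intersection point {x, y, z}.
     */
    public static double[] raySphereExit(double[] o, double[] d, double r) {
        double a = d[0] * d[0] + d[1] * d[1] + d[2] * d[2]; // 1 if d is normalized
        double b = 2 * (d[0] * o[0] + d[1] * o[1] + d[2] * o[2]);
        double c = o[0] * o[0] + o[1] * o[1] + o[2] * o[2] - r * r;
        double disc = b * b - 4 * a * c;                    // > 0 when o is inside
        double t = (-b + Math.sqrt(disc)) / (2 * a);        // exit point ahead of o
        return new double[]{o[0] + t * d[0], o[1] + t * d[1], o[2] + t * d[2]};
    }

    public static void main(String[] args) {
        // Camera at the center, looking down -Z at a unit sphere:
        double[] hit = raySphereExit(new double[]{0, 0, 0},
                new double[]{0, 0, -1}, 1.0);
        System.out.printf("%.3f %.3f %.3f%n", hit[0], hit[1], hit[2]);
        // prints: 0.000 0.000 -1.000
    }
}
```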
The problem is that I am getting the same range of co-ordinates when I rotate the sphere. For example, if I get the co-ordinates (a, b, c) on one side of the sphere, I get the same on the opposite side. How do I solve this problem and get the correct co-ordinates for all sides?
I am using the Rajawali 1.0.232 snapshot.
SOLVED: The problem was that I was saving the camera's projection and view matrices in variables. So when unproject() was called to convert the 2D point to 3D, it used the stale values, and the point was not plotted correctly. The solution is to fetch the camera's view and projection matrices on demand instead of caching them:
mViewport = new int[]{0, 0, getViewportWidth(), getViewportHeight()};
Vector3 position3D = new Vector3();
mapToSphere(event.getX(), event.getY(), position3D, mViewport,
        mCam.getViewMatrix(), mCam.getProjectionMatrix());
where the mapToSphere() function performs the unprojection as follows:
public static void mapToSphere(float x, float y, Vector3 position, int[] viewport,
        Matrix4 viewMatrix, Matrix4 projectionMatrix) {
    // for an explanation of unprojection in OpenGL, see
    // http://myweb.lmu.edu/dondi/share/cg/unproject-explained.pdf
    double[] tempPosition = new double[4];
    GLU.gluUnProject(x, viewport[3] - y, 0.7f,
            viewMatrix.getDoubleValues(), 0,
            projectionMatrix.getDoubleValues(), 0, viewport, 0,
            tempPosition, 0);
    // the co-ordinates are stored in tempPosition as homogeneous 4D (x, y, z, w);
    // convert to 3D by dividing x, y, z by w
    // negating the z co-ordinate worked for me
    position.setAll(tempPosition[0] / tempPosition[3],
            tempPosition[1] / tempPosition[3], -tempPosition[2] / tempPosition[3]);
}