OpenGL iOS screen coordinates to scene coordinates
I'm using OpenGL to make a little application, and I have no idea how to get the x, y, and z coordinates of a touch.

For example:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:[self view]];
    float X = touchPoint.x;
    float Y = touchPoint.y;
}
With this I only get the x and y coordinates, but I need the z as well.
You can't read the z buffer on OpenGL ES. When I've needed to do this kind of 3D hit testing, I've projected a 3D line through the scene and done the hit-testing myself.
For OpenGL on Retina devices you need to multiply x and y by the screen scale. To support the iPhone 6 Plus you need to use nativeScale when it is available.
// Do this when setting up the view controller and remember theScale.
if ([[UIScreen mainScreen] respondsToSelector:@selector(nativeScale)])
    theScale = [[UIScreen mainScreen] nativeScale];
else if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    theScale = [[UIScreen mainScreen] scale];
else
    theScale = 1.0;