How do I retrieve iPhone camera or video images before the user presses the snapshot button?
I am looking to stream what the camera sees onto a series of OpenGL ES textures. Getting them to display on the textures is not difficult; however, it isn't clear whether UIImagePickerController can grab images from the camera before the user takes a snapshot.
Tagged with 3gs because of the new video capture API.
This is not supported in the iPhone SDK. While there are some hacks people have done (that involve scraping the data off the texture the built-in UI is displaying to the user), using them will probably result in your app getting rejected from the store.
The only supported way to get video data is to use the Apple UI and then ask for the resulting movie after it is done recording. If you need realtime video data you should file a bug with Apple explaining why.
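A minimal sketch of that supported path, written in modern Swift for illustration (the original answer predates Swift; class names other than UIKit's own API are hypothetical): present Apple's camera UI configured for movie capture, and receive the finished recording's file URL in the delegate callback once the user is done.

```swift
import UIKit
import MobileCoreServices

// Hypothetical view controller demonstrating the supported flow.
class RecorderViewController: UIViewController,
        UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentCamera() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.mediaTypes = [kUTTypeMovie as String]  // record video, not stills
        picker.delegate = self
        present(picker, animated: true)
    }

    // Called only after the user finishes recording; there is no
    // supported hook here to read frames before that point.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let movieURL = info[.mediaURL] as? URL {
            // Process the recorded movie file, e.g. decode frames
            // into OpenGL ES textures after the fact.
            print("Recorded movie at \(movieURL)")
        }
        picker.dismiss(animated: true)
    }
}
```

Note the structural limitation the answer describes: the delegate fires once with the completed movie, so frames only become available after recording ends, not as a live stream.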