
iOS video playback

In my application I need to play video in an unusual way: something like an interactive player for special purposes.

The main issues here:

  • video resolution can range from 200×200 px up to 1024×1024 px
  • I need the ability to change the playback speed from -60 FPS to 60 FPS (the video should play slower or faster depending on the selected speed; a negative value means the video should play backwards)
  • I need to draw lines and objects over the video and scale them together with the image
  • I need the ability to zoom the image and pan it if its content is larger than the screen
  • I need the ability to change the brightness and contrast of the video and to invert its colors
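The brightness/contrast/invert requirement boils down to a simple per-pixel transform. A minimal sketch in C (the function name and parameter ranges are assumptions for illustration; in a shader-based player this arithmetic would run per fragment on the GPU):

```c
#include <assert.h>

/* Hypothetical per-pixel adjustment: brightness is an additive offset in
 * [-255, 255], contrast a scale factor applied around the midpoint 128,
 * and invert flips the result. The output is clamped to [0, 255]. */
static unsigned char adjust_pixel(unsigned char v, int brightness,
                                  float contrast, int invert)
{
    float out = (v - 128.0f) * contrast + 128.0f + brightness;
    if (invert)
        out = 255.0f - out;
    if (out < 0.0f)   out = 0.0f;
    if (out > 255.0f) out = 255.0f;
    return (unsigned char)(out + 0.5f);
}
```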

What I am doing now:

  • I split my video into JPG frames
  • I created a timer that fires N times per second (playback speed control)
  • on each timer tick I draw a new texture (the next JPG frame) with OpenGL
  • for zoom and pan I use OpenGL ES transformations (translate, scale)
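The timer-driven stepping above, including reverse playback at negative speeds, can be reduced to a pure index computation. A sketch in C (names are illustrative; a real player would drive this from a display-refresh callback rather than a plain timer):

```c
#include <assert.h>

/* Given a signed playback speed in frames per second (negative = reverse)
 * and the elapsed playback time, compute which frame to draw, wrapping
 * around frame_count so reverse playback loops from the end. */
static int frame_for_time(double elapsed_seconds, double speed_fps,
                          int frame_count)
{
    long step = (long)(elapsed_seconds * speed_fps); /* may be negative */
    long idx = step % frame_count;
    if (idx < 0)
        idx += frame_count; /* map reverse steps into [0, frame_count) */
    return (int)idx;
}
```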

Everything looks fine as long as I use 320×240 px frames, but with 512×512 px my playback rate drops. Maybe it is a timer behaviour problem, maybe OpenGL. Sometimes, when I try to open big textures at a high playback rate (more than 10-15 FPS), the application simply crashes with memory warnings.

What is the best practice for solving this issue? Which direction should I dig in? Would cocos2d or another game engine help me? Maybe JPG is not the best format for textures and I should use PNG, PVR, or something else?

Keep the video data as a video and use AVAssetReader to get the raw frames. Use kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange as the pixel format, and do the YUV->RGB colorspace conversion in GLES. This means keeping less data in memory, and it makes much of your image processing somewhat simpler (since you'll be working with luma and chroma data rather than RGB values).
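For reference, here is what that YUV->RGB conversion amounts to for a single video-range pixel, sketched in C with the usual BT.601 coefficients (the GLES shader would perform the same arithmetic per fragment):

```c
#include <assert.h>

static unsigned char clamp255(float v)
{
    if (v < 0.0f)   return 0;
    if (v > 255.0f) return 255;
    return (unsigned char)(v + 0.5f);
}

/* BT.601 video-range YUV -> RGB for one pixel, matching data read as
 * kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange: Y is in [16, 235],
 * Cb/Cr in [16, 240], centered at 128. */
static void yuv_to_rgb(unsigned char y, unsigned char cb, unsigned char cr,
                       unsigned char *r, unsigned char *g, unsigned char *b)
{
    float yf  = 1.164f * (y - 16);
    float cbf = cb - 128.0f;
    float crf = cr - 128.0f;
    *r = clamp255(yf + 1.596f * crf);
    *g = clamp255(yf - 0.392f * cbf - 0.813f * crf);
    *b = clamp255(yf + 2.017f * cbf);
}
```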

You don't need to bother with Cocos2D or any game engine for this. I strongly recommend experimenting a little with OpenGL ES 2.0 and shaders. Using OpenGL for video is very simple and straightforward; adding a game engine to the mix is unnecessary overhead and abstraction.
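As a starting point for such experiments, a GLES 2.0 fragment shader for biplanar YUV might look roughly like the sketch below, shown embedded as a C string. The uniform names and the `.ra` chroma swizzle (which assumes the chroma plane was uploaded as a GL_LUMINANCE_ALPHA texture) are assumptions of this sketch, not a drop-in implementation:

```c
#include <assert.h>
#include <string.h>

/* Illustrative GLES 2.0 fragment shader: samples the luma and chroma planes
 * from two reused textures and converts BT.601 video-range YUV to RGB. */
static const char *kFragmentShader =
    "precision mediump float;                                 \n"
    "varying vec2 v_texCoord;                                 \n"
    "uniform sampler2D u_lumaTexture;                         \n"
    "uniform sampler2D u_chromaTexture;                       \n"
    "void main() {                                            \n"
    "    float y = texture2D(u_lumaTexture, v_texCoord).r;    \n"
    "    vec2 cbcr = texture2D(u_chromaTexture, v_texCoord).ra\n"
    "                - vec2(0.5, 0.5);                        \n"
    "    y = 1.164 * (y - 0.0625);                            \n"
    "    vec3 rgb = vec3(y + 1.596 * cbcr.y,                  \n"
    "                    y - 0.392 * cbcr.x - 0.813 * cbcr.y, \n"
    "                    y + 2.017 * cbcr.x);                 \n"
    "    gl_FragColor = vec4(rgb, 1.0);                       \n"
    "}                                                        \n";
```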

When you upload image data to the textures, do not create a new texture every frame. Instead, create two textures, one for the luma data and one for the chroma data, and simply reuse those textures every frame. I suspect your memory issues arise from using many images and new textures every frame, and probably from not deleting the old textures.

JPEG frames will be incredibly expensive to decompress. First step: use PNG.

But wait! There's more.

Cocos2D could help you, mostly through its great support for sprite sheets.

The biggest help, however, may come from packed textures, à la TexturePacker. Using PVR.CCZ compression can speed things up by insane amounts, enough for you to get better frame rates at bigger video sizes.

Vlad, the short answer is that you will likely never be able to get all of the features you have listed working at the same time. Playing 1024×1024 video at 60 FPS is really going to be a stretch; I highly doubt that iOS hardware can keep up with that kind of data transfer rate at 60 FPS. Even the h.264 hardware on the device can only do 30 FPS at 1080p. It might be possible, but then layering graphics rendering over the video while also expecting to edit brightness/contrast at the same time is just too many things at once.

You should focus on what is actually possible instead of attempting to implement every feature. If you want to see an example Xcode app that pushes iPad hardware right to its limits, have a look at my Fireworks example project. This code displays multiple already-decoded h.264 videos on screen at the same time. The implementation is built around the CoreGraphics APIs, but the key point is that Apple's implementation of texture uploading to OpenGL is very fast because of a zero-copy optimization. With this approach, a lot of video can be streamed to the device.
