
iOS camera preview is unstable

Apple provides a sample iOS project called AVCamFilter where they use an MTKView to render the camera preview to the screen. The problem is that the frame duration - the amount of time each individual camera frame spends being displayed on the screen - is not stable. The camera stream runs at 30 FPS, that is one frame delivered every 33.3 milliseconds, so it stands to reason that we should be able to display these frames one after another with a frame duration of 33.3 ms. In reality the frame duration is unstable - most of the time it is about 33.3 ms but sometimes it is roughly 16.7 or 50.1 ms.
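For context, this is the kind of capture-side setup being assumed: the device is pinned to 30 fps so every sample buffer should arrive roughly 33.3 ms apart. A minimal sketch, assuming you have a reference to the active AVCaptureDevice (the helper name is made up for illustration):

```swift
import AVFoundation

// Hypothetical helper: lock the capture device to a fixed 30 fps delivery rate.
// Session setup and error handling are assumed to live elsewhere.
func lockCaptureRate(of device: AVCaptureDevice, fps: Int32 = 30) throws {
    try device.lockForConfiguration()
    let frameDuration = CMTime(value: 1, timescale: fps) // 1/30 s per frame
    device.activeVideoMinFrameDuration = frameDuration
    device.activeVideoMaxFrameDuration = frameDuration
    device.unlockForConfiguration()
}
```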

According to the documentation, the default behavior for MTKView is to draw to the screen at 60 FPS, that is one frame every 16.7 ms, so each camera frame would be drawn twice before it is replaced by the next one. The fact that MTKView's draw loop and the camera's capture loop are not synchronized explains the problem - the time elapsed between a capture and the following draw gradually drifts, so frames go from being captured just before a draw to just after a draw, and some frames end up being displayed for three draws or one draw instead of the expected two. The MTKView draw loop can be synchronized with the camera by disabling the default behavior and manually calling draw whenever a new camera frame arrives, but all of this is still not synchronized with the device display itself, which refreshes at 60 FPS in its own loop. So we still have the same fundamental timing problem. I know CADisplayLink is meant for synchronizing things to the display, but there is no way for the capture stream to use it.
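A minimal sketch of the manual-draw approach described above, assuming a video data output delegate; the class name is hypothetical and the Metal rendering plumbing is omitted:

```swift
import MetalKit
import AVFoundation

final class PreviewController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let mtkView: MTKView

    init(mtkView: MTKView) {
        self.mtkView = mtkView
        super.init()
        // Disable the view's internal 60 fps timer; drawing is driven manually.
        mtkView.isPaused = true
        mtkView.enableSetNeedsDisplay = false
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand the new frame to the renderer (omitted here), then draw once per capture.
        DispatchQueue.main.async {
            self.mtkView.draw()
        }
    }
}
```

Even with this, the drawable is still presented on the display's own refresh cycle, which is the residual timing problem the question describes.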

So how do we render the camera preview with a stable 33.3 ms frame duration? Is the instability an expected behavior with no workaround?

If I understand your intention correctly, you expect the camera (AVCaptureSession) to produce sample buffers at a rate of 30 fps and expect MTKView to consume these images at exactly the same rate of 30 fps to render them to the screen.

MTKView will not guarantee that drawInMTKView will always be called at 30 fps. For various reasons, the callback to drawInMTKView may occasionally be skipped. The same is true for AVCaptureSession.

If you want to run the draw calls of MTKView at a specific framerate, why not try the preferredFramesPerSecond property?

As per the Apple docs:

When your application sets its preferred frame rate, the view chooses a frame rate as close to that as possible based on the capabilities of the screen the view is displayed on. To provide a consistent frame rate, the actual frame rate chosen is usually a factor of the maximum refresh rate of the screen. For example, if the maximum refresh rate of the screen is 60 frames per second, that's also the highest frame rate the view sets as the actual frame rate. However, if you ask for a lower frame rate, the view might choose 30, 20, or 15 frames per second, or another factor, as the actual frame rate. Your application should choose a frame rate that it can consistently maintain. The default value is 60 frames per second.

This did solve the sync issue in my observation when I worked on live filters over a camera feed.
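A minimal sketch of that suggestion, assuming the view keeps its internal timer (the function name is just for illustration):

```swift
import MetalKit

// Pace MTKView's internal draw loop at 30 fps to match the capture rate.
// The view may still round to the nearest rate the display supports.
func configurePreview(_ view: MTKView) {
    view.preferredFramesPerSecond = 30
    view.isPaused = false              // let the view's own timer drive draws
    view.enableSetNeedsDisplay = false // draw on the timer, not on setNeedsDisplay
}
```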

Also, I don't know if I got your problem right, because in theory, if you are using the latest available frame from the AVCaptureSession in your rendering cycle, then no matter what FPS the view runs at, you should not see any hiccups unless your rendering is not staying steadily at the same framerate, or it drops below the camera capture framerate.
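A minimal sketch of that "latest frame" pattern, with hypothetical names and the actual Metal encoding and thread synchronization omitted:

```swift
import AVFoundation
import MetalKit

// Keep only the most recent camera frame and render whatever is latest on each
// draw, so a skipped or extra draw never blocks the capture side.
final class LatestFrameRenderer: NSObject, MTKViewDelegate,
                                 AVCaptureVideoDataOutputSampleBufferDelegate {
    // Note: access to this property is not synchronized in this sketch;
    // a real implementation would guard it (e.g. with a lock or serial queue).
    private var latestPixelBuffer: CVPixelBuffer?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        latestPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let pixelBuffer = latestPixelBuffer else { return }
        // Encode the Metal work that textures `pixelBuffer` onto the drawable here.
        _ = pixelBuffer
    }
}
```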
