
How do I stream video from iPhone acting as a server?

I'm working on an app for iOS where one iPhone has to live-stream its camera recordings to another iPhone (to keep things simple, both are on the same Wi-Fi network).
The streaming should work without any extra infrastructure in between (e.g. a server used for routing the stream to clients). In fact, the recording iPhone should itself be the server that serves the other iPhone (or other iOS devices in the network) with the live stream.

So, what I need to do is:

  1. Get the live pictures from the camera
  2. Process this data if needed
  3. Send frame by frame to the connected clients (TCP?)
  4. Receive the frames on the client and display them in real time

What I have and what I'm stuck with:

  1. I have already solved problem 1: I use an AVCaptureSession which constantly returns CMSampleBufferRefs (found here). A minimal capture sketch is included after this list.

  2. I'm not so sure yet what I need to do with the CMSampleBufferRef. I do know how to transform it into a CGImage or a UIImage (thanks to Benjamin Loulier's great blog post), but I have no idea what specifically I need to send and whether I need to encode the frames somehow.
    As mentioned by @jab in the answer linked above (this), it is possible to write those samples to a file with one or more AVAssetWriters. But then again, he says those 5-second video snippets are to be uploaded to a server which turns them into a streamable movie file (and that movie can then, I suppose, be streamed to an iOS device via HTTP Live Streaming).

  3. As I already indicated, my app (i.e. the video-capturing "server" device) has one or multiple clients connected to it and needs to send the video frames to them in real time.
    One idea that came to my mind is to use a simple TCP connection where the server sends every single frame in a serialized format to the connected clients in a loop. More specifically: when one buffered frame has been sent to the client successfully, the server takes the most recent frame as the next one to be sent (see the framing sketch after this list).
    Now: is this the right way to go about it, or is there another protocol much better suited to this kind of task?
    Remember: I want to keep it simple (simple for me, i.e. so that I don't need to study too many new programming aspects) and fast. I already know some things about TCP (I wrote servers and clients with it in C at school), so I'd prefer to apply the knowledge I already have to this project...

  4. Last but not least, the receiving client:
    I imagine, if I'm really going to use a TCP connection, that on the client side I receive frame after frame from the server, cast the received byte package into the format in use (CMSampleBuffer, CGImage, UIImage) and simply display it on a CALayer or a UIImageView, right? The effect of a movie would be achieved by just keeping that image view constantly updated (see the client sketch after this list).
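
To make points 1 and 2 concrete, here is a minimal capture sketch in modern Swift (the original question predates Swift, so take this as an illustration of the approach rather than period-accurate code). The class name FrameCapturer and the onJPEGFrame callback are invented for this example; it wires an AVCaptureSession to an AVCaptureVideoDataOutput delegate and compresses each CMSampleBuffer to JPEG data:

```swift
import AVFoundation
import CoreImage
import UIKit

// Hypothetical helper for illustration: connects the capture session from
// point 1 to the JPEG conversion from point 2.
final class FrameCapturer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "capture.queue")
    private let ciContext = CIContext()          // reused; creating one per frame is expensive
    var onJPEGFrame: ((Data) -> Void)?           // called with each compressed frame

    func start() throws {
        session.sessionPreset = .medium          // lower presets mean smaller frames
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true   // drop frames we can't keep up with
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    // Delegate callback: one CMSampleBuffer per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        // JPEG is the simplest per-frame encoding; quality trades size against CPU.
        if let jpeg = UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.5) {
            onJPEGFrame?(jpeg)
        }
    }
}
```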
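
For point 3, a simple way to put frames on a TCP connection is a length-prefixed protocol: each message is a 4-byte big-endian length followed by the JPEG bytes. The sketch below uses Apple's Network framework (iOS 12+, chosen for brevity; NSStream or BSD sockets would do the same job) and implements the "always send the most recent frame next" idea from the question. FrameServer and its method names are invented for illustration:

```swift
import Foundation
import Network

// Hypothetical server for illustration: 4-byte big-endian length prefix + JPEG
// bytes, always sending the newest available frame, as described in point 3.
final class FrameServer {
    private var listener: NWListener?
    private var clients: [NWConnection] = []
    private var latestFrame: Data?      // newest frame wins; stale ones are overwritten
    private var isSending = false
    private let queue = DispatchQueue(label: "frame.server")

    func start(port: UInt16) throws {
        let listener = try NWListener(using: .tcp, on: NWEndpoint.Port(rawValue: port)!)
        listener.newConnectionHandler = { [weak self] connection in
            guard let self = self else { return }
            self.clients.append(connection)
            connection.start(queue: self.queue)
        }
        listener.start(queue: queue)
        self.listener = listener
    }

    // Call this from the capture callback with each new JPEG frame.
    func enqueue(frame: Data) {
        queue.async {
            self.latestFrame = frame    // overwrite: if we're busy, older frames are dropped
            self.sendNextIfIdle()
        }
    }

    private func sendNextIfIdle() {
        guard !isSending, let frame = latestFrame else { return }
        latestFrame = nil
        isSending = true
        var length = UInt32(frame.count).bigEndian
        var packet = Data(bytes: &length, count: MemoryLayout<UInt32>.size)  // 4-byte prefix
        packet.append(frame)
        let group = DispatchGroup()
        for client in clients {
            group.enter()
            client.send(content: packet, completion: .contentProcessed { _ in group.leave() })
        }
        group.notify(queue: queue) {
            self.isSending = false
            self.sendNextIfIdle()       // pick up whatever frame arrived while we were sending
        }
    }
}
```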
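
And for point 4, the matching client: read exactly 4 bytes of length, then exactly that many bytes of JPEG, decode them into a UIImage, and push it into a UIImageView on the main thread. Again a sketch with invented names, not production code:

```swift
import Network
import UIKit

// Hypothetical client for illustration: reads length-prefixed JPEG frames and
// keeps a UIImageView updated, which gives the "movie effect" from point 4.
final class FrameClient {
    private let connection: NWConnection
    private weak var imageView: UIImageView?

    init(host: String, port: UInt16, imageView: UIImageView) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .tcp)
        self.imageView = imageView
    }

    func start() {
        connection.start(queue: .global())
        readNextFrame()
    }

    private func readNextFrame() {
        // Step 1: read exactly the 4-byte big-endian length prefix.
        connection.receive(minimumIncompleteLength: 4, maximumLength: 4) { [weak self] header, _, _, error in
            guard let self = self, let header = header, error == nil else { return }
            var length: UInt32 = 0
            _ = withUnsafeMutableBytes(of: &length) { header.copyBytes(to: $0) }
            let frameSize = Int(UInt32(bigEndian: length))
            // Step 2: read exactly that many bytes of JPEG payload.
            self.connection.receive(minimumIncompleteLength: frameSize,
                                    maximumLength: frameSize) { payload, _, _, error in
                guard let payload = payload, error == nil else { return }
                if let image = UIImage(data: payload) {
                    DispatchQueue.main.async { self.imageView?.image = image }
                }
                self.readNextFrame()    // loop forever: each new image replaces the last
            }
        }
    }
}
```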

Please give me some ideas on how to reach this goal. It is very important, because it's part of my school graduation project... Any sample code is also appreciated ;-) Or just refer me to another site, tutorial, Stack Overflow answer, etc.

If you have any questions about this, just leave a comment and I'll update the post.

  1. Sounds OK?

  2. Video frames are really big. You're going to have bandwidth problems streaming video from one device to another. You can compress the frames as JPEGs using UIImageJPEGRepresentation on a UIImage, but that's computationally expensive on the "server", and still may not make them small enough to stream well. You can also reduce your frame rate and/or resolution by dropping frames, downsampling the UIImages, and fiddling with the settings of your AVCaptureSession (see the compression sketch after these points). Alternatively, you can send small (5-second) videos, which are hardware-compressed on the server and much easier on bandwidth, but will of course give you a 5-second lag in your stream.

  3. If you can require iOS 7, I'd suggest trying MultipeerConnectivity.framework (see the sketch after these points). It's not terribly difficult to set up, and I believe it supports multiple clients. If you're going to roll your own networking, definitely use UDP rather than TCP: this is a textbook application for UDP, and it has lower overhead.

  4. Frame by frame, just turn the JPEGs into UIImages and use a UIImageView. There's significant computation involved, but I believe you'll still be limited by bandwidth rather than CPU. If you're sending little videos, you can use MPMoviePlayerController. There will probably be small glitches between the videos as it "prepares" each one for playback, which will also mean it takes 5.5 seconds or so to play each 5-second video. I wouldn't recommend HTTP Live Streaming unless you can get a real server into the mix somewhere. Or you could use an ffmpeg pipeline (feed videos in, pop individual frames out) if you can/want to compile ffmpeg for iOS.
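
To illustrate the bandwidth levers from point 2, here are the three usual knobs in Swift. The preset, quality factor, and skip ratio are illustrative values, not tuned recommendations, and the modern jpegData(compressionQuality:) stands in for the older UIImageJPEGRepresentation:

```swift
import AVFoundation
import UIKit

// Illustrative values only, not tuned recommendations.

// Lever 1: capture at a lower resolution in the first place.
let session = AVCaptureSession()
session.sessionPreset = .vga640x480          // .low or .cif352x288 shrink frames further

// Lever 2: compress harder; lower quality means smaller JPEGs.
func compress(_ image: UIImage) -> Data? {
    image.jpegData(compressionQuality: 0.3)  // 0.0...1.0; lower = smaller but uglier
}

// Lever 3: cut the frame rate by skipping frames before they are sent.
var frameCounter = 0
func shouldSend() -> Bool {
    frameCounter += 1
    return frameCounter % 3 == 0             // every 3rd frame: ~10 fps from a 30 fps capture
}
```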
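
And a minimal MultipeerConnectivity sketch for point 3: the recording device advertises a service, accepts invitations, and broadcasts each JPEG frame with the .unreliable send mode (UDP-like semantics: late frames are dropped rather than queued). The service type "cam-stream" and the class name PeerStreamer are placeholders:

```swift
import MultipeerConnectivity
import UIKit

// Hypothetical wrapper for illustration; "cam-stream" is a made-up service type.
final class PeerStreamer: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate {
    private let peerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session = MCSession(peer: peerID, securityIdentity: nil,
                                         encryptionPreference: .optional)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID, discoveryInfo: nil,
                                                            serviceType: "cam-stream")

    override init() {
        super.init()
        session.delegate = self
        advertiser.delegate = self
        advertiser.startAdvertisingPeer()   // viewers browse for "cam-stream" and invite us
    }

    /// Send one JPEG frame to every connected viewer.
    /// `.unreliable` drops late frames instead of queuing them.
    func broadcast(frame: Data) {
        guard !session.connectedPeers.isEmpty else { return }
        try? session.send(frame, toPeers: session.connectedPeers, with: .unreliable)
    }

    // Accept every incoming viewer (no authentication in this sketch).
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                    didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?,
                    invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // MCSessionDelegate: on the viewer side, frames arrive here.
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        // Decode `data` as a JPEG and update the UI (viewer role).
    }
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```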

Let me know if you need clarification on any of these points. It's a lot of work but relatively straightforward.

If you need an off-the-shelf solution, you may try one of the ready-made streaming libraries. The one I have experience with is the angl streaming lib. It works pretty well with RTMP output to a Wowza media server.
