
Upload live streaming video from iPhone like Ustream or Qik

How do I live stream video from an iPhone to a server, like Ustream or Qik? I know there's something called HTTP Live Streaming from Apple, but most resources I found only talk about streaming video from a server to the iPhone.

Is Apple's HTTP Live Streaming something I should use? Or something else? Thanks.

There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for downloads to the iPhone.

The way I'm doing it is to implement an AVCaptureSession, which has a delegate with a callback that's run on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.

Here's the flow: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2

And here's some code:

#import <AVFoundation/AVFoundation.h>

// make input device
NSError *deviceError = nil;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];

// make output device
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
[outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

// initialize capture session
// (keep a strong reference to the session, e.g. in an ivar, or it will be deallocated after this method returns)
AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];

// make preview layer and add so that camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];

// go!
[captureSession startRunning];

Then the output device's delegate (here, self) has to implement the callback:

-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
    CGSize imageSize = CVImageBufferGetEncodedSize( imageBuffer );
    // also in the 'mediaSpecific' dict of the sampleBuffer

   NSLog( @"frame captured at %.fx%.f", imageSize.width, imageSize.height );
}
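
The callback above only logs the frame size; the "send each frame to the server" step depends on the server's custom setup. As a minimal sketch of that step (the upload URL and the JPEG-per-frame encoding are assumptions, not part of the original answer), the callback body could compress the pixel buffer and POST it with NSURLSession:

// Minimal sketch: compress the current frame to JPEG and POST it.
// The upload URL is a placeholder; a real implementation would batch or
// hardware-encode frames rather than send one HTTP request per frame.
-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );

    // convert the pixel buffer into a UIImage, then into JPEG data
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease( cgImage );
    NSData *jpegData = UIImageJPEGRepresentation( frame, 0.5 );

    // POST the frame to the (placeholder) upload endpoint
    NSMutableURLRequest *request =
        [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/upload"]];
    request.HTTPMethod = @"POST";
    [[[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                fromData:jpegData
                                       completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        // handle the server's response / errors here
    }] resume];
}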

EDIT/UPDATE

Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...

Basically, in the didOutputSampleBuffer function above, you add the samples into an AVAssetWriter. I actually had three asset writers active at a time -- past, present, and future -- managed on different threads.

The past writer is in the process of closing the movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past = current; current = future and restart the sequence.
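
To make that rotation concrete, here is a rough sketch of how the sample buffers might be appended and how the writers might be swapped. The property names (currentWriter, futureWriter, currentInput, futureInput), the nextChunkURL helper and the video settings are illustrative assumptions, not the original implementation, and thread handling is omitted:

// Prepare the "future" writer so it is ready the moment we rotate.
- (void)prepareFutureWriter
{
    NSError *writerError = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:[self nextChunkURL]   // hypothetical helper returning a fresh temp file URL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&writerError];
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                             AVVideoWidthKey  : @640,
                                                             AVVideoHeightKey : @480 }];
    input.expectsMediaDataInRealTime = YES;
    [writer addInput:input];
    self.futureWriter = writer;
    self.futureInput  = input;
}

// Called from captureOutput:didOutputSampleBuffer:fromConnection: with each frame.
- (void)appendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    if (self.currentWriter.status == AVAssetWriterStatusUnknown) {
        [self.currentWriter startWriting];
        [self.currentWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (self.currentInput.isReadyForMoreMediaData) {
        [self.currentInput appendSampleBuffer:sampleBuffer];
    }
}

// Fired by a 5-second timer: past = current, current = future, prepare a new future.
- (void)rotateWriters
{
    AVAssetWriter *pastWriter = self.currentWriter;
    self.currentWriter = self.futureWriter;
    self.currentInput  = self.futureInput;
    [self prepareFutureWriter];
    [pastWriter finishWritingWithCompletionHandler:^{
        // upload the finished 5-second chunk at pastWriter.outputURL to the server here
    }];
}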

This then uploads video in 5-second chunks to the server. You can stitch the videos together with ffmpeg if you want, or transcode them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is H.264-encoded by the asset writer, so transcoding merely changes the file's header format.
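
On the server side, the stitching or repackaging can be done with commands along these lines (the file names are placeholders, and the chunks are assumed to be QuickTime movies containing H.264):

# join the 5-second chunks listed in chunks.txt (one "file 'chunkN.mov'" line per chunk) without re-encoding
ffmpeg -f concat -safe 0 -i chunks.txt -c copy joined.mov

# repackage a single chunk into an MPEG-2 transport stream for HTTP Live Streaming, again without re-encoding
ffmpeg -i chunk1.mov -c copy -bsf:v h264_mp4toannexb -f mpegts chunk1.ts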

I have found one library that will help you with this.

HaishinKit Streaming Library

The above library gives you all the options for streaming via RTMP or HLS.

Just follow the steps given by this library and read all of its instructions carefully. Please don't run the example code given in this library directly, as it has some errors; instead, pull the required classes and the pod into your own demo app.

I have just done it with this; you can record the screen, camera, and audio.

I'm not sure you can do that with HTTP Live Streaming. HTTP Live Streaming segments the video into chunks of approximately 10 seconds in length and creates a playlist of those segments. So if you want the iPhone to be the stream-server side with HTTP Live Streaming, you will have to figure out a way to segment the video file and create the playlist.
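
For reference, an HLS media playlist is just a small text file (a .m3u8) that lists the segments; the segment names and durations below are illustrative:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts

For a live stream the playlist has no #EXT-X-ENDLIST tag and is rewritten as new segments are produced, which is exactly the part the iPhone would have to take care of itself.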

How to do it is beyond my knowledge. Sorry.
