
Using FFMPEG in an OS X application

So I've been wanting to make a real-time live streaming application. Essentially, the application would send the microphone feed from an Xcode application to a website where it can be viewed in real-time. Is FFMPEG the best solution for this? How would I go about doing this? If that's too broad, then how do I use the FFMPEG framework in an OS X Objective-C application?

To directly address your questions:

(1) Is FFMPEG the best solution for this?

It depends. When setting up a live streaming environment you will likely stumble over FFMPEG, VLC and GStreamer, which are the standard options for simply streaming video/audio. Therefore, yes, FFMPEG can be used as part of the solution. Please look into the following question: DIY Video Streaming Server

(2) How would I go about doing this?

Your requirement is to build a live streaming application which sends the mic input onto the web. This involves the following steps:

(0) Your Xcode application will need to provide a method to start this process. You don't necessarily need to integrate a framework to achieve this.

(1) Streaming / Restreaming

Use FFMPEG or VLC to grab your device and stream it locally:

ffmpeg -i audioDevice -acodec libfaac -ar 44100 -ab 48k -f rtp rtp://host:port
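
On OS X, ffmpeg typically grabs capture devices through its avfoundation input rather than a generic device name. As a hedged sketch assuming a reasonably current ffmpeg build (libfaac has been dropped from recent builds in favor of the native aac encoder, and device indices vary per machine):

ffmpeg -f avfoundation -list_devices true -i ""
ffmpeg -f avfoundation -i ":0" -acodec aac -ar 44100 -b:a 48k -f rtp rtp://host:port

The first command lists the available capture devices; ":0" then selects audio device 0 with no video input. The RTP muxer prints an SDP description that the receiving side will need.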

(2) Segmenting for HTTP Live Streaming*

Use a segmenter such as mediastreamsegmenter (Apple), livehttp (VLC) or segment (FFMPEG) to prepare your stream for web delivery:

vlc -vvv -I dummy <SOURCEADDRESS> --sout='#transcode{acodec=libfaac,ab=48}:std{access=livehttp{seglen=10,delsegs=false,numsegs=10,index=/path/to/your/index/prog_index.m3u8,index-url=YourUrl/fileSequence######.ts},mux=ts{use-key-frames},dst=/path/to/your/ts/files/fileSequence######.ts}'

*You could also simply use VLC to grab your audio device with qtsound (see this question) and prepare it for streaming with livehttp.
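
For illustration, here is a hedged one-step variant of the command above using VLC's qtsound input (the quoted device name is machine-specific, so substitute your own input device):

vlc -vvv qtsound://"Built-in Microphone" --sout='#transcode{acodec=libfaac,ab=48}:std{access=livehttp{seglen=10,delsegs=false,numsegs=10,index=/path/to/your/index/prog_index.m3u8,index-url=YourUrl/fileSequence######.ts},mux=ts{use-key-frames},dst=/path/to/your/ts/files/fileSequence######.ts}'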

(3) HTML 5 Delivery

Publish your stream

<audio>
<source src="YOUR_PATH/prog_index.m3u8" type="application/vnd.apple.mpegurl" />
</audio>
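
Safari can play the playlist natively through this tag, but most other browsers cannot; as a sketch, the same stream can be served to them through a Media Source Extensions player such as hls.js (the CDN URL and element id below are illustrative):

<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<audio id="player" controls></audio>
<script>
  var audio = document.getElementById('player');
  var src = 'YOUR_PATH/prog_index.m3u8';
  if (audio.canPlayType('application/vnd.apple.mpegurl')) {
    audio.src = src; // Safari: native HLS support
  } else if (Hls.isSupported()) {
    var hls = new Hls(); // other browsers: MSE-based playback
    hls.loadSource(src);
    hls.attachMedia(audio);
  }
</script>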

(3) If that's too broad, then how do I use the FFMPEG framework in an OS X Objective-C application?

Either use an external wrapper framework to access FFMPEG functionality and consult the tutorials on working with these frameworks, or use NSTask to wrap your command line arguments in Objective-C and simply start those tasks from your application, as in this question.
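
As a minimal sketch of the NSTask route (the launch path assumes a Homebrew-style install of ffmpeg, and the arguments mirror the avfoundation example from step (1); both are assumptions to adjust for your setup):

#import <Foundation/Foundation.h>

// Starts ffmpeg as a child process that streams the default
// microphone (avfoundation device ":0") to an RTP endpoint.
- (NSTask *)startStreamingTask {
    NSTask *task = [[NSTask alloc] init];
    task.launchPath = @"/usr/local/bin/ffmpeg"; // assumed install location
    task.arguments  = @[ @"-f", @"avfoundation", @"-i", @":0",
                         @"-acodec", @"aac", @"-ar", @"44100",
                         @"-b:a", @"48k",
                         @"-f", @"rtp", @"rtp://host:port" ];
    [task launch];
    return task;
}

Keep a reference to the returned task so you can call -terminate when the user stops streaming. Note that if your app is sandboxed, child processes inherit the sandbox, so you would likely need to bundle ffmpeg inside the app rather than rely on a system-wide install.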

Another way would be to use VLCKit, which offers VLC functionality in a framework for Objective-C (VLCKit wiki). However, when tackling streaming challenges I prefer to work with the actual commands instead of pushing another framework layer in between, which may be missing some options.


I hope this points you in the right direction. There are multiple ways to solve this. It's a broad question, hence the broad approach to answering it.
