
Using FFMPEG in an OS X application

I've been wanting to make a real-time live streaming application. Essentially, the application would send the microphone feed from an Xcode application to a website where it can be viewed in real time. Is FFMPEG the best solution for this? How would I go about doing this? If that's too broad, then how do I use the FFMPEG framework in an OS X Objective-C application?

To directly address your questions:

(1) Is FFMPEG the best solution for this?

It depends. When setting up a live streaming environment you will likely come across FFMPEG, VLC and GStreamer, which are the most common options for streaming video/audio. So yes, FFMPEG can be used as part of the solution. Please look into the following question: DIY Video Streaming Server

(2) How would I go about doing this?

Your requirement is to build a live streaming application which sends the mic input to the web. This involves the following steps:

(0) Your Xcode application will need to provide a method to start this process. You don't necessarily need to integrate a framework to achieve this.

(1) Streaming / Restreaming

Use FFMPEG or VLC to grab your device and stream it locally:

ffmpeg -i audioDevice -acodec libfaac -ar 44100 -ab 48k -f rtp rtp://host:port
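
On OS X the audioDevice placeholder maps to an AVFoundation capture input. A minimal sketch, assuming your FFMPEG build includes avfoundation support (the device index, host and port are placeholders, and the AAC encoder name may differ between builds, e.g. libfdk_aac or the native aac encoder). List your devices first, then grab audio device 0:

ffmpeg -f avfoundation -list_devices true -i ""

ffmpeg -f avfoundation -i ":0" -acodec aac -ar 44100 -ab 48k -f rtp rtp://127.0.0.1:1234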

(2) Segmenting for HTTP Live Streaming*

Use a segmenter such as mediastreamsegmenter (Apple), livehttp (VLC) or segment (FFMPEG) to prepare your stream for web delivery:

vlc -vvv -I dummy <SOURCEADDRESS> --sout='#transcode{acodec=libfaac,ab=48}:std{access=livehttp{seglen=10,delsegs=false,numsegs=10,index=/path/to/your/index/prog_index.m3u8,index-url=YourUrl/fileSequence######.ts},mux=ts{use-key-frames},dst=/path/to/your/ts/files/fileSequence######.ts}'

*You could also simply use VLC to grab your audio device with qtsound (see this question) and prepare it for streaming with livehttp.
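
If you would rather stay with FFMPEG for this step, its hls muxer covers the same ground. A minimal sketch, where the input address and output path are placeholders (in practice an SDP file describing the RTP stream may be needed as input) and the segment length and playlist size mirror the VLC example above:

ffmpeg -i rtp://127.0.0.1:1234 -acodec aac -f hls -hls_time 10 -hls_list_size 10 /path/to/your/ts/files/prog_index.m3u8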

(3) HTML 5 Delivery

Publish your stream:

<audio>
<source src="YOUR_PATH/playlist.m3u8" />
</audio>
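
Note that native HLS playback is essentially limited to Safari; other browsers need a JavaScript player such as hls.js. A slightly fuller sketch of the page above (the playlist path is a placeholder):

<!DOCTYPE html>
<html>
<body>
<!-- Safari plays HLS natively; other browsers need a JS player -->
<audio controls>
<source src="YOUR_PATH/playlist.m3u8" type="application/vnd.apple.mpegurl" />
</audio>
</body>
</html>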

(3) If that's too broad, then how do I use the FFMPEG framework in an OS X Objective-C application?

Either use an external wrapper framework to access FFMPEG functionality and consult the tutorials for working with those frameworks, or wrap your command line arguments with NSTask in Objective-C and simply launch those tasks from your application, as in this question.
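
A minimal sketch of the NSTask route, assuming ffmpeg is installed at /usr/local/bin/ffmpeg (the device index and RTP address are placeholders carried over from step (1)):

#import <Foundation/Foundation.h>

// Launch the ffmpeg command line from step (1) via NSTask.
// The ffmpeg path, device index and RTP address are placeholders.
static NSTask *startStreamingTask(void) {
    NSTask *task = [[NSTask alloc] init];
    task.launchPath = @"/usr/local/bin/ffmpeg"; // adjust to your installation
    task.arguments  = @[ @"-f", @"avfoundation", @"-i", @":0",
                         @"-acodec", @"aac", @"-ar", @"44100", @"-ab", @"48k",
                         @"-f", @"rtp", @"rtp://127.0.0.1:1234" ];
    [task launch]; // asynchronous; call -terminate on the returned task to stop
    return task;
}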

Another way would be to use VLCKit, which offers VLC functionality in a framework for Objective-C (VLCKit wiki). However, when tackling streaming challenges I prefer to work with the actual commands instead of pushing another framework layer in between, which may be missing some options.


I hope this points you in the right direction. There are multiple ways to solve this; it's a broad question, hence the broad answer.
