
How to capture the iOS camera and publish an RTMP live stream?

I have completed an RTMP player on iOS, using FFmpeg to decode FLV1 video and Speex audio. Now I want to capture the iOS camera, encode H.264 video and AAC audio, and publish the stream to an RTMP server -- the same Red5 server the player program used before. I know that I should recompile FFmpeg with libx264 and libaacplus to support H.264 and AAC encoding on iOS. But then how do I publish the RTMP live stream? Using RTMP_Write()? RTMP_SendPacket()? Please share some thoughts or solutions, or it would be very generous of you to show me some code. Thanks!

Reference: capture camera and publish video with librtmp
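For context, publishing with librtmp directly looks roughly like the sketch below. This is an untested sketch, not a definitive implementation: the function name publish_flv is hypothetical, and it assumes you already have a buffer of complete FLV tags from your own H.264/AAC encoders plus an FLV muxer, since RTMP_Write() expects FLV-packaged data.

    #include <librtmp/rtmp.h>

    /* Sketch: push an already-FLV-packaged buffer to an RTMP server.
     * Assumes flv_buf holds complete FLV tags produced by your own
     * encoders and FLV muxing step. */
    int publish_flv(char *url, const char *flv_buf, int flv_size)
    {
        int ok = 0;
        RTMP *r = RTMP_Alloc();
        RTMP_Init(r);

        /* url e.g. "rtmp://server/live/streamname" (placeholder);
         * librtmp may modify the string, so it must be writable. */
        if (RTMP_SetupURL(r, url)) {
            RTMP_EnableWrite(r);            /* publish mode, not play mode */
            if (RTMP_Connect(r, NULL) && RTMP_ConnectStream(r, 0)) {
                /* RTMP_Write() parses FLV tags out of the buffer and
                 * sends them as RTMP packets; call it repeatedly as
                 * the encoder produces data. */
                ok = RTMP_Write(r, flv_buf, flv_size) > 0;
            }
        }

        RTMP_Close(r);
        RTMP_Free(r);
        return ok ? 0 : -1;
    }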

FFmpeg supports RTMP input and output both through an internal protocol ("rtmp") and through an external library ("librtmp"). The only reason I know of to choose one over the other is specific server support -- i.e., one may work better than the other for a given server.

In FFmpeg, RTMP output is muxed to FLV, and as long as your output path/URI begins with "rtmp://..." it should just work for you. Nothing is stopping you from using librtmp directly, of course -- but why bother? A minimal sketch of that path is shown below.
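As an illustration of that "just works" path, here is a minimal libavformat sketch (error handling abbreviated, codec parameters elided; the URL and function name are placeholders, and it assumes your capture/encode pipeline already produces H.264 and AAC AVPackets):

    #include <libavformat/avformat.h>

    /* Sketch: open an FLV-over-RTMP output for streaming encoded packets.
     * url e.g. "rtmp://server/live/streamname" (placeholder). */
    int open_rtmp_output(AVFormatContext **out, const char *url)
    {
        av_register_all();                 /* needed on older FFmpeg */
        avformat_network_init();

        AVFormatContext *oc = NULL;
        /* Ask for the "flv" muxer explicitly; FFmpeg would also
         * infer it from the rtmp:// URL. */
        if (avformat_alloc_output_context2(&oc, NULL, "flv", url) < 0)
            return -1;

        /* Add one video and one audio stream, then fill in the H.264
         * and AAC codec parameters from your encoders (elided here). */
        AVStream *video = avformat_new_stream(oc, NULL);
        AVStream *audio = avformat_new_stream(oc, NULL);
        (void)video; (void)audio;

        /* The RTMP handshake happens when the output is opened. */
        if (avio_open(&oc->pb, url, AVIO_FLAG_WRITE) < 0) {
            avformat_free_context(oc);
            return -1;
        }
        if (avformat_write_header(oc, NULL) < 0) {
            avio_closep(&oc->pb);
            avformat_free_context(oc);
            return -1;
        }

        /* Then, per encoded AVPacket: rescale pts/dts to the stream
         * time_base, set stream_index, and call
         * av_interleaved_write_frame(oc, &pkt).
         * Finish with av_write_trailer(oc) and avio_closep(&oc->pb). */
        *out = oc;
        return 0;
    }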

Configuring your server to accept streams, and to know what endpoint to view the stream on, can be its own little adventure.

(Disclaimer: I'm pretty much doing this right now, so I know it's possible and straightforward.)
