
Streaming JPEGs live to iPhone app

I'm making progress on taking the depth camera feed from my Kinect and streaming it to my iPhone app. I have got to the stage of being able to save a JPEG for every frame produced from the Kinect depth image (30 fps) and write them to the local disk. I have then been able to convert these frames to MPEG with ffmpeg.
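(For reference, the JPEG-to-MPEG step can be driven from a small Python script like the sketch below. The frame naming pattern, output path, and codec options are assumptions for illustration, not necessarily the files the Kinect capture actually writes.)

```python
# Sketch: turn a numbered JPEG sequence into a video file with ffmpeg.
# Frame pattern, output name, and codec settings are illustrative assumptions.
import subprocess

def encode_frames(frame_pattern="frames/frame_%05d.jpg",
                  out_path="out.mp4", fps=30):
    cmd = [
        "ffmpeg",
        "-y",                    # overwrite the output file if it exists
        "-framerate", str(fps),  # the Kinect depth feed is ~30 fps
        "-i", frame_pattern,     # numbered JPEG frames on disk
        "-c:v", "libx264",       # encode to H.264
        "-pix_fmt", "yuv420p",   # widely supported pixel format
        out_path,
    ]
    subprocess.run(cmd, check=True)  # raises if ffmpeg exits non-zero

if __name__ == "__main__":
    encode_frames()
```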

My question now is: how can I view this live on my iPhone? Basically I want to view it on the iPhone as it comes from the Kinect.

Should I use HTTP Live Streaming and run the segmenter to use Apple's HttpLiveStreaming functionality? Or can I stream the raw JPEG files up in some way as they are saved to disk and then just cycle through the images on the phone as they arrive?

I'm also wondering how video conferencing is achieved on the iPhone (FaceTime/Skype etc.), because I'd prefer that it wasn't played inside the video player; I just want to display the live content on the screen as it happens.

Any ideas? Thanks in advance

JPEGs are typically too large to stream in real time -- I've found it tops out at around 5 fps over Wi-Fi. If you take your MPEG output in small chunks (say, 5-10 seconds of video per chunk) and use ffmpeg to convert them to .ts containers (MPEG-2 transport stream), then it's pretty easy to dynamically write an m3u8 index file that lists the chunks in order. Point a UIWebView at the URL of the m3u8 file and the stream will start playing, although it will use the built-in video player. I believe you can use other media/AV classes to watch your stream instead.
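A minimal sketch of the dynamically written m3u8 index described above, in Python. It assumes the .ts chunks land in a directory as chunk_00001.ts, chunk_00002.ts, ... and are roughly 10 seconds long; the names, chunk duration, and sliding-window size are illustrative assumptions, not fixed requirements.

```python
# Sketch: rewrite a live m3u8 playlist listing the newest .ts chunks.
# Directory layout, chunk naming, and durations are assumptions.
import glob
import os

def write_index(chunk_dir="segments",
                index_path="segments/index.m3u8",
                chunk_seconds=10, window=5):
    chunks = sorted(glob.glob(os.path.join(chunk_dir, "chunk_*.ts")))
    recent = chunks[-window:]            # only the newest chunks stay listed
    if not recent:
        return
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{chunk_seconds}",
        # sequence number of the first chunk currently in the playlist
        f"#EXT-X-MEDIA-SEQUENCE:{chunks.index(recent[0])}",
    ]
    for ts in recent:
        lines.append(f"#EXTINF:{chunk_seconds:.1f},")
        lines.append(os.path.basename(ts))
    # no #EXT-X-ENDLIST, so players treat this as a live stream and keep
    # re-fetching the index to pick up new chunks
    with open(index_path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

Rerunning something like write_index each time ffmpeg finishes a new .ts chunk, serving the directory over HTTP, and pointing a UIWebView (or an AVPlayer) at the index.m3u8 URL on the phone should then give the live playback the answer describes.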
