
How to display an MJPEG stream transmitted via UDP in a Mac OS X application

I have a camera that sends MJPEG frames as UDP packets over Wi-Fi, and I would like to display the live stream in my Mac OS X application. The application is written in Objective-C, and I am trying to use the AVFoundation classes to display the stream. The camera is controlled with HTTP GET and POST requests.

I would like the camera to be recognized as an AVCaptureDevice, because I can easily display streams from AVCaptureDevices. But since this stream arrives over Wi-Fi, the camera is not recognized as an AVCaptureDevice.
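For reference, this is roughly how I display a local capture device today with AVFoundation; a minimal sketch, with error handling and view management simplified:

```objc
#import <AVFoundation/AVFoundation.h>
#import <Cocoa/Cocoa.h>

// Start previewing the default local camera inside the given view.
static AVCaptureSession *startPreview(NSView *previewView) {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Pick the default video device (e.g. the built-in camera).
    AVCaptureDevice *device =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input && [session canAddInput:input]) {
        [session addInput:input];
    }

    // Render the session into a layer hosted by the view.
    AVCaptureVideoPreviewLayer *layer =
        [AVCaptureVideoPreviewLayer layerWithSession:session];
    [previewView setWantsLayer:YES];
    layer.frame = previewView.bounds;
    [previewView.layer addSublayer:layer];

    [session startRunning];
    return session;  // Caller keeps the session alive.
}
```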

Is there a way I can create my own AVCaptureDevice that I can use to control this camera and display the video stream?

After much research into the packets sent from the camera, I concluded that it does not use any standard protocol such as RTP. What I ended up doing was reverse-engineering the packets to learn more about their contents.

I confirmed that it does send JPEG images in UDP packets, and that it takes multiple UDP packets to carry a single JPEG. I listened on the UDP port and assembled the packets into a single image frame. Once I had a complete frame, I created an NSImage from it and displayed it in an NSImageView. It works quite nicely.
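A minimal sketch of that receive-and-reassemble loop, meant to run on a background queue. The port number and the frame-boundary check (waiting for the JPEG end-of-image marker 0xFF 0xD9) are my simplifications; the camera's packets may carry their own framing headers that you would strip first:

```objc
#import <Cocoa/Cocoa.h>
#import <arpa/inet.h>
#import <sys/socket.h>

// Listen on a UDP port, accumulate datagrams until a complete JPEG has
// arrived, then hand the decoded NSImage to the image view on the main queue.
static void receiveFrames(uint16_t port, NSImageView *imageView) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    bind(sock, (struct sockaddr *)&addr, sizeof(addr));

    NSMutableData *frame = [NSMutableData data];
    uint8_t buffer[65536];

    for (;;) {
        ssize_t len = recvfrom(sock, buffer, sizeof(buffer), 0, NULL, NULL);
        if (len <= 0) { continue; }
        [frame appendBytes:buffer length:(NSUInteger)len];

        // Simplified frame boundary: JPEG end-of-image marker at the tail.
        const uint8_t *bytes = frame.bytes;
        NSUInteger n = frame.length;
        if (n >= 2 && bytes[n - 2] == 0xFF && bytes[n - 1] == 0xD9) {
            NSImage *image = [[NSImage alloc] initWithData:frame];
            dispatch_async(dispatch_get_main_queue(), ^{
                imageView.image = image;
            });
            frame = [NSMutableData data];  // Start collecting the next frame.
        }
    }
}
```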

If anyone is interested, the camera is an Olympus TG-4. I am writing the components to control settings, shutter, etc.
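The control side is just plain HTTP; a minimal sketch with NSURLSession, where the camera address and the command path are placeholders rather than the documented Olympus endpoints:

```objc
#import <Foundation/Foundation.h>

// Send a single HTTP GET command to the camera. The IP address and the
// CGI-style path below are hypothetical placeholders.
static void sendCameraCommand(NSString *path) {
    NSString *urlString =
        [NSString stringWithFormat:@"http://192.168.0.10/%@", path];
    NSURL *url = [NSURL URLWithString:urlString];

    NSURLSessionDataTask *task = [[NSURLSession sharedSession]
        dataTaskWithURL:url
      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
          if (error) {
              NSLog(@"Camera command failed: %@", error);
          } else {
              NSLog(@"Camera replied: %@",
                    [[NSString alloc] initWithData:data
                                          encoding:NSUTF8StringEncoding]);
          }
      }];
    [task resume];
}
```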
