On OS X Yosemite, the screen of an iOS 8 device can be mirrored to the OS X machine and saved as a media file. This can be done manually using QuickTime Player, but I want to do the same programmatically.
According to the docs, the iOS 8 device should be exposed as a webcam, yet calling [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] on my Mac returns an NSArray with no elements.
How can I set up a mirroring session using iOS 8 and Yosemite? How can I detect the capture device for the mirroring session?
Quoting from page 28 of the PDF transcript of WWDC 2014 session #508, “Camera Capture: Manual Controls”:
iOS devices are presented as CoreMedia IO “DAL” plug-ins.
You must opt in to see iOS screen devices in your OS X app:
CMIOObjectPropertyAddress prop = {
    kCMIOHardwarePropertyAllowScreenCaptureDevices,
    kCMIOObjectPropertyScopeGlobal,
    kCMIOObjectPropertyElementMaster
};
UInt32 allow = 1;
CMIOObjectSetPropertyData(kCMIOObjectSystemObject, &prop,
                          0, NULL, sizeof(allow), &allow);
Also, see my blog for a CoreMediaIO capture sample that directly intercepts the raw compressed payload sent out from the device.
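Putting the two pieces together, here is a minimal sketch of a command-line tool that performs the CoreMediaIO opt-in and then enumerates capture devices. It assumes you link against the CoreMediaIO and AVFoundation frameworks; the exact media type the screen device reports can vary, so it simply lists every device rather than filtering by AVMediaTypeVideo:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMediaIO/CMIOHardware.h>

// Opt in: ask CoreMediaIO to expose iOS screen-capture "DAL" devices.
// This must run before you enumerate devices.
static void EnableScreenCaptureDevices(void) {
    CMIOObjectPropertyAddress prop = {
        kCMIOHardwarePropertyAllowScreenCaptureDevices,
        kCMIOObjectPropertyScopeGlobal,
        kCMIOObjectPropertyElementMaster
    };
    UInt32 allow = 1;
    CMIOObjectSetPropertyData(kCMIOObjectSystemObject, &prop,
                              0, NULL, sizeof(allow), &allow);
}

int main(int argc, const char *argv[]) {
    @autoreleasepool {
        EnableScreenCaptureDevices();

        // List every capture device the system now exposes.
        for (AVCaptureDevice *device in [AVCaptureDevice devices]) {
            NSLog(@"Found device: %@ (%@)",
                  device.localizedName, device.uniqueID);
        }
    }
    return 0;
}
```

Note that a plugged-in iOS device may not appear immediately after the opt-in; a real app should also observe AVCaptureDeviceWasConnectedNotification rather than enumerating only once.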