
Reading iOS light sensor with ARKit

Is there a way to access the ambient light sensor of an iOS device using ARKit, without using the AR features at all?

https://developer.apple.com/documentation/arkit/arlightestimate/2878308-ambientintensity

In other words, can I access the value of "ambientIntensity" without creating an AR scene?

See the docs for ARLightEstimate.ambientIntensity:

This value is based on the internal exposure compensation of the camera device.

In other words, if you want to use the device camera to estimate local lighting conditions and aren't otherwise using ARKit, you might be better off using the camera APIs. (For one thing, those APIs are available on all iOS 11 devices and several earlier iOS versions, rather than needing the steep OS/hardware requirements of ARKit.)

A quick tour of what you'll need to do there:

  1. Set up an AVCaptureSession and choose the camera AVCaptureDevice that you want. You may or may not need to wire up a video/photo capture output (which in your case will be mostly unused).
  2. Start running the capture session.
  3. Use KVO to monitor the exposure, temperature, and/or white balance related properties on AVCaptureDevice.
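The three steps above can be sketched roughly as follows. This is a minimal illustration, not a complete implementation; the class and property names (ExposureMonitor, etc.) are invented for the example, and it assumes camera permission has already been granted:

```swift
import AVFoundation

// Sketch of steps 1–3: session setup, start, and KVO on exposure properties.
class ExposureMonitor: NSObject {
    let session = AVCaptureSession()
    private var isoObservation: NSKeyValueObservation?
    private var durationObservation: NSKeyValueObservation?

    func start() throws {
        // Step 1: pick a camera and wire it into a capture session.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        session.commitConfiguration()

        // Step 3: observe exposure-related properties with (block-based) KVO.
        isoObservation = camera.observe(\.iso, options: [.new]) { _, change in
            if let iso = change.newValue { print("ISO: \(iso)") }
        }
        durationObservation = camera.observe(\.exposureDuration, options: [.new]) { _, change in
            if let duration = change.newValue {
                print("Exposure duration: \(CMTimeGetSeconds(duration)) s")
            }
        }

        // Step 2: start the session (dispatch to a background queue in real code).
        session.startRunning()
    }
}
```

Lower ISO and shorter exposure durations generally indicate a brighter scene, so watching these values gives a rough proxy for ambient light without ARKit.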

You can find (older, ObjC) code covering all this (and a lot more, so you'll need to extract the parts that are relevant to you) in Apple's AVCamManual sample code.

You don't need an ARSCNView, but you do need to have a running ARSession: https://developer.apple.com/documentation/arkit/arsession

Once you have that set up, you can call currentFrame, which will give you an ARFrame that has a lightEstimate property containing the ambientIntensity estimate.
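A minimal sketch of this approach, running an ARSession without any view and reading the estimate from the delegate callback (the LightEstimator class name is illustrative, and an ARKit-capable device is assumed):

```swift
import ARKit

// Run a view-less ARSession purely to get light estimates.
class LightEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        let config = ARWorldTrackingConfiguration()
        config.isLightEstimationEnabled = true
        session.run(config)
    }

    // Called for every camera frame while the session runs.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let estimate = frame.lightEstimate {
            // ambientIntensity is in lumens; roughly 1000 is a well-lit scene.
            print("Ambient intensity: \(estimate.ambientIntensity)")
        }
    }
}
```

Polling session.currentFrame?.lightEstimate on a timer would work too; the delegate route just avoids sampling frames that haven't changed.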

Yes, in the captureOutput function you implement when adopting the protocol AVCaptureVideoDataOutputSampleBufferDelegate:

// Requires: import AVFoundation, import ImageIO
// Note: this is a delegate method of AVCaptureVideoDataOutputSampleBufferDelegate,
// not an override of a superclass method.
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        // Retrieving EXIF data of the camera frame buffer
        guard let rawMetadata = CMCopyDictionaryOfAttachments(allocator: nil, target: sampleBuffer, attachmentMode: kCMAttachmentMode_ShouldPropagate) else { return }
        let metadata = CFDictionaryCreateMutableCopy(nil, 0, rawMetadata) as NSMutableDictionary
        let exifData = metadata.value(forKey: "{Exif}") as? NSMutableDictionary

        if let light = exifData?[kCGImagePropertyExifBrightnessValue] as? NSNumber {
            print("Light \(light.floatValue)")
        } else {
            print("problem with light")
        }
}
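For the delegate method above to be called, a video data output has to be attached to a running session. A minimal sketch of that wiring (the function name and queue label are illustrative):

```swift
import AVFoundation

// Attach a video data output so the captureOutput delegate method fires
// for each camera frame.
func attachVideoOutput(to session: AVCaptureSession,
                       delegate: AVCaptureVideoDataOutputSampleBufferDelegate) {
    let output = AVCaptureVideoDataOutput()
    output.setSampleBufferDelegate(delegate, queue: DispatchQueue(label: "sample-buffer"))
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
}
```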
