
captureOutput from AVCaptureVideoDataOutputSampleBufferDelegate is not being called

This seems to be a fairly common problem, and I have linked similar posts below. Please do not mark this question as a duplicate: I have carefully read the linked posts and spent considerable time implementing every solution offered in their answers and comments.

As described in this article, I am looking to use the device camera as a light sensor. Unfortunately, the captureOutput function of my AVCaptureVideoDataOutputSampleBufferDelegate is never called. It may be relevant that I am attempting this in a SwiftUI app; I have not seen this issue posted or solved in the context of a SwiftUI app.

class VideoStream: NSObject, ObservableObject, 
    AVCaptureVideoDataOutputSampleBufferDelegate {
    
    @Published var luminosityReading : Double = 0.0
    
    var session : AVCaptureSession!
        
    override init() {
        super.init()
        authorizeCapture()
    }

    func authorizeCapture() {
        // request camera permissions and call beginCapture()
        ...
    }

    func beginCapture() {
        print("beginCapture entered") // prints
        session = AVCaptureSession()
        session.beginConfiguration()
        let videoDevice = bestDevice() // func def omitted for readability
        print("Device: \(videoDevice)") // prints a valid device
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: 
             "VideoQueue"))
        session.addOutput(videoOutput)
        session.sessionPreset = .medium
        session.commitConfiguration()
        session.startRunning()
     }

    // From: https://stackoverflow.com/questions/41921326/how-to-get-light-value-from-avfoundation/46842115#46842115
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, 
        from connection: AVCaptureConnection) {

        print("captureOutput entered")  // never printed
        
        // Retrieving EXIF data of camara frame buffer
        let rawMetadata = CMCopyDictionaryOfAttachments(allocator: nil, target: sampleBuffer, attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate))
        let metadata = CFDictionaryCreateMutableCopy(nil, 0, rawMetadata) as NSMutableDictionary
        let exifData = metadata.value(forKey: "{Exif}") as? NSMutableDictionary
        
        let FNumber : Double = exifData?["FNumber"] as! Double
        let ExposureTime : Double = exifData?["ExposureTime"] as! Double
        let ISOSpeedRatingsArray = exifData!["ISOSpeedRatings"] as? NSArray
        let ISOSpeedRatings : Double = ISOSpeedRatingsArray![0] as! Double
        let CalibrationConstant : Double = 50
        
        //Calculating the luminosity
        let luminosity : Double = (CalibrationConstant * FNumber * FNumber ) / ( ExposureTime * ISOSpeedRatings )
        luminosityReading = luminosity
    }
}
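
(For a quick sanity check of the formula above, assuming representative EXIF values of FNumber 1.8, ExposureTime 1/60 s and ISOSpeedRatings 100, the calibration constant of 50 gives 50 × 1.8² / ((1/60) × 100) ≈ 97 Lux.)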

Finally, I instantiate VideoStream as a StateObject in ContentView and attempt to read the updated luminosityReading:

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(String(format: "%.2f Lux", videoStream.luminosityReading))
            .padding()
    }
}

To repeat: I have carefully read and implemented the solutions described in these similar posts:

Using AVCaptureVideoDataOutputSampleBufferDelegate without a preview window

captureOutput not being called

captureOutput not called from the delegate

captureOutput not being called by AVCaptureAudioDataOutputSampleBufferDelegate

In Swift, adapted AVCaptureVideoDataOutputSampleBufferDelegate, but captureOutput never gets called

AVCaptureVideoDataOutput captureOutput not being called

Swift - captureOutput is not being executed

Why is the AVCaptureVideoDataOutputSampleBufferDelegate method not called

Why is captureOutput never called?

func captureOutput is never called

captureOutput() function is never called swift4

Minimal reproducible example:

import SwiftUI
import AVKit

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(String(format: "%.2f Lux", videoStream.luminosityReading))
    }
}

class VideoStream: NSObject, ObservableObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    
    @Published var luminosityReading : Double = 0.0
    
    var session : AVCaptureSession!
        
    override init() {
        super.init()
        authorizeCapture()
    }

    func authorizeCapture() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized: // The user has previously granted access to the camera.
            beginCapture()
        case .notDetermined: // The user has not yet been asked for camera access.
            AVCaptureDevice.requestAccess(for: .video) { granted in
                if granted {
                    self.beginCapture()
                }
            }
            
        case .denied: // The user has previously denied access.
            return
            
        case .restricted: // The user can't grant access due to restrictions.
            return
        }
    }

    func beginCapture() {
        
        print("beginCapture entered")
        
        let testDevice = AVCaptureDevice.default(for: .video)
        print("Image Capture Device: \(testDevice)")
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: testDevice!),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
        session.addOutput(videoOutput)
        
        session.sessionPreset = .medium
        session.commitConfiguration()
        session.startRunning()
    }
    
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        print("captureOutput entered")  // never printed
        
        // light meter logic to update luminosityReading
    }
}

You are missing adding the input:

if session.canAddInput(videoDeviceInput){
    session.addInput(videoDeviceInput)
}
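
For reference, here is a minimal sketch of how a corrected beginCapture() could look. It is meant to drop back into the question's VideoStream class (it relies on its session property and on self being the sample-buffer delegate), and it uses AVCaptureDevice.default(for: .video) as in the minimal reproducible example. Besides the missing addInput call, note that the minimal example also never assigns the session before configuring it.

func beginCapture() {
    print("beginCapture entered")

    session = AVCaptureSession()            // the minimal example never assigns this
    session.beginConfiguration()

    guard
        let videoDevice = AVCaptureDevice.default(for: .video),
        let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
        session.canAddInput(videoDeviceInput)
    else {
        print("Camera selection failed")
        return
    }
    session.addInput(videoDeviceInput)      // the line the original code omits

    let videoOutput = AVCaptureVideoDataOutput()
    guard session.canAddOutput(videoOutput) else {
        print("Error creating video output")
        return
    }
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
    session.addOutput(videoOutput)

    session.sessionPreset = .medium
    session.commitConfiguration()
    session.startRunning()                  // with both input and output attached, captureOutput should fire
}

Once an input is attached, the session has frames to deliver and the delegate queue starts receiving sample buffers. One further note: since the delegate runs on the "VideoQueue" dispatch queue, the assignment to the @Published luminosityReading should probably be hopped back to the main queue (for example via DispatchQueue.main.async) so SwiftUI observes it on the main thread.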

