
Capture photo with AVFoundation in Swift

I'm building a live-filter camera app in Swift.

I'm switching from AVCaptureVideoDataOutput() to AVCaptureStillImageOutput() to implement the photo-capture function. After the change, there is no preview view when I open the app.

The photo-capture function itself works: I can hear the shutter sound when I tap the main group area. There is just no preview.

Here is my full code:

import Foundation
import UIKit
import AVFoundation
import CoreMedia

let CIHueAdjust = "CIHueAdjust"
let CIHueAdjustFilter = CIFilter(name: "CIHueAdjust", withInputParameters: ["inputAngle" : 1.24])

let Filters = [CIHueAdjust: CIHueAdjustFilter]

let FilterNames = [String](Filters.keys).sort()

class LiveCamViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
let mainGroup = UIStackView()
let imageView = UIImageView(frame: CGRectZero)
let filtersControl = UISegmentedControl(items: FilterNames)
var videoOutput = AVCaptureStillImageOutput()

override func viewDidLoad()
{
    super.viewDidLoad()

    view.addSubview(mainGroup)
    mainGroup.axis = UILayoutConstraintAxis.Vertical
    mainGroup.distribution = UIStackViewDistribution.Fill

    mainGroup.addArrangedSubview(imageView)
    mainGroup.addArrangedSubview(filtersControl)
    mainGroup.addGestureRecognizer(UITapGestureRecognizer(target: self, action:#selector(LiveCamViewController.saveToCamera(_:))))

    imageView.contentMode = UIViewContentMode.ScaleAspectFit

    filtersControl.selectedSegmentIndex = 0

    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto

    let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

    do
    {
        let input = try AVCaptureDeviceInput(device: backCamera)

        captureSession.addInput(input)
    }
    catch
    {
        print("can't access camera")
        return
    }

    //get captureOutput invoked
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    view.layer.addSublayer(previewLayer)

    videoOutput.outputSettings = [AVVideoCodecKey:AVVideoCodecJPEG]

    if captureSession.canAddOutput(videoOutput)
    {
        captureSession.addOutput(videoOutput)
    }

    captureSession.startRunning()
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!)
{
    guard let filter = Filters[FilterNames[filtersControl.selectedSegmentIndex]] else
    {
        return
    }

    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

    filter!.setValue(cameraImage, forKey: kCIInputImageKey)

    let filteredImage = UIImage(CIImage: filter!.valueForKey(kCIOutputImageKey) as! CIImage!)
    let fixedImage = correctlyOrientedImage(filteredImage)

    dispatch_async(dispatch_get_main_queue())
    {
        self.imageView.image = fixedImage
    }

}

func correctlyOrientedImage(image: UIImage) -> UIImage {

    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    image.drawInRect(CGRectMake(0, 0, image.size.width, image.size.height))
    let normalizedImage:UIImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    let imageRef: CGImageRef = normalizedImage.CGImage!
    let rotatedImage: UIImage = UIImage(CGImage: imageRef, scale: 1.0, orientation: .Right)

    return rotatedImage
}

override func viewDidLayoutSubviews()
{
    mainGroup.frame = CGRect(x: 37, y: 115, width: 301, height: 481)
}

func saveToCamera(sender: UITapGestureRecognizer) {

    videoOutput.outputSettings = [AVVideoCodecKey:AVVideoCodecJPEG]

    if let videoConnection = videoOutput.connectionWithMediaType(AVMediaTypeVideo) {
        videoOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            UIImageWriteToSavedPhotosAlbum(UIImage(data: imageData)!, nil, nil, nil)
        }
    }
}

}

Thanks.

If you just want to capture a photo or video from the camera, you can use UIImagePickerController instead of AVFoundation. Check Apple's documentation for more information about it.
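For illustration, here is a minimal sketch of that approach, written in the same Swift 2-era syntax as your code. The class name and the presentCamera method are just placeholders, not part of your project:

import UIKit

class PickerViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // Present the system camera UI; it provides its own preview and shutter button.
    func presentCamera() {
        guard UIImagePickerController.isSourceTypeAvailable(.Camera) else { return }
        let picker = UIImagePickerController()
        picker.sourceType = .Camera
        picker.delegate = self
        presentViewController(picker, animated: true, completion: nil)
    }

    // Called when the user takes a photo; save it the same way your saveToCamera does.
    func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {
        if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        }
        picker.dismissViewControllerAnimated(true, completion: nil)
    }
}

Note that UIImagePickerController shows Apple's stock camera UI, so it won't let you render your live CIFilter preview the way your AVCaptureVideoDataOutput code did.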

Update :

Refer to this link. It has many examples of customizing UIImagePickerController, and you can also refer to this and this answer on Stack Overflow.

Hope this will help :)

The problem might be that you never set a frame for the previewLayer.

Type this:

previewLayer.frame = self.view.bounds

Also explicitly set the video gravity (I think this is optional):

previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill // Or choose some other option if you prefer
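
Putting both lines into your viewDidLoad, the preview-layer setup would look roughly like this (same Swift 2-era API as your code; everything else stays unchanged):

let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
previewLayer.frame = self.view.bounds                            // without a frame the layer has zero size and nothing shows
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill  // fill the view, cropping if necessary
view.layer.addSublayer(previewLayer)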
