Swift 3 - Custom camera view - display still image of photo taken after button click
I am using Swift 3, Xcode 8.2.
I have a custom camera view which displays the video feed fine, and a button that I want to act as a shutter. When the user taps the button, I want a picture taken and displayed on the screen (e.g. like Snapchat or Facebook Messenger style camera behavior).
Here is my code:
import UIKit
import AVFoundation
class CameraVC: UIViewController, AVCapturePhotoCaptureDelegate {
// this is where the camera feed from the phone is going to be displayed
@IBOutlet var cameraView : UIView!
var shutterButton : UIButton = UIButton.init(type: .custom)
// manages capture activity and coordinates the flow of data from input devices to capture outputs.
var capture_session = AVCaptureSession()
// a capture output for use in workflows related to still photography.
var session_output = AVCapturePhotoOutput()
// preview layer that we will have on our view so users can see the photo we took
var preview_layer = AVCaptureVideoPreviewLayer()
// still picture image is what we show as the picture taken, frozen on the screen
var still_picture_image : UIImage!
... //more code in viewWillAppear that sets up the camera feed
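For context, a minimal sketch of what that elided viewWillAppear setup typically looks like with the Swift 3 AVFoundation API (the property names match the ones declared above; error handling is simplified, and the exact setup in the original code may differ):

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // grab the default back camera (Swift 3 / iOS 10 API)
    guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) else { return }
    do {
        let input = try AVCaptureDeviceInput(device: camera)
        if capture_session.canAddInput(input) {
            capture_session.addInput(input)
        }
        // the photo output must be attached to the session BEFORE
        // calling capturePhoto(with:delegate:), or the delegate never fires
        if capture_session.canAddOutput(session_output) {
            capture_session.addOutput(session_output)
        }
        // show the live feed in the camera view
        preview_layer = AVCaptureVideoPreviewLayer(session: capture_session)
        preview_layer.frame = cameraView.bounds
        preview_layer.videoGravity = AVLayerVideoGravityResizeAspectFill
        cameraView.layer.addSublayer(preview_layer)
        capture_session.startRunning()
    } catch {
        print(error.localizedDescription)
    }
}
```

If session_output was never added to the session, capturePhoto(with:delegate:) fails silently from the caller's point of view and only reports the problem through the error parameter of the delegate callback, which is consistent with the symptom described below.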
// called when the shutter button is pressed
func shutterButtonPressed() {
// get the actual video feed and take a photo from that feed
session_output.capturePhoto(with: AVCapturePhotoSettings.init(format: [AVVideoCodecKey : AVVideoCodecJPEG]), delegate: self as AVCapturePhotoCaptureDelegate)
}
func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
// take the session output, get the buffer, and create an image from that buffer
if let sampleBuffer = photoSampleBuffer,
let previewBuffer = previewPhotoSampleBuffer,
let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
print("Here") // doesn't get here
}
}
It doesn't seem to print "Here" when running this code, and I can't find any Swift 3 tutorials on how to display this image. I'm guessing I want to take the imageData, assign it to my still_picture_image, and overlay that over the camera feed somehow.
Any help or a pointer in the right direction would be greatly appreciated.
EDIT
After adding the following to my code:
if let error = error {
print(error.localizedDescription)
}
I still don't get any error printed.
Add the following code to your delegate method to print out the error being thrown:
if let error = error {
print(error.localizedDescription)
}
Once you get your error resolved, I think this post should help you to extract the image: Taking photo with custom camera Swift 3
Okay, I figured out my problem:
First, drag a UIImageView onto the Storyboard and have it take up the entire screen. This is where the still picture will be displayed after pressing the shutter button.
Create that variable in the code and link it:
@IBOutlet weak var stillPicture : UIImageView!
Then, in viewDidLoad, make sure that you insert the UIImageView on top of the camera view:
self.view.insertSubview(stillPicture, aboveSubview: your_camera_view)
This is the function that is called when the shutter button is clicked:
func shutterButtonPressed() {
let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
kCVPixelBufferWidthKey as String: 160,
kCVPixelBufferHeightKey as String: 160,
]
settings.previewPhotoFormat = previewFormat
session_output.capturePhoto(with: settings, delegate: self)
}
Then, in your capture delegate:
func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
if let error = error {
print(error.localizedDescription)
}
// take the session output, get the buffer, and create an image from that buffer
if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
// this is the image that the user has taken!
let takenImage : UIImage = UIImage(data: dataImage)!
stillPicture.image = takenImage
} else {
print("Error setting up photo capture")
}
}