Capture a Photo in a Custom Camera View (Swift 3)


Problem description

I am using Swift 3, Xcode 8.2.

A lot of the tutorials that I find on this topic seem to always be in Swift 2. I have a custom camera view already created and I am trying to capture a photo from it.

I've set up an @IBAction function below. The variable session_output is an AVCapturePhotoOutput:

@IBAction func shutterButtonPressed() {
    session_output.capturePhoto(with: AVCapturePhotoSettings.init(format: [AVVideoCodecKey : AVVideoCodecJPEG]), delegate: <#T##AVCapturePhotoCaptureDelegate#>)
}

I don't know what to put in the delegate field and how to read the photo from the buffer after it is captured. The difference between Swift 2 and 3 is so stark in this case that I can't even bumble my way through it which I've been fairly successful at doing when following most Swift 2 tutorials.

Any help would be greatly appreciated.

Answer

Set the delegate to self and make your class conform to AVCapturePhotoCaptureDelegate.
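A minimal sketch of how that fits together, assuming the capture session is already configured elsewhere in the class (the class name CameraViewController is hypothetical; session_output follows the question's naming):

```swift
import UIKit
import AVFoundation

// Hypothetical view controller; assumes session_output has already been
// added to a running AVCaptureSession during setup.
class CameraViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    let session_output = AVCapturePhotoOutput()

    @IBAction func shutterButtonPressed() {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
        // Pass self as the delegate; the conformance declared above lets
        // the delegate callback below receive the captured photo.
        session_output.capturePhoto(with: settings, delegate: self)
    }
}
```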

You can get the captured image from the delegate method below:

func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {

    // Convert the JPEG sample buffer to Data, then to a UIImage.
    // The preview buffer parameter accepts nil, so it does not need
    // its own unwrap.
    if let sampleBuffer = photoSampleBuffer,
       let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
           forJPEGSampleBuffer: sampleBuffer,
           previewPhotoSampleBuffer: previewPhotoSampleBuffer),
       let image = UIImage(data: dataImage) {
        print(image.size)
    }
}

