This article explains how to use AVCapturePhotoOutput, and should be a useful reference for anyone hitting the same deprecation problem.

Problem description

I have been working on using a custom camera, and I recently upgraded to Xcode 8 beta along with Swift 3. I originally had this:

var stillImageOutput: AVCaptureStillImageOutput?

However, I am now getting the warning:

'AVCaptureStillImageOutput' was deprecated in iOS 10.0: Use AVCapturePhotoOutput instead

As this is fairly new, I have not seen much information on this. Here is my current code:

var captureSession: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

func clickPicture() {
    if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {
        videoConnection.videoOrientation = .portrait
        stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
            if sampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProvider(data: imageData!)
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
                let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)
            }
        })
    }
}

I have tried to look at AVCapturePhotoCaptureDelegate, but I am not quite sure how to use it. Does anybody know how to use this? Thanks.

Recommended answer

Updated to Swift 4. It's really easy to use AVCapturePhotoOutput.

You need an AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer.

You can also get a preview image if you tell the AVCapturePhotoSettings the previewFormat.

    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {
            let settings = AVCapturePhotoSettings()
            // Ask for a small (160x160) preview image alongside the full-size photo.
            let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160]
            settings.previewPhotoFormat = previewFormat
            self.cameraOutput.capturePhoto(with: settings, delegate: self)
        }

        // Delegate callback fired once the captured photo has been processed.
        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
               let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
                print("image: \(UIImage(data: dataImage)?.size)") // Your image
            }
        }
    }

For more information visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: You have to add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like: session.addOutput(output), and then: output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations.
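For context, here is a minimal sketch of that wiring. The session and camera-input setup (the session and cameraCapture names, the default back camera, the missing error handling) are illustrative assumptions rather than part of the original answer; the point is only that addOutput must happen before capturePhoto:

    import AVFoundation

    // Hypothetical wiring sketch: add the photo output to a running session,
    // then trigger a capture. Uses the CameraCaptureOutput class from above.
    let session = AVCaptureSession()
    let cameraCapture = CameraCaptureOutput()

    if let camera = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: camera),
       session.canAddInput(input), session.canAddOutput(cameraCapture.cameraOutput) {
        session.addInput(input)
        session.addOutput(cameraCapture.cameraOutput)  // must be added before capturing
        session.startRunning()
        cameraCapture.capturePhoto()                   // delegate callback delivers the JPEG data
    }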

That concludes this article on how to use AVCapturePhotoOutput; hopefully the recommended answer helps.
