Problem Description
Solution
I have been working on using a custom camera, and I recently upgraded to Xcode 8 beta along with Swift 3. I originally had this:
var stillImageOutput: AVCaptureStillImageOutput?
However, I am now getting the warning:
As this is fairly new, I have not seen much information on this. Here is my current code:
var captureSession: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?
func clickPicture() {
    if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {
        videoConnection.videoOrientation = .portrait
        stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
            if sampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProvider(data: imageData!)
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
                let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)
            }
        })
    }
}
I have tried looking at AVCapturePhotoCaptureDelegate, but I am not quite sure how to use it. Does anybody know how to use this? Thanks.
It is really easy to use AVCapturePhotoOutput. You need an object conforming to AVCapturePhotoCaptureDelegate, which receives the captured CMSampleBuffer. You can also get a preview image if you give the AVCapturePhotoSettings a previewFormat:
class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        // Request a small (160x160) preview image alongside the full-size photo.
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                             kCVPixelBufferWidthKey as String: 160,
                             kCVPixelBufferHeightKey as String: 160]
        settings.previewPhotoFormat = previewFormat
        self.cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // Swift 3 delegate callback; note the error parameter is Error?, not NSError?.
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print(error.localizedDescription)
            return
        }
        if let sampleBuffer = photoSampleBuffer,
           let previewBuffer = previewPhotoSampleBuffer,
           let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            // UIImage(data:) is failable, so unwrap before using it.
            let image = UIImage(data: dataImage)
            print("image size: \(String(describing: image?.size))")
        }
    }
}
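For completeness, the delegate object above still needs to be attached to a running capture session. A minimal sketch of the wiring, using the Swift 3 era API names from this answer and assuming camera permission has already been granted, might look like:

```swift
import AVFoundation

let captureOutput = CameraCaptureOutput()
let session = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPresetPhoto

// Wire the default video camera in as input and the photo output as output.
if let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
}
if session.canAddOutput(captureOutput.cameraOutput) {
    session.addOutput(captureOutput.cameraOutput)
}

session.startRunning()
// The delegate callback fires asynchronously with the JPEG sample buffer.
captureOutput.capturePhoto()
```

Without this setup, capturePhoto() has no session to capture from, and the delegate callback will report an error.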
For more information visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput
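Note that on iOS 11 and later the sample-buffer based delegate method shown above was itself deprecated in favor of photoOutput(_:didFinishProcessingPhoto:error:), which delivers an AVCapturePhoto object instead of raw buffers. A sketch of the newer callback, assuming the same delegate class:

```swift
// iOS 11+ replacement for the sample-buffer based delegate callback.
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let error = error {
        print(error.localizedDescription)
        return
    }
    // fileDataRepresentation() returns the encoded (e.g. JPEG) photo data.
    if let data = photo.fileDataRepresentation(), let image = UIImage(data: data) {
        print("captured image size: \(image.size)")
    }
}
```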