I'm trying to build an application that captures frames from the camera and processes them with OpenCV before saving the files to the device, at a specific frame rate.

The problem I'm currently running into is that AVCaptureVideoDataOutputSampleBufferDelegate doesn't seem to respect the AVCaptureDevice.activeVideoMinFrameDuration and AVCaptureDevice.activeVideoMaxFrameDuration settings.

captureOutput is called far more often than 2 frames per second, which is what the settings above should produce.

您是否知道有没有代表,如何实现这一目标?
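
For reference, one way to confirm the callback rate is to log the interval between consecutive frames. A minimal sketch (the lastTimestamp property here is added purely for measurement and is not part of the code below):

var lastTimestamp = kCMTimeZero

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    // At 2 fps the interval between frames should be roughly 0.5s
    if CMTimeCompare(lastTimestamp, kCMTimeZero) != 0 {
        let delta = CMTimeGetSeconds(CMTimeSubtract(timestamp, lastTimestamp))
        print("Time since last frame: \(delta)s (~\(1.0 / delta) fps)")
    }
    lastTimestamp = timestamp
}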

ViewController:

override func viewDidLoad() {
    super.viewDidLoad()

}

override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    setupCaptureSession()
}

func setupCaptureSession() {

    let session : AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPreset1280x720

    let videoDevices : [AVCaptureDevice] = AVCaptureDevice.devices() as! [AVCaptureDevice]

    for device in videoDevices {
        if device.position == AVCaptureDevicePosition.Back {
            let captureDevice : AVCaptureDevice = device

            do {
                try captureDevice.lockForConfiguration()
                // 1/2 second per frame, i.e. 2 frames per second
                captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
                captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
                captureDevice.unlockForConfiguration()

                let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

                if session.canAddInput(input) {
                    try session.addInput(input)
                }

                let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

                let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
                output.setSampleBufferDelegate(self, queue: dispatch_queue)

                session.addOutput(output)

                session.startRunning()

                let previewLayer = AVCaptureVideoPreviewLayer(session: session)
                previewLayer.connection.videoOrientation = .LandscapeRight

                let previewBounds : CGRect = CGRectMake(0,0,self.view.frame.width/2,self.view.frame.height+20)
                previewLayer.backgroundColor = UIColor.blackColor().CGColor
                previewLayer.frame = previewBounds
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                self.imageView.layer.addSublayer(previewLayer)

                self.previewMat.frame = CGRectMake(previewBounds.width, 0, previewBounds.width, previewBounds.height)

            } catch _ {

            }
            break
        }
    }

}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    self.wrapper.processBuffer(self.getUiImageFromBuffer(sampleBuffer), self.previewMat)
}
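
getUiImageFromBuffer and wrapper.processBuffer are my own helpers and aren't shown here. For context, a rough sketch of what such a buffer-to-UIImage conversion can look like (not the exact implementation, and assuming the sample buffer is backed by a CVPixelBuffer, which is the case for AVCaptureVideoDataOutput's default pixel format):

func getUiImageFromBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let ciImage = CIImage(CVPixelBuffer: pixelBuffer)
    // In production the CIContext should be created once and reused;
    // creating one per frame is expensive
    let context = CIContext(options: nil)
    let rect = CGRectMake(0, 0,
                          CGFloat(CVPixelBufferGetWidth(pixelBuffer)),
                          CGFloat(CVPixelBufferGetHeight(pixelBuffer)))
    let cgImage = context.createCGImage(ciImage, fromRect: rect)
    return UIImage(CGImage: cgImage)
}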

Best Answer

So I've figured out the problem.

In the comments above the activeVideoMinFrameDuration property in AVCaptureDevice.h, it states:

    On iOS, the receiver's activeVideoMinFrameDuration resets to its default value under the following conditions:

    - The receiver's activeFormat changes
    - The receiver's AVCaptureDeviceInput's session's sessionPreset changes
    - The receiver's AVCaptureDeviceInput is added to a session

The last bullet point was causing my problem, so moving the frame-duration configuration until after the input has been added to the session solved it for me:

        do {

            let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

            if session.canAddInput(input) {
                try session.addInput(input)
            }

            // Configure the frame rate only after the input has been added
            // to the session; adding the input resets these durations
            try captureDevice.lockForConfiguration()
            captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
            captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
            captureDevice.unlockForConfiguration()

            let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

            let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
            output.setSampleBufferDelegate(self, queue: dispatch_queue)

            session.addOutput(output)
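
As an extra safeguard, if a device or format ever delivers frames faster than wanted despite this, frames can also be throttled manually inside the delegate. A minimal sketch (the lastOutputTime property is an assumption added for this purpose, not part of my code above):

var lastOutputTime = kCMTimeZero
let minimumInterval = CMTimeMake(1, 2) // at most 2 frames per second

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    // Drop any frame that arrives sooner than the configured interval
    if CMTimeCompare(CMTimeSubtract(timestamp, lastOutputTime), minimumInterval) < 0 {
        return
    }
    lastOutputTime = timestamp
    self.wrapper.processBuffer(self.getUiImageFromBuffer(sampleBuffer), self.previewMat)
}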

A similar question about iOS Swift - AVCaptureSession - capturing frames at a fixed frame rate can be found on Stack Overflow: https://stackoverflow.com/questions/34718833/
