Problem Description

I'm currently developing an iOS app that applies CoreImage to the camera feed in order to take photos and videos, and I've run into a bit of a snag.

Up till now I've been using AVCaptureVideoDataOutput to obtain the sample buffers and manipulate them with CoreImage, then displaying a simple preview, as well as using it to capture photos and save them.

When I tried to implement video recording by writing the sample buffers to a video as I received them from the AVCaptureVideoDataOutput, it had a very slow frame rate (probably because of the other image-related processing that was going on).

So I was wondering, is it possible to have an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput running on the same AVCaptureSession simultaneously?

I gave it a quick go, and found that when I added the extra output, my AVCaptureVideoDataOutput stopped receiving information.
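
Roughly, the setup I'm describing looks something like this; the identifiers (session, dataOutput, movieOutput) are just for illustration:

AVCaptureSession *session = [[AVCaptureSession alloc] init];
// ... add the camera and microphone AVCaptureDeviceInputs here ...

// The data output that feeds the CoreImage processing.
AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
[dataOutput setSampleBufferDelegate:self queue:dispatch_queue_create("video data queue", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:dataOutput]) {
    [session addOutput:dataOutput];
}

// The extra movie file output; once this was added, the data output delegate stopped being called.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}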

If I can get it working, I'm hoping it means that I can simply use the 2nd output to record video at high frame rates, and do post-processing on the video after the user has stopped recording.

Any help will be greatly appreciated.

Recommended Answer

It's easier than you think.

See:

  1. Capture data using AVCaptureVideoDataOutput.
  2. Create a new dispatch queue before recording, e.g. recordingQueue: recordingQueue = dispatch_queue_create("Movie Recording Queue", DISPATCH_QUEUE_SERIAL);
  3. In the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, capture the sample buffer, retain it, and then, on the recording queue, write it to the file:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // Retain the buffer so it stays valid after this delegate callback returns.
    CFRetain(sampleBuffer);

    // Hand the buffer off to the serial recording queue so writing never blocks the capture callback.
    dispatch_async(recordingQueue, ^{

        if (assetWriter) {

            if (connection == videoConnection) {
                [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
            } else if (connection == audioConnection) {
                [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeAudio];
            }

        }

        CFRelease(sampleBuffer);
    });
}

- (void)writeSampleBuffer:(CMSampleBufferRef)sampleBuffer ofType:(NSString *)mediaType
{
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // Start the writer session at the timestamp of the first buffer we receive.
    if ( assetWriter.status == AVAssetWriterStatusUnknown ) {

        if ([assetWriter startWriting]) {
            [assetWriter startSessionAtSourceTime:presentationTime];
        } else {
            NSLog(@"Error writing initial buffer");
        }
    }

    if ( assetWriter.status == AVAssetWriterStatusWriting ) {

        if ([mediaType isEqualToString:AVMediaTypeVideo]) {
            if (assetWriterVideoIn.readyForMoreMediaData) {

                if (![assetWriterVideoIn appendSampleBuffer:sampleBuffer]) {
                    NSLog(@"Error writing video buffer");
                }
            }
        }
        else if ([mediaType isEqualToString:AVMediaTypeAudio]) {
            if (assetWriterAudioIn.readyForMoreMediaData) {

                if (![assetWriterAudioIn appendSampleBuffer:sampleBuffer]) {
                    NSLog(@"Error writing audio buffer");
                }
            }
        }
    }
}
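
The snippet above assumes that assetWriter, assetWriterVideoIn and assetWriterAudioIn were created before recording started. A minimal sketch of that setup follows; the file type, output settings, and outputURL are illustrative assumptions, not part of the original answer:

// Assumes AVFoundation is imported and outputURL is the destination movie file URL.
NSError *error = nil;
assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:&error];

// Video input; use settings that match your capture resolution.
assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                        outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                                          AVVideoWidthKey  : @1280,
                                                                          AVVideoHeightKey : @720 }];
assetWriterVideoIn.expectsMediaDataInRealTime = YES;

// Audio input; AAC mono at 44.1 kHz as an example.
assetWriterAudioIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                        outputSettings:@{ AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
                                                                          AVNumberOfChannelsKey : @1,
                                                                          AVSampleRateKey       : @44100.0 }];
assetWriterAudioIn.expectsMediaDataInRealTime = YES;

if ([assetWriter canAddInput:assetWriterVideoIn]) {
    [assetWriter addInput:assetWriterVideoIn];
}
if ([assetWriter canAddInput:assetWriterAudioIn]) {
    [assetWriter addInput:assetWriterAudioIn];
}

When the user stops recording, finish the file on the recording queue with [assetWriter finishWritingWithCompletionHandler:], and then run any post-processing on the completed movie.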

