AVCaptureMovieFileOutput

This article covers how to deal with the iOS error "error in __connection_block_invoke_2: Connection interrupted"; it should be a useful reference for anyone hitting the same problem.

Problem Description

Xcode/iOS 8/AVFoundation related error in console:

error in __connection_block_invoke_2: Connection interrupted

I am just adding AVCaptureVideoDataOutput to Apple's sample app 'AVCamManualUsingtheManualCaptureAPI'

What I added is:

    // CoreImage wants BGRA pixel format
    NSDictionary *outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInteger:kCVPixelFormatType_32BGRA]};

    // create and configure video data output
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings = outputSettings;
    videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];

Above snippet inserted in to sample project:

- (void)viewDidLoad
{
    [super viewDidLoad];

    self.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;

    self.recordButton.layer.cornerRadius = self.stillButton.layer.cornerRadius = self.cameraButton.layer.cornerRadius = 4;
    self.recordButton.clipsToBounds = self.stillButton.clipsToBounds = self.cameraButton.clipsToBounds = YES;

    // Create the AVCaptureSession
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Set up preview
    [[self previewView] setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).

    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        NSError *error = nil;

        AVCaptureDevice *videoDevice = [AAPLCameraViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        [[self session] beginConfiguration];

        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];
            [self setVideoDevice:videoDeviceInput.device];

            dispatch_async(dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for our preview view and UIView can only be manipulated on main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.

                [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
            });
        }

        AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
            {
                [connection setEnablesVideoStabilizationWhenAvailable:YES];
            }
            [self setMovieFileOutput:movieFileOutput];
        }

        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ([session canAddOutput:stillImageOutput])
        {
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            [session addOutput:stillImageOutput];
            [self setStillImageOutput:stillImageOutput];
        }
        // CoreImage wants BGRA pixel format
        NSDictionary *outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInteger:kCVPixelFormatType_32BGRA]};

        // create and configure video data output
        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        videoDataOutput.videoSettings = outputSettings;
        videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];

        [[self session] commitConfiguration];

        dispatch_async(dispatch_get_main_queue(), ^{
            [self configureManualHUD];
        });
    });

    self.manualHUDFocusView.hidden = YES;
    self.manualHUDExposureView.hidden = YES;
    self.manualHUDWhiteBalanceView.hidden = YES;
}

I get 'error in __connection_block_invoke_2: Connection interrupted' and also...

-(void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

...the delegate never gets called.

All I am trying to do is modify their code to do live Core Image filters.
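
For reference, a minimal sketch of what such a live-filter delegate might look like follows (illustrative, not from the original post). It wraps each BGRA frame in a CIImage, runs it through a CISepiaTone filter chosen purely as an example, and renders the result back using a hypothetical CIContext ivar named _ciContext; CoreImage and CoreMedia are assumed to be imported.

// Illustrative sketch of a live Core Image filter in the sample buffer delegate.
// Assumes #import <CoreImage/CoreImage.h> and a CIContext ivar named _ciContext.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Wrap the BGRA pixel buffer in a CIImage without copying.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Apply a filter; any CIFilter chain could go here.
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:@0.8 forKey:kCIInputIntensityKey];

    // Render the filtered image back into the same pixel buffer
    // (or into your own render target).
    [_ciContext render:filter.outputImage toCVPixelBuffer:pixelBuffer];
}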

Recommended Answer

So I fixed the issue.

You can't use AVCaptureMovieFileOutput and AVCaptureVideoDataOutput simultaneously. I took out AVCaptureMovieFileOutput, and the AVCaptureVideoDataOutput delegate is called.
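
For completeness, a minimal sketch of the session setup after this fix might look like the following (illustrative, not the answerer's exact code): the AVCaptureMovieFileOutput block is dropped, and the data output is attached to the session inside the existing beginConfiguration/commitConfiguration pair, which is the step that actually creates the connection driving the delegate.

        // Sketch: video data output setup once AVCaptureMovieFileOutput is removed.
        // CoreImage prefers BGRA, so request that pixel format.
        NSDictionary *outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        videoDataOutput.videoSettings = outputSettings;
        videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];

        // Attaching the output to the session creates the connection that
        // delivers frames to captureOutput:didOutputSampleBuffer:fromConnection:.
        if ([session canAddOutput:videoDataOutput])
        {
            [session addOutput:videoDataOutput];
        }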

That concludes this article on the iOS error "error in __connection_block_invoke_2: Connection interrupted". Hopefully the recommended answer above helps.
