My app is meant to record video and analyze the frames produced, under iOS 11.4, using Xcode 10.0 as the IDE. Recording video with AVCaptureMovieFileOutput works, but since the frames need to be analyzed I switched to AVAssetWriter and modeled the code after RosyWriter [https://github.com/WildDylan/appleSample/tree/master/RosyWriter]. The code is written in Objective-C.

I am stuck inside the captureOutput:didOutputSampleBuffer:fromConnection: delegate. After the first frame is captured, the AVAssetWriter and its inputs (video and audio) are configured with settings extracted from that frame. Once the user chooses to record, each captured sampleBuffer is analyzed and written. I tried AVAssetWriter startSessionAtSourceTime:, but something is clearly wrong with the CMTime that CMSampleBufferGetPresentationTimeStamp returns from the sample buffer, even though the sampleBuffer log appears to show a CMTime with valid values.
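For context, here is a minimal sketch of the kind of writer setup described above, assuming a hypothetical output URL outputURL and illustrative H.264 settings (neither is taken from the question):

// Sketch: configure an AVAssetWriter from the first frame's format
// description. outputURL and the compression settings are assumptions.
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(firstSampleBuffer);
CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(formatDescription);

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];

NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @(dims.width),
                                 AVVideoHeightKey : @(dims.height) };

AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings
                                     sourceFormatHint:formatDescription];
videoInput.expectsMediaDataInRealTime = YES;

if ([writer canAddInput:videoInput]) [writer addInput:videoInput];
[writer startWriting];   // the session itself starts later, from the first frame's PTS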

If I implement:
CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
[self->assetWriter startSessionAtSourceTime:sampleTime];
the resulting error is '*** -[AVAssetWriter startSessionAtSourceTime:] Invalid parameter not satisfying: CMTIME_IS_NUMERIC(startTime)'.
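One way to fail more gracefully while debugging is to test the timestamp with the same macro the assertion names before handing it to the writer; a small diagnostic sketch:

CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

// CMTIME_IS_NUMERIC requires the valid flag and rejects
// indefinite/infinite times - exactly what the assertion checks.
if (CMTIME_IS_NUMERIC(sampleTime)) {
    [self->assetWriter startSessionAtSourceTime:sampleTime];
    self->wroteFirstFrame = YES;
} else {
    NSLog(@"Dropping frame with non-numeric PTS: %@",
          (__bridge_transfer NSString *)CMTimeCopyDescription(kCFAllocatorDefault, sampleTime));
}

This does not fix the underlying problem, but it turns the crash into a log line that points at the bad frame.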

If I instead use [self->assetWriter startSessionAtSourceTime:kCMTimeZero], the warning 'warning: could not execute support code to read Objective-C class data in the process. This may reduce the quality of type information available.' is generated (an LLDB debugger message rather than an AVAssetWriter error).

When I log sampleTime, I read value = 0, timescale = 0, epoch = 0, and flags = 0. I also logged the sampleBuffer, shown below, followed by the relevant code:

SampleBuffer content:

2018-10-17 12:07:04.540816+0300 MyApp[10664:2111852] -[CameraCaptureManager captureOutput:didOutputSampleBuffer:fromConnection:] : sampleBuffer - CMSampleBuffer 0x100e388c0 retainCount: 1 allocator: 0x1c03a95e0
invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
buffer-level attachments:
    Orientation(P) = 1
    {Exif}    (P) = <CFBasicHash 0x28161ce80 [0x1c03a95e0]>{type = mutable dict, count = 24,
entries => .....A LOT OF CAMERA DATA HERE.....
}

    DPIWidth  (P) = 72
    {TIFF}    (P) = <CFBasicHash 0x28161c540 [0x1c03a95e0]>{type =    mutable dict, count = 7,
entries => .....MORE CAMERA DATA HERE.....
}

    DPIHeight (P) = 72
    {MakerApple}(P) = {
1 = 3;
10 = 0;
14 = 0;
3 =     {
    epoch = 0;
    flags = 1;
    timescale = 1000000000;
    value = 390750488472916;
};
4 = 0;
5 = 221;
6 = 211;
7 = 1;
8 =     (
    "-0.04894018",
    "-0.6889497",
    "-0.7034443"
);
9 = 0;
}
formatDescription = <CMVideoFormatDescription 0x280ddc780 [0x1c03a95e0]> {
mediaType:'vide'
mediaSubType:'BGRA'
mediaSpecific: {
    codecType: 'BGRA'       dimensions: 720 x 1280
}
extensions: {<CFBasicHash 0x28161f880 [0x1c03a95e0]>{type = immutable dict, count = 5,
entries =>
0 : <CFString 0x1c0917068 [0x1c03a95e0]>{contents = "CVImageBufferYCbCrMatrix"} = <CFString 0x1c09170a8 [0x1c03a95e0]>{contents = "ITU_R_601_4"}
1 : <CFString 0x1c09171c8 [0x1c03a95e0]>{contents = "CVImageBufferTransferFunction"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
2 : <CFString 0x1c093f348 [0x1c03a95e0]>{contents = "CVBytesPerRow"} = <CFNumber 0x81092876519e5903 [0x1c03a95e0]>{value = +2880, type = kCFNumberSInt32Type}
3 : <CFString 0x1c093f3c8 [0x1c03a95e0]>{contents = "Version"} = <CFNumber 0x81092876519eed23 [0x1c03a95e0]>{value = +2, type = kCFNumberSInt32Type}
5 : <CFString 0x1c0917148 [0x1c03a95e0]>{contents = "CVImageBufferColorPrimaries"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
}
}
}
sbufToTrackReadiness = 0x0
numSamples = 1
sampleTimingArray[1] = {
    {PTS = {390750488483992/1000000000 = 390750.488}, DTS = {INVALID}, duration = {INVALID}},
}
imageBuffer = 0x2832ad2c0

====================================================
//AVCaptureVideoDataOutput delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (connection == videoConnection)
    {
        if (self.outputVideoFormatDescription == NULL)
        {
            // First frame: configure the writer from its format description.
            self.outputVideoFormatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setupVideoRecorder];
        }
        else if (self.status == RecorderRecording)
        {
            NSLog(@"%s : self.outputVideoFormatDescription - %@", __FUNCTION__, self.outputVideoFormatDescription);

            [self.cmDelegate manager:self capturedFrameBuffer:sampleBuffer];
            NSLog(@"%s : sampleBuffer - %@", __FUNCTION__, sampleBuffer);

            // vidWriteQueue here is the same queue passed to
            // setSampleBufferDelegate:queue: - the root of the problem
            // (see the answer below).
            dispatch_async(vidWriteQueue, ^
            {
                if (!self->wroteFirstFrame)
                {
                    CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
                    NSLog(@"%s : sampleTime value - %lld, timescale - %i, epoch - %lli, flags - %u", __FUNCTION__, sampleTime.value, sampleTime.timescale, sampleTime.epoch, sampleTime.flags);

                    [self->assetWriter startSessionAtSourceTime:sampleTime];
                    self->wroteFirstFrame = YES;
                }
                if (self->videoAWInput.readyForMoreMediaData)
                {
                    BOOL appendSuccess = [self->videoAWInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"%s : appendSuccess - %i", __FUNCTION__, appendSuccess);

                    if (!appendSuccess) NSLog(@"%s : failed to append video buffer - %@", __FUNCTION__, self->assetWriter.error.localizedDescription);
                }
            });
        }
    }
    else if (connection == audioConnection)
    {
        // Audio samples are not handled yet.
    }
}

Best Answer

My bad... The problem was that I was spawning the frame handling onto the very queue already declared in AVCaptureVideoDataOutput setSampleBufferDelegate:queue:, recursively dispatching work onto the queue it was already running on. Posting the answer in case another idiot like me makes the same silly mistake.
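In other words, the delegate callback was already executing on vidWriteQueue, so the dispatch_async only deferred the work until after the callback returned; by then AVFoundation may have recycled the CMSampleBufferRef, which is why its timestamp read back as all zeros. A minimal sketch of the two usual remedies (a name like someOtherQueue is illustrative, not from the original code):

// Remedy 1: the delegate already runs on vidWriteQueue, so do the work inline.
if (!self->wroteFirstFrame)
{
    [self->assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    self->wroteFirstFrame = YES;
}
if (self->videoAWInput.readyForMoreMediaData)
{
    [self->videoAWInput appendSampleBuffer:sampleBuffer];
}

// Remedy 2: if the work really must hop queues, keep the buffer
// alive across the async boundary with CFRetain/CFRelease.
CFRetain(sampleBuffer);
dispatch_async(someOtherQueue, ^{
    // ... use sampleBuffer here ...
    CFRelease(sampleBuffer);
});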

Regarding 'ios - AVAssetWriter startSessionAtSourceTime not accepting CMTime values', a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/52855185/
