Deep copy of CMSampleBufferRef

I am trying to perform a deep copy of a CMSampleBufferRef for my audio and video connections. I need to hold on to this buffer for delayed processing. Can someone point me to sample code?

Thanks

Best answer

I solved this problem.

I needed access to the sample data for a long period of time.

I tried several approaches:

- CVPixelBufferRetain -----> the program crashes
- CVPixelBufferPool -----> the program crashes
- CVPixelBufferCreateWithBytes ----> it works, but the performance hit is significant and Apple advises against it
- CMSampleBufferCreateCopy ---> it works, and Apple recommends it

From the documentation: In order to maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped. If your application is causing samples to be dropped by retaining the provided CMSampleBuffer objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then calling CFRelease on the sample buffer (if it was previously retained) so that the memory it references can be reused.

REF: https://developer.apple.com/reference/avfoundation/avcapturefileoutputdelegate/1390096-captureoutput
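
If CMSampleBufferCreateCopy is not enough and you really do need the pixel data itself duplicated (the "copying the data into a new buffer" the documentation above suggests), here is a minimal sketch. It assumes a single-plane pixel format such as BGRA; planar YUV formats would need a per-plane copy. ACDeepCopyPixelBuffer is a hypothetical helper name, not part of the original answer:

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>

static CVPixelBufferRef ACDeepCopyPixelBuffer(CVPixelBufferRef source) {
    CVPixelBufferLockBaseAddress(source, kCVPixelBufferLock_ReadOnly);

    CVPixelBufferRef copy = NULL;
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          CVPixelBufferGetWidth(source),
                                          CVPixelBufferGetHeight(source),
                                          CVPixelBufferGetPixelFormatType(source),
                                          NULL,
                                          &copy);
    if (kCVReturnSuccess == result) {
        CVPixelBufferLockBaseAddress(copy, 0);
        /* Copy row by row: source and copy may use different bytes-per-row padding. */
        size_t height         = CVPixelBufferGetHeight(source);
        size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(source);
        size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(copy);
        uint8_t *src = CVPixelBufferGetBaseAddress(source);
        uint8_t *dst = CVPixelBufferGetBaseAddress(copy);
        for (size_t row = 0; row < height; row++) {
            memcpy(dst + row * dstBytesPerRow,
                   src + row * srcBytesPerRow,
                   MIN(srcBytesPerRow, dstBytesPerRow));
        }
        CVPixelBufferUnlockBaseAddress(copy, 0);
    }

    CVPixelBufferUnlockBaseAddress(source, kCVPixelBufferLock_ReadOnly);
    return copy; /* caller owns the result and must call CVPixelBufferRelease */
}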

That may be just what you need:

pragma mark - captureOutput

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
    if (connection == m_videoConnection) {
        /* If the previous frame was never read, release it here; otherwise
           holding on to it will cause the capture input to drop samples. */
        if (m_sampleBuffer) {
            CFRelease(m_sampleBuffer);
            m_sampleBuffer = nil;
        }

        OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &m_sampleBuffer);
        if (noErr != status) {
            m_sampleBuffer = nil;
        }
        NSLog(@"m_sampleBuffer = %p sampleBuffer = %p", m_sampleBuffer, sampleBuffer);
    }
}
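
For completeness, here is a minimal sketch of the state the methods in this answer rely on. The class name and the queue label are my assumptions; the original answer does not show these declarations:

/* Hypothetical declarations; only m_sampleBuffer, m_videoConnection and
   m_readVideoData actually appear in the answer's code. */
@interface ACVideoCapture () {
    CMSampleBufferRef    m_sampleBuffer;    /* latest copied frame, owned (+1) by this class */
    AVCaptureConnection *m_videoConnection; /* the video connection of the data output */
    dispatch_queue_t     m_readVideoData;   /* serial queue guarding m_sampleBuffer */
}
@end

/* e.g. in init:
   m_readVideoData = dispatch_queue_create("com.example.readVideoData", DISPATCH_QUEUE_SERIAL); */

Note that readVideoFrame: below touches m_sampleBuffer only from m_readVideoData via dispatch_sync; delivering the capture callbacks on that same serial queue (see the setup sketch at the end) keeps every access to m_sampleBuffer serialized.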

pragma mark - get a CVPixelBufferRef that can be used for a long time
- (ACResult)readVideoFrame:(CVPixelBufferRef *)pixelBuffer{
    while (1) {
        dispatch_sync(m_readVideoData, ^{
            if (!m_sampleBuffer) {
                _readDataSuccess = NO;
                return;
            }

            CMSampleBufferRef sampleBufferCopy = nil;
            OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, m_sampleBuffer, &sampleBufferCopy);
            if (noErr == status)
            {
                CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBufferCopy);

                /* Retain the pixel buffer for the caller before releasing the
                   sample buffer that owns it; the caller balances this retain
                   with CVPixelBufferRelease. (The original code leaked
                   sampleBufferCopy instead.) */
                CVPixelBufferRetain(buffer);
                *pixelBuffer = buffer;
                CFRelease(sampleBufferCopy);

                _readDataSuccess = YES;

                NSLog(@"m_sampleBuffer = %p ", m_sampleBuffer);
            }
            else {
                _readDataSuccess = NO;
            }

            /* Release the stored frame either way so the memory pool the
               capture input uses can be reused. */
            CFRelease(m_sampleBuffer);
            m_sampleBuffer = nil;
        });

        if (_readDataSuccess) {
            _readDataSuccess = NO;
            return ACResultNoErr;
        }
        else {
            usleep(15 * 1000); /* wait ~15 ms for the next frame */
            continue;
        }
    }
}

Then you can use it like this:
- (void)getCaptureVideoDataToEncode{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(){
        while (1) {
            CVPixelBufferRef buffer = NULL;
            ACResult result = [videoCapture readVideoFrame:&buffer];
            if (ACResultNoErr == result) {
                ACResult error = [videoEncode encoder:buffer outputPacket:&streamPacket];
                if (buffer) {
                    CVPixelBufferRelease(buffer);
                    buffer = NULL;
                }
                if (ACResultNoErr == error) {
                    NSLog(@"encode success");
                }
            }
        }
    });
}
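
For context, this is roughly how the capture side might be wired up so that the captureOutput:didOutputSampleBuffer:fromConnection: callback above fires. The session name and the BGRA format choice are my assumptions; the original answer does not show this setup:

/* Hypothetical setup sketch (requires AVFoundation). Delivering frames on the
   same serial queue that readVideoFrame: synchronizes on keeps all access to
   m_sampleBuffer on one queue. */
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
videoOutput.alwaysDiscardsLateVideoFrames = YES;
[videoOutput setSampleBufferDelegate:self queue:m_readVideoData];

if ([m_captureSession canAddOutput:videoOutput]) { /* m_captureSession: an assumed AVCaptureSession */
    [m_captureSession addOutput:videoOutput];
}
m_videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];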
