This article describes how to create a CMSampleBuffer from a CVPixelBuffer, which should be a useful reference for anyone facing the same problem.

Problem Description

I get a CVPixelBuffer from ARSessionDelegate:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    frame.capturedImage // CVPixelBufferRef
}

But another part of my app (that I can't change) uses a CMSampleBuffer.

CMSampleBuffer is a container of CVPixelBuffer.
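
As a quick illustration of that container relationship, the wrapped pixel buffer can be read back out with CMSampleBufferGetImageBuffer (the name sampleBuffer below is just a placeholder):

// Illustration only: pull the wrapped CVPixelBuffer back out of a CMSampleBuffer
if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
    // imageBuffer is the CVImageBuffer/CVPixelBuffer carried by the sample buffer
}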

In order to create a CMSampleBuffer I can use this function:

func CMSampleBufferCreateReadyWithImageBuffer(_ allocator: CFAllocator?,
                                            _ imageBuffer: CVImageBuffer,
                                            _ formatDescription: CMVideoFormatDescription,
                                            _ sampleTiming: UnsafePointer<CMSampleTimingInfo>,
                                            _ sBufOut: UnsafeMutablePointer<CMSampleBuffer?>) -> OSStatus

The only missing parameter for me is sampleTiming - how can I extract that from CVPixelBuffer?

Recommended Answer

The sampleTiming mainly needs to contain the presentationTimeStamp; you can easily create it with the following code:

// Use a nanosecond timescale so frame.timestamp (seconds, as a Double) converts precisely.
let scale = CMTimeScale(NSEC_PER_SEC)
let pts = CMTime(value: CMTimeValue(frame.timestamp * Double(scale)),
                 timescale: scale)
// Only the presentation timestamp matters here; duration and decode timestamp can stay invalid.
var timingInfo = CMSampleTimingInfo(duration: kCMTimeInvalid,
                                    presentationTimeStamp: pts,
                                    decodeTimeStamp: kCMTimeInvalid)
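
Putting the pieces together, here is a minimal sketch of the full conversion, assuming the same Swift 4.x-era API as the snippets above (where kCMTimeInvalid is still available); the helper name createSampleBuffer is just for illustration. It builds the missing CMVideoFormatDescription with CMVideoFormatDescriptionCreateForImageBuffer and then hands everything to CMSampleBufferCreateReadyWithImageBuffer:

import ARKit
import CoreMedia

// Sketch: wrap frame.capturedImage (a CVPixelBuffer) in a CMSampleBuffer.
func createSampleBuffer(from frame: ARFrame) -> CMSampleBuffer? {
    let pixelBuffer = frame.capturedImage

    // Describe the pixel buffer so the sample buffer knows its format.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                                 pixelBuffer,
                                                 &formatDescription)
    guard let format = formatDescription else { return nil }

    // Timing info exactly as in the answer above.
    let scale = CMTimeScale(NSEC_PER_SEC)
    let pts = CMTime(value: CMTimeValue(frame.timestamp * Double(scale)),
                     timescale: scale)
    var timingInfo = CMSampleTimingInfo(duration: kCMTimeInvalid,
                                        presentationTimeStamp: pts,
                                        decodeTimeStamp: kCMTimeInvalid)

    // Wrap the pixel buffer into a ready-to-use sample buffer.
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                             pixelBuffer,
                                             format,
                                             &timingInfo,
                                             &sampleBuffer)
    return sampleBuffer
}

On newer Swift versions the same calls take named arguments (for example CMSampleBufferCreateReadyWithImageBuffer(allocator:imageBuffer:formatDescription:sampleTiming:sampleBufferOut:)) and kCMTimeInvalid becomes CMTime.invalid, but the structure of the code stays the same.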

That concludes this article on creating a CMSampleBuffer from a CVPixelBuffer; hopefully the recommended answer above is helpful.
