This article describes how to create a CMSampleBuffer from a CVPixelBuffer; it should be a useful reference for anyone facing the same problem.
Problem Description
I get a CVPixelBuffer from ARSessionDelegate:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    frame.capturedImage // CVPixelBufferRef
}
But another part of my app (that I can't change) uses a CMSampleBuffer.
A CMSampleBuffer is a container for a CVPixelBuffer.
In order to create a CMSampleBuffer I can use this function:
func CMSampleBufferCreateReadyWithImageBuffer(_ allocator: CFAllocator?,
                                              _ imageBuffer: CVImageBuffer,
                                              _ formatDescription: CMVideoFormatDescription,
                                              _ sampleTiming: UnsafePointer<CMSampleTimingInfo>,
                                              _ sBufOut: UnsafeMutablePointer<CMSampleBuffer?>) -> OSStatus
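Besides the timing, the formatDescription parameter can be derived straight from the pixel buffer. A minimal sketch, assuming the pixel buffer comes from frame.capturedImage and using the same older-style Swift signatures as the declaration above (on newer SDKs this call takes labeled arguments and a formatDescriptionOut parameter):

var formatDescription: CMVideoFormatDescription?
// Build a format description matching the pixel buffer's dimensions and pixel format.
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                             frame.capturedImage,
                                             &formatDescription)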
The only parameter I'm missing is sampleTiming. How can I extract that from the CVPixelBuffer?
Recommended Answer
The sampleTiming mainly contains the presentationTimeStamp; you can easily create it with the following code:
let scale = CMTimeScale(NSEC_PER_SEC)
let pts = CMTime(value: CMTimeValue(frame.timestamp * Double(scale)),
                 timescale: scale)
var timingInfo = CMSampleTimingInfo(duration: kCMTimeInvalid,
                                    presentationTimeStamp: pts,
                                    decodeTimeStamp: kCMTimeInvalid)
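Putting the pieces together, here is a minimal sketch of a helper that wraps frame.capturedImage in a CMSampleBuffer. The helper name and error handling are illustrative and not part of the original answer; it follows the older Swift signatures quoted above (on newer SDKs kCMTimeInvalid is CMTime.invalid and the create functions take labeled arguments):

import ARKit
import CoreMedia

// Illustrative helper: wrap an ARFrame's captured pixel buffer in a CMSampleBuffer.
func makeSampleBuffer(from frame: ARFrame) -> CMSampleBuffer? {
    let pixelBuffer = frame.capturedImage

    // 1. Describe the pixel buffer (dimensions, pixel format).
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                                 pixelBuffer,
                                                 &formatDescription)
    guard let format = formatDescription else { return nil }

    // 2. Timing: only the presentation timestamp is known for a captured frame.
    let scale = CMTimeScale(NSEC_PER_SEC)
    let pts = CMTime(value: CMTimeValue(frame.timestamp * Double(scale)),
                     timescale: scale)
    var timingInfo = CMSampleTimingInfo(duration: kCMTimeInvalid,
                                        presentationTimeStamp: pts,
                                        decodeTimeStamp: kCMTimeInvalid)

    // 3. Wrap the pixel buffer, format description, and timing into a sample buffer.
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                             pixelBuffer,
                                             format,
                                             &timingInfo,
                                             &sampleBuffer)
    return sampleBuffer
}

The returned buffer (or nil on failure) can then be handed to the part of the app that expects a CMSampleBuffer; in production code you would also check the OSStatus return values.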
This concludes the article on creating a CMSampleBuffer from a CVPixelBuffer; hopefully the recommended answer above is helpful.