Question
I'm trying to use Cocoa to grab images from a webcam. I'm able to get the image in RGBA format using QTKit and the didOutputVideoFrame delegate call, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
I know my camera grabs natively in YUV; what I want is to get the YUV data directly from the CVImageBuffer and process the YUV frame before displaying it.
My question is: How can I get the YUV data from the CVImageBuffer?
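As background for the "process the YUV frame" step: once the raw bytes are in hand, each YCbCr sample can be converted to RGB with the standard full-range BT.601 formula. This is a minimal sketch of that math only (not part of the original post, and not tied to any Apple API):

```python
def ycbcr_to_rgb(y, cb, cr):
    """Convert one full-range BT.601 YCbCr sample to 8-bit RGB."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))  # keep results in 0..255
    return clamp(r), clamp(g), clamp(b)

print(ycbcr_to_rgb(128, 128, 128))  # neutral chroma, mid luma → (128, 128, 128)
```

With neutral chroma (Cb = Cr = 128) the output is a pure gray equal to the luma value, which is a handy sanity check when debugging a conversion.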
Answer
I asked what I thought was a different question, but it turned out to have the same answer as this one: raw data from CVImageBuffer without rendering? (http://stackoverflow.com/questions/5626703/raw-data-from-cvimagebuffer-without-rendering/5630737#5630737)
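In short, the approach from the linked answer is to request a YUV pixel format such as kCVPixelFormatType_422YpCbCr8 (FourCC '2vuy') for the capture output, then lock the buffer with CVPixelBufferLockBaseAddress and read the raw bytes via CVPixelBufferGetBaseAddress. The '2vuy' layout interleaves bytes as Cb, Y0, Cr, Y1 for each pair of horizontally adjacent pixels. A pure-Python sketch of unpacking that layout, using hypothetical in-memory data rather than real CoreVideo calls:

```python
def unpack_2vuy(data):
    """Unpack '2vuy' bytes (Cb Y0 Cr Y1 per macropixel) into (Y, Cb, Cr) tuples."""
    pixels = []
    for i in range(0, len(data) - 3, 4):
        cb, y0, cr, y1 = data[i:i + 4]
        pixels.append((y0, cb, cr))  # both pixels in the pair share the same chroma
        pixels.append((y1, cb, cr))
    return pixels

# One macropixel: neutral chroma, a dark pixel then a bright one.
frame = bytes([128, 16, 128, 235])
print(unpack_2vuy(frame))  # → [(16, 128, 128), (235, 128, 128)]
```

In real code you would also honor CVPixelBufferGetBytesPerRow, since each row of the buffer may be padded beyond width × 2 bytes.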