This article covers converting a CMSampleBufferRef to a bitmap, and should be a useful reference for anyone facing the same problem.

Problem Description

I'm playing around with the AVScreenShack example from Apple's website (Xcode project) which captures the desktop and displays the capture in a window in quasi real-time.

I have modified the project a little bit and inserted this delegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    ...
}

My question is : How do I convert the CMSampleBufferRef instance to CGImageRef ?

Thanks.

Recommended Answer

Here is how you can create a UIImage from a CMSampleBufferRef. This worked for me when responding to captureStillImageAsynchronouslyFromConnection:completionHandler: on AVCaptureStillImageOutput.

// CMSampleBufferRef imageDataSampleBuffer;
// Grab the raw bytes backing the sample buffer (JPEG data from the still image output).
CMBlockBufferRef buff = CMSampleBufferGetDataBuffer(imageDataSampleBuffer);
size_t len = CMBlockBufferGetDataLength(buff);
char * data = NULL;
CMBlockBufferGetDataPointer(buff, 0, NULL, &len, &data);

// Wrap the bytes in NSData and let UIImage decode them.
NSData * d = [[NSData alloc] initWithBytes:data length:len];
UIImage * img = [[UIImage alloc] initWithData:d];

It looks like the data coming out of CMBlockBufferGetDataPointer is JPEG data.
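
If the end goal is a CGImageRef rather than a UIImage, one option (a sketch, not part of the original answer) is to hand that same JPEG NSData to ImageIO and decode a CGImageRef from it directly. The d variable below is the NSData built in the snippet above, and the code assumes ARC:

#import <ImageIO/ImageIO.h>

// Build an image source from the JPEG bytes and decode the first image as a CGImageRef.
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)d, NULL);
CGImageRef cgImage = NULL;
if (source) {
    cgImage = CGImageSourceCreateImageAtIndex(source, 0, NULL);
    CFRelease(source);
}
// ... use cgImage ...
CGImageRelease(cgImage); // created with a Create function, so the caller owns it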

UPDATE: To fully answer your question, you can read the CGImage property of img in the code above to actually get a CGImageRef.
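
In other words, once you have the UIImage, the bridge to Core Graphics is just its CGImage property:

CGImageRef cgImage = img.CGImage; // owned by img; CGImageRetain() it if it must outlive the UIImage

For the captureOutput:didOutputSampleBuffer:fromConnection: callback shown in the question, the sample buffer delivered by a video data output typically wraps a CVPixelBuffer rather than JPEG data, so the UIImage route above does not apply directly. A rough sketch of going from that pixel buffer to a CGImageRef, assuming the output is configured for kCVPixelFormatType_32BGRA, could look like this:

#import <CoreVideo/CoreVideo.h>
#import <CoreGraphics/CoreGraphics.h>

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

// Describe the BGRA pixel data to Core Graphics and snapshot it as a CGImage.
void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(base, width, height, 8, bytesPerRow, colorSpace,
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef cgImageFromPixels = CGBitmapContextCreateImage(context);

CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
// ... use cgImageFromPixels, then CGImageRelease(cgImageFromPixels) when done ...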

That concludes this article on converting a CMSampleBufferRef to a bitmap; hopefully the answer above is helpful.
