I'm trying to encode frames from the iPhone camera into H.264 video using ffmpeg's libav* libraries. I found Apple's article on how to convert a CMSampleBuffer to a UIImage, but how do I convert it into ffmpeg's AVPicture?

Thanks.

Best answer

Answering my own question:

#import <CoreMedia/CoreMedia.h>
#include <libavcodec/avcodec.h>

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// access the data
int width  = (int)CVPixelBufferGetWidth(pixelBuffer);
int height = (int)CVPixelBufferGetHeight(pixelBuffer);
unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

// Do something with the raw pixels here
// ...

// Fill in the AVFrame while the buffer is still locked: avpicture_fill() does
// not copy the pixels, it only points pFrame->data at rawPixelBase, so the
// buffer must stay locked (or the data must be copied out) for as long as
// pFrame is used. Note that avpicture_fill() assumes tightly packed rows; if
// CVPixelBufferGetBytesPerRow() reports row padding, copy row by row instead.
AVFrame *pFrame = avcodec_alloc_frame();
avpicture_fill((AVPicture *)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height);

// ... hand pFrame to the encoder ...

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Now pFrame is filled with the contents of the sample buffer, which uses the pixel format kCVPixelFormatType_32BGRA (on a little-endian device PIX_FMT_RGB32 resolves to that same BGRA byte order, which is why the two formats match).

This solved my problem. Thanks.
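
A note on the encoding step the question mentions: libx264 and the other H.264 encoders exposed through libavcodec expect planar YUV 4:2:0 input rather than packed BGRA, so the filled frame normally has to go through libswscale before it reaches the encoder. The sketch below is not part of the original answer; it is a minimal illustration written against the same deprecated-era API the answer uses (avpicture_*, PIX_FMT_*), and the helper name convertToYUV420P is made up for the example.

#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

// Minimal sketch: convert the packed-BGRA pFrame produced above into a
// planar YUV420P frame, which is what an H.264 AVCodecContext opened with
// pix_fmt = PIX_FMT_YUV420P expects.
static AVFrame *convertToYUV420P(AVFrame *pFrame, int width, int height)
{
    static struct SwsContext *swsCtx = NULL;
    if (!swsCtx) {
        swsCtx = sws_getContext(width, height, PIX_FMT_RGB32,   // source: what avpicture_fill() used
                                width, height, PIX_FMT_YUV420P, // destination: what libx264 wants
                                SWS_BILINEAR, NULL, NULL, NULL);
    }

    AVFrame *yuvFrame = avcodec_alloc_frame();
    avpicture_alloc((AVPicture *)yuvFrame, PIX_FMT_YUV420P, width, height);

    // avpicture_fill() already set pFrame->data / pFrame->linesize, so they
    // can be passed straight to sws_scale().
    sws_scale(swsCtx,
              (const uint8_t * const *)pFrame->data, pFrame->linesize,
              0, height,
              yuvFrame->data, yuvFrame->linesize);

    return yuvFrame; // release later with avpicture_free((AVPicture *)yuvFrame) and av_free(yuvFrame)
}

The returned frame (with its pts set) is what gets passed to the encoder, e.g. avcodec_encode_video2() in FFmpeg of that era, or avcodec_send_frame()/avcodec_receive_packet() in current releases. On current FFmpeg the avpicture_*/PIX_FMT_* names above are gone; the equivalent calls are av_frame_alloc(), av_image_alloc() and the AV_PIX_FMT_* constants.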

Regarding "iphone - How to convert CMSampleBuffer/UIImage into ffmpeg's AVPicture?", see the corresponding question on Stack Overflow: https://stackoverflow.com/questions/4499160/
