I'm currently trying to draw an image with OpenGL using the bi-planar YUV420 format. I receive raw data and try to wrap it in a CVPixelBuffer, then pass that buffer to CVOpenGLESTextureCacheCreateTextureFromImage. I don't get any errors when creating the CVPixelBuffer, but I get error -6683 when passing it to CVOpenGLESTextureCacheCreateTextureFromImage. I'm following Apple's GLCameraRipple sample code as closely as I can; the difference is that I'm using raw image data rather than data from the camera.
Hopefully someone can explain what I'm missing here. I suspect it's a missing attribute...
FYI, plane 0 is the Y plane and plane 1 is the UV plane; the UV plane should be half the width and height of the Y plane.
size_t numPlanes = image->GetNumPlanes();
size_t planeWidth[numPlanes];
size_t planeHeight[numPlanes];
size_t scanWidth[numPlanes];
void *planeIndex[numPlanes];

// Plane 0 (Y) uses the full image dimensions; plane 1 (UV) is half the width and height.
for (size_t i = 0; i < numPlanes; i++) {
    planeWidth[i]  = (i < 1) ? image->GetWidth()  : image->GetWidth()  / 2;
    planeHeight[i] = (i < 1) ? image->GetHeight() : image->GetHeight() / 2;
    scanWidth[i]  = image->GetScanWidth(i);
    planeIndex[i] = image->GetPlanePointer(i);
}
CVPixelBufferRef pixelBuffer;
CFDictionaryRef empty;
CFMutableDictionaryRef attrs;

empty = CFDictionaryCreate(kCFAllocatorDefault,
                           NULL,
                           NULL,
                           0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                  1,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

CVReturn cvError = CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault,
                                                      image->GetWidth(),
                                                      image->GetHeight(),
                                                      kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                                      nil,
                                                      nil,
                                                      numPlanes,
                                                      planeIndex,
                                                      planeWidth,
                                                      planeHeight,
                                                      scanWidth,
                                                      nil, nil, attrs, &pixelBuffer);
if (cvError) NSLog(@"Error at CVPixelBufferCreateWithPlanarBytes: %d", cvError);
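A quick way to see whether the buffer actually came out IOSurface-backed is to log its description; the "iosurface=" field in the output (as in the accepted answer below) shows whether a backing surface exists. A minimal check, assuming ARC:

// Print the CVPixelBuffer description; "iosurface=0x0" means there is no IOSurface backing.
if (pixelBuffer) NSLog(@"%@", (__bridge id)pixelBuffer);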
CVReturn err;
size_t width  = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);

if (!_videoTextureCache)
{
    NSLog(@"No video texture cache");
    return;
}

if (_bModel == nil ||
    width != _textureWidth ||
    height != _textureHeight)
{
    _textureWidth = width;
    _textureHeight = height;

    _bModel = [[BufferModel alloc] initWithScreenWidth:_screenWidth
                                          screenHeight:_screenHeight
                                            meshFactor:_meshFactor
                                          textureWidth:_textureWidth
                                         textureHeight:_textureHeight];
    [self setupBuffers];
}

[self cleanUpTextures];

// CVOpenGLESTextureCacheCreateTextureFromImage will create GLES texture
// optimally from CVImageBufferRef.

// Y-plane
glActiveTexture(GL_TEXTURE0);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RED_EXT,
                                                   _textureWidth,
                                                   _textureHeight,
                                                   GL_RED_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &_lumaTexture);
if (err)
{
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}
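For reference, in the GLCameraRipple pattern the UV plane is bound the same way on the next texture unit, as a two-channel GL_RG_EXT texture at half the luma dimensions with plane index 1. A minimal sketch of that second call, using a hypothetical _chromaTexture member that mirrors _lumaTexture (not shown in the question):

// UV-plane (half width/height, two channels, plane index 1 = CbCr)
glActiveTexture(GL_TEXTURE1);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RG_EXT,
                                                   _textureWidth / 2,
                                                   _textureHeight / 2,
                                                   GL_RG_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   1,                // plane index of the CbCr plane
                                                   &_chromaTexture); // hypothetical member, mirroring _lumaTexture
if (err)
{
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}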
Thanks to anyone who can help.
I know there is a similar question (not exactly the same), but it's quite old and never got any replies. I'm hoping for a little more luck with mine.
Best answer
The iosurface property of the CVPixelBuffer you create is null.
Created manually: <CVPixelBuffer 0x1fd52790 width=1280 height=720 pixelFormat=420v iosurface=0x0 planes=2>
Created by CMSampleBufferGetImageBuffer: <CVPixelBuffer 0x1fd521e0 width=1280 height=720 pixelFormat=420f iosurface=0x21621c54 planes=2>
As far as I know, there is no solution.
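(Error -6683 is kCVReturnPixelBufferNotOpenGLCompatible.) A workaround often suggested for this situation, though not verified in this thread, is to let Core Video allocate the buffer itself with CVPixelBufferCreate plus the kCVPixelBufferIOSurfacePropertiesKey attribute and then copy the plane data in, because CVPixelBufferCreateWithPlanarBytes only wraps caller-owned memory and does not give the buffer an IOSurface backing. A minimal sketch, assuming the same image object and attrs dictionary from the question:

// Hypothetical sketch: create an IOSurface-backed buffer and copy the Y and CbCr planes into it.
CVPixelBufferRef surfaceBacked = NULL;
CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                      image->GetWidth(),
                                      image->GetHeight(),
                                      kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                      attrs,          // contains kCVPixelBufferIOSurfacePropertiesKey
                                      &surfaceBacked);
if (result == kCVReturnSuccess)
{
    CVPixelBufferLockBaseAddress(surfaceBacked, 0);
    for (size_t i = 0; i < 2; i++)
    {
        uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(surfaceBacked, i);
        uint8_t *src = (uint8_t *)image->GetPlanePointer(i);
        size_t dstStride = CVPixelBufferGetBytesPerRowOfPlane(surfaceBacked, i);
        size_t srcStride = image->GetScanWidth(i);
        size_t rows      = CVPixelBufferGetHeightOfPlane(surfaceBacked, i);
        // Copy row by row because the source and destination strides may differ.
        for (size_t row = 0; row < rows; row++)
        {
            memcpy(dst + row * dstStride, src + row * srcStride, MIN(srcStride, dstStride));
        }
    }
    CVPixelBufferUnlockBaseAddress(surfaceBacked, 0);
}

Whether this actually clears the -6683 error from the texture cache was not confirmed here.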
Regarding "ios - CVOpenGLESTextureCacheCreateTextureFromImage returns error 6683", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/12646273/