I'm using OpenGL ES to display BGR24 data on an iPad. I'm not familiar with OpenGL ES, so for the video display part I used code from RosyWriter, one of Apple's sample projects. It works, but the CVOpenGLESTextureCacheCreateTextureFromImage call takes more than 30 ms, whereas in RosyWriter its cost is negligible.
What I do is first convert BGR24 to the BGRA pixel format, then create a CVPixelBufferRef with CVPixelBufferCreateWithBytes, and finally obtain a CVOpenGLESTextureRef through CVOpenGLESTextureCacheCreateTextureFromImage. My code is as follows,

- (void)transformBGRToBGRA:(const UInt8 *)pict width:(int)width height:(int)height
{
    rgb.data = (void *)pict;

    // Expand 3-channel data to 4 channels. Since the source bytes are
    // actually B,G,R, the "ARGB" result holds A,B,G,R per pixel.
    vImage_Error error = vImageConvert_RGB888toARGB8888(&rgb, NULL, 0, &argb, NO, kvImageNoFlags);
    if (error != kvImageNoError) {
        NSLog(@"vImageConvert_RGB888toARGB8888 error");
    }

    // Move alpha to the last byte: A,B,G,R -> B,G,R,A (i.e. 32BGRA).
    const uint8_t permuteMap[4] = {1, 2, 3, 0};

    error = vImagePermuteChannels_ARGB8888(&argb, &bgra, permuteMap, kvImageNoFlags);
    if (error != kvImageNoError) {
        NSLog(@"vImagePermuteChannels_ARGB8888 error");
    }

    free((void *)pict);
}
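For clarity, here is what the two vImage calls above accomplish, written as a scalar C loop (a sketch only; the `bgr24_to_bgra32` name is mine, and in production the SIMD-accelerated vImage path should be preferred). Because the source is already in B,G,R byte order, the whole conversion amounts to inserting an alpha byte after every pixel:

```c
#include <stdint.h>
#include <stddef.h>

/* Conceptual equivalent of vImageConvert_RGB888toARGB8888 followed by
 * vImagePermuteChannels_ARGB8888 with map {1,2,3,0} on BGR input:
 * copy the three color bytes and append a constant alpha. */
static void bgr24_to_bgra32(const uint8_t *src, uint8_t *dst,
                            size_t pixelCount, uint8_t alpha)
{
    for (size_t i = 0; i < pixelCount; i++) {
        dst[4 * i + 0] = src[3 * i + 0]; /* B */
        dst[4 * i + 1] = src[3 * i + 1]; /* G */
        dst[4 * i + 2] = src[3 * i + 2]; /* R */
        dst[4 * i + 3] = alpha;          /* A */
    }
}
```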

After the conversion, a CVPixelBufferRef is created as follows,
[self transformBGRToBGRA:pict width:width height:height];

CVPixelBufferRef pixelBuffer;
CVReturn err = CVPixelBufferCreateWithBytes(NULL,
                                            width,
                                            height,
                                            kCVPixelFormatType_32BGRA,
                                            (void *)bgraData,
                                            bytesByRow,
                                            NULL,
                                            0,
                                            NULL,
                                            &pixelBuffer);

if(!pixelBuffer || err)
{
    NSLog(@"CVPixelBufferCreateWithBytes failed (error: %d)", err);
    return;
}

CVOpenGLESTextureRef texture = NULL;
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RGBA,
                                                   width,
                                                   height,
                                                   GL_BGRA,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &texture);


if (!texture || err) {
    NSLog(@"CVOpenGLESTextureCacheCreateTextureFromImage failed (error: %d)", err);
    CVPixelBufferRelease(pixelBuffer);
    return;
}

The rest of the code, including the shaders, is almost identical to the RosyWriter sample, so I'd like to know why this happens and how to fix it.

Best Answer

After studying this for a few days, I found out why CVOpenGLESTextureCacheCreateTextureFromImage takes so much time: when the data is large (3 MB here), the allocation, copy, and move operations inside it are quite costly, especially the copy. Switching to a pixel buffer pool improved CVOpenGLESTextureCacheCreateTextureFromImage from 30 ms down to 5 ms, the same level as glTexImage2D(). My solution is as follows:

NSMutableDictionary *attributes = [NSMutableDictionary dictionary];

[attributes setObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithInt:videoWidth] forKey:(NSString *)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithInt:videoHeight] forKey:(NSString *)kCVPixelBufferHeightKey];

// Create the pool once and reuse it; pooled buffers avoid the per-frame
// allocation that made CVPixelBufferCreateWithBytes so slow downstream.
CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (CFDictionaryRef)attributes, &bufferPool);

CVPixelBufferPoolCreatePixelBuffer(NULL, bufferPool, &pixelBuffer);

// Lock the buffer before touching its memory from the CPU.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

UInt8 *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);

memcpy(baseAddress, bgraData, bytesByRow * videoHeight);

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Using this newly created pixelBuffer, it runs much faster.
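One caveat worth noting (my addition, not part of the original answer): the single memcpy above assumes the pool buffer's bytes-per-row equals bytesByRow, but Core Video may pad rows for alignment, so CVPixelBufferGetBytesPerRow can be larger than width * 4. When the strides differ, a row-by-row copy is needed; the `copy_rows` helper below is a hypothetical sketch of that:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Copies a tightly packed frame into a destination whose rows may be
 * padded. Each iteration copies one row of srcBytesPerRow payload bytes
 * and leaves the destination's row padding untouched. */
static void copy_rows(uint8_t *dst, size_t dstBytesPerRow,
                      const uint8_t *src, size_t srcBytesPerRow,
                      size_t height)
{
    for (size_t y = 0; y < height; y++) {
        memcpy(dst + y * dstBytesPerRow,
               src + y * srcBytesPerRow,
               srcBytesPerRow);
    }
}
```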

Adding the following configuration to the attributes brings the performance to its best, under 1 ms:

NSDictionary *IOSurfaceProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithBool:YES], @"IOSurfaceOpenGLESFBOCompatibility",
                                        [NSNumber numberWithBool:YES], @"IOSurfaceOpenGLESTextureCompatibility",
                                        nil];

[attributes setObject:IOSurfaceProperties forKey:(NSString *)kCVPixelBufferIOSurfacePropertiesKey];
