I create an id<MTLTexture> object from a CVPixelBufferRef in the following way:

id<MTLTexture> CreateMTLTexture(CVMetalTextureCacheRef texture_cache,
                                CVPixelBufferRef pixel_buffer,
                                MTLPixelFormat metal_pixel_format, size_t plane,
                                int height, int width) {
  CVMetalTextureRef texture_ref;

  CVReturn err = CVMetalTextureCacheCreateTextureFromImage(
      kCFAllocatorDefault, texture_cache, pixel_buffer, NULL,
      metal_pixel_format, width, height, plane, &texture_ref);

  if (err != kCVReturnSuccess) {
    // throw error
    return nil;
  }

  id<MTLTexture> texture = CVMetalTextureGetTexture(texture_ref);

  //
  // Q: is it safe to do CVBufferRelease(texture_ref) here?
  //
  return texture;
}


When should the CVMetalTextureRef object be released?
Is it safe to release it right after obtaining the MTLTexture?

Best Answer

Yes, this is safe.
In the sample code from the Apple Documentation Archive, the CVMetalTextureRef is released right after the MTLTexture is retrieved from it, as shown below:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVReturn error;

    CVImageBufferRef sourceImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width = CVPixelBufferGetWidth(sourceImageBuffer);
    size_t height = CVPixelBufferGetHeight(sourceImageBuffer);

    CVMetalTextureRef textureRef;
    error = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, sourceImageBuffer, NULL, MTLPixelFormatBGRA8Unorm, width, height, 0, &textureRef);

    if (error)
    {
> ERROR: Couldnt create texture from image");">
        NSLog(@">> ERROR: Couldn't create texture from image");
        assert(0);
    }

    _videoTexture[_constantDataBufferIndex] = CVMetalTextureGetTexture(textureRef);
    if (!_videoTexture[_constantDataBufferIndex]) {
        NSLog(@">> ERROR: Couldn't get texture from texture ref");
        assert(0);
    }

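    // The MTLTexture has already been retrieved and stored above, so the
    // CVMetalTextureRef can be released here.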
    CVBufferRelease(textureRef);
}


You can also look at the WebRTC source code in the Google Chromium repository, which releases the CVMetalTextureRef in the same way.

- (BOOL)setupTexturesForFrame:(nonnull RTCVideoFrame *)frame {
  RTC_DCHECK([frame.buffer isKindOfClass:[RTCCVPixelBuffer class]]);
  if (![super setupTexturesForFrame:frame]) {
    return NO;
  }
  CVPixelBufferRef pixelBuffer = ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer;
  id<MTLTexture> lumaTexture = nil;
  id<MTLTexture> chromaTexture = nil;
  CVMetalTextureRef outTexture = nullptr;
  // Luma (y) texture.
  int lumaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
  int lumaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
  int indexPlane = 0;
  CVReturn result = CVMetalTextureCacheCreateTextureFromImage(
      kCFAllocatorDefault, _textureCache, pixelBuffer, nil, MTLPixelFormatR8Unorm, lumaWidth,
      lumaHeight, indexPlane, &outTexture);
  if (result == kCVReturnSuccess) {
    lumaTexture = CVMetalTextureGetTexture(outTexture);
  }
  // Same as CFRelease except it can be passed NULL without crashing.
  CVBufferRelease(outTexture);
  outTexture = nullptr;
  // Chroma (CrCb) texture.
  indexPlane = 1;
  result = CVMetalTextureCacheCreateTextureFromImage(
      kCFAllocatorDefault, _textureCache, pixelBuffer, nil, MTLPixelFormatRG8Unorm, lumaWidth / 2,
      lumaHeight / 2, indexPlane, &outTexture);
  if (result == kCVReturnSuccess) {
    chromaTexture = CVMetalTextureGetTexture(outTexture);
  }
  CVBufferRelease(outTexture);
  if (lumaTexture != nil && chromaTexture != nil) {
    _yTexture = lumaTexture;
    _CrCbTexture = chromaTexture;
    return YES;
  }
  return NO;
}

The original question, "objective-c++ - When should CVMetalTextureRef be released?", can be found on Stack Overflow: https://stackoverflow.com/questions/53770361/
