I have a CVPixelBufferRef coming from an AVAsset. I'm trying to apply a CIFilter to it. I use these lines:

CVPixelBufferRef pixelBuffer = ...;          // source frame from the AVAsset
CVPixelBufferRef newPixelBuffer = ...;       // empty pixel buffer to fill
CIContext *context = ...;                    // CIContext created from an EAGLContext
CGAffineTransform preferredTransform = ...;  // the AVAsset track's preferredTransform
CIImage *phase1 = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIImage *phase2 = [phase1 imageByApplyingTransform:preferredTransform];
CIImage *phase3 = [self applyFiltersToImage:phase2];

[context render:phase3 toCVPixelBuffer:newPixelBuffer bounds:phase3.extent colorSpace:CGColorSpaceCreateDeviceRGB()];

Unfortunately, the result I get has the wrong orientation. For example, a video shot in portrait mode comes out upside down. I suspect the problem is the conversion from the AVAsset coordinate system to the Core Image coordinate system (previewing phase2 in Xcode also shows the incorrect result). How can I fix this?

Best Answer

I solved it by doing the following; it should orient everything correctly into the coordinate space:

// Negate the rotation component (b and c) of the track's preferredTransform
// so it maps correctly into Core Image's bottom-left-origin coordinate space.
var preferredTransform = inst.preferredTransform
preferredTransform.b *= -1
preferredTransform.c *= -1

var outputImage = CIImage(cvPixelBuffer: videoFrameBuffer)
    .applying(preferredTransform)

// Translate the result so its extent starts at the origin (0, 0).
outputImage = outputImage.applying(
    CGAffineTransform(translationX: -outputImage.extent.origin.x,
                      y: -outputImage.extent.origin.y))
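For reference, here is a minimal Swift sketch that folds this fix into the rendering pipeline from the question. It is an illustration under stated assumptions, not the poster's exact code: the `renderCorrectedFrame` function, its parameter names, and the `applyFilters` closure are hypothetical, and it uses `transformed(by:)`, the Swift 4+ spelling of `applying(_:)`.

import AVFoundation
import CoreImage

/// Renders one video frame into `destinationBuffer`, correcting the track's
/// orientation before applying the filter chain.
func renderCorrectedFrame(from sourceBuffer: CVPixelBuffer,
                          into destinationBuffer: CVPixelBuffer,
                          track: AVAssetTrack,
                          context: CIContext,
                          applyFilters: (CIImage) -> CIImage) {
    // Negate the rotation component of preferredTransform so it works in
    // Core Image's bottom-left-origin coordinate space.
    var transform = track.preferredTransform
    transform.b *= -1
    transform.c *= -1

    var image = CIImage(cvPixelBuffer: sourceBuffer)
        .transformed(by: transform)

    // Move the transformed image back so its extent starts at (0, 0).
    image = image.transformed(by: CGAffineTransform(translationX: -image.extent.origin.x,
                                                    y: -image.extent.origin.y))

    let filtered = applyFilters(image)

    context.render(filtered,
                   to: destinationBuffer,
                   bounds: filtered.extent,
                   colorSpace: CGColorSpaceCreateDeviceRGB())
}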
