This is my first attempt at using Core Image (on OS X 10.7.3), and I've hit a brick wall. I'm sure it's something silly I'm doing, and I just need someone more familiar with the framework to point it out.

Consider the following code (let's stipulate that imageURL is a valid file URL pointing to a JPG on disk):

CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL];
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues:
                                                  kCIInputImageKey, inputImage,
                                                  kCIInputExtentKey, [inputImage valueForKey:@"extent"],
                                                  nil];
CIImage *outputImage = (CIImage *)[filter valueForKey:@"outputImage"];

When I run this code, the last line throws an exception with this backtrace:
0   CoreFoundation                      0x00007fff96c2efc6 __exceptionPreprocess + 198
1   libobjc.A.dylib                     0x00007fff9153cd5e objc_exception_throw + 43
2   CoreFoundation                      0x00007fff96cbb2ae -[NSObject doesNotRecognizeSelector:] + 190
3   CoreFoundation                      0x00007fff96c1be73 ___forwarding___ + 371
4   CoreFoundation                      0x00007fff96c1bc88 _CF_forwarding_prep_0 + 232
5   CoreImage                           0x00007fff8f03c38d -[CIAreaAverage outputImage] + 52
6   Foundation                          0x00007fff991d8384 _NSGetUsingKeyValueGetter + 62
7   Foundation                          0x00007fff991d8339 -[NSObject(NSKeyValueCoding) valueForKey:] + 392

Now, the Core Image Filter Reference clearly states that CIAreaAverage "returns a single-pixel image that contains the average color for the region of interest." In fact, what makes it even more confusing is that when I inspect the filter's attributes in the debugger (before attempting the valueForKey: call):
(lldb) po [filter attributes]
(id) $3 = 0x00007fb3e3ef0e00 {
    CIAttributeDescription = "Calculates the average color for the specified area in an image, returning the result in a pixel.";
    CIAttributeFilterCategories =     (
        CICategoryReduction,
        CICategoryVideo,
        CICategoryStillImage,
        CICategoryBuiltIn
    );
    CIAttributeFilterDisplayName = "Area Average";
    CIAttributeFilterName = CIAreaAverage;
    CIAttributeReferenceDocumentation = "http://developer.apple.com/cgi-bin/apple_ref.cgi?apple_ref=//apple_ref/doc/filter/ci/CIAreaAverage";
    inputExtent =     {
        CIAttributeClass = CIVector;
        CIAttributeDefault = "[0 0 640 80]";
        CIAttributeDescription = "A rectangle that specifies the subregion of the image that you want to process.";
        CIAttributeDisplayName = Extent;
        CIAttributeType = CIAttributeTypeRectangle;
        CIUIParameterSet = CIUISetBasic;
    };
    inputImage =     {
        CIAttributeClass = CIImage;
        CIAttributeDescription = "The image to process.";
        CIAttributeDisplayName = Image;
        CIUIParameterSet = CIUISetBasic;
    };
    outputImage =     {
        CIAttributeClass = CIImage;
    };
}

There it is, outputImage, listed with type CIImage!

So what am I doing wrong? All the documentation and tutorials I've seen indicate that -valueForKey: is the correct way to access attributes, including outputImage.

Best Answer

I believe your extent is the culprit (as odd as that may seem). When I change the extent to a CIVector *, it works:

NSURL *imageURL = [NSURL fileURLWithPath:@"/Users/david/Desktop/video.png"];
CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL];
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"];
[filter setValue:inputImage forKey:kCIInputImageKey];
CGRect inputExtent = [inputImage extent];
CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x
                                       Y:inputExtent.origin.y
                                       Z:inputExtent.size.width
                                       W:inputExtent.size.height];
[filter setValue:extent forKey:kCIInputExtentKey];
CIImage *outputImage = [filter valueForKey:@"outputImage"];

[inputImage extent] returns a CGRect, so [inputImage valueForKey:@"extent"] hands the filter a rect boxed in an NSValue rather than the CIVector it expects, which is why -outputImage later blows up with an unrecognized selector. Building the CIVector explicitly from the rect's components fixes it.
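Once the filter produces its single-pixel output, you still have to read the color back out. One way (a sketch, not part of the original answer) is to render the output image into a 4-byte RGBA buffer with CIContext; this assumes the single output pixel sits at the origin of outputImage's extent:

// Render CIAreaAverage's 1x1 output into a tiny bitmap to read the color.
unsigned char pixel[4] = {0};
CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
CIContext *context = [CIContext contextWithOptions:nil];
[context render:outputImage
       toBitmap:pixel
       rowBytes:4
         bounds:CGRectMake(0.0, 0.0, 1.0, 1.0)
         format:kCIFormatRGBA8
     colorSpace:space];
CGColorSpaceRelease(space);
NSLog(@"average color: R=%u G=%u B=%u A=%u",
      pixel[0], pixel[1], pixel[2], pixel[3]);

If the output extent turns out not to be at the origin, pass [outputImage extent] as the bounds instead.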

Regarding objective-c - "unrecognized selector" when attempting to access a CIFilter's outputImage, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/9558265/
