How to get the pixel color at a point from a scaled UIImage in a UIImageView

This article explains how to get the pixel color at a point from a scaled UIImage in a UIImageView. The question and recommended answer below may be a useful reference if you are facing the same problem.

Problem description

I'm currently using this technique to get the color of a pixel in a UIImage (on iOS):

- (UIColor*) getPixelColorAtLocation:(CGPoint)point {
    UIColor* color = nil;
    CGImageRef inImage = self.image.CGImage;
    // Create an off-screen bitmap context to draw the image into.
    // Format ARGB is 4 bytes per pixel: Alpha, Red, Green, Blue.
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0,0},{w,h}};

    // Draw the image into the bitmap context. Once we draw, the memory
    // allocated for the context for rendering will contain the
    // raw image data in the specified color space.
    CGContextDrawImage(cgctx, rect, inImage);

    // Now we can get a pointer to the image data associated with the bitmap
    // context.
    unsigned char* data = CGBitmapContextGetData(cgctx);
    if (data != NULL) {
        // offset locates the pixel in the data from (x, y):
        // 4 bytes of data per pixel, w is the width of one row of data.
        int offset = 4*((w*round(point.y))+round(point.x));
        int alpha =  data[offset];
        int red   = data[offset+1];
        int green = data[offset+2];
        int blue  = data[offset+3];
        NSLog(@"offset: %i colors: RGB A %i %i %i  %i",offset,red,green,blue,alpha);
        color = [UIColor colorWithRed:(red/255.0f) green:(green/255.0f) blue:(blue/255.0f) alpha:(alpha/255.0f)];
    }

    // When finished, release the context.
    CGContextRelease(cgctx);
    // Free the image data memory for the context.
    if (data) { free(data); }
    return color;
}

As shown here:

http://www.markj.net/iphone-uiimage-pixel-color/

It works quite well, but it fails when the image is larger than the UIImageView. I tried adding an image and changing the content mode so it scales to fit the view. How would I modify the code so that it can still sample the pixel color from a scaled image?
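
In essence, what has to change is the input point: before sampling, a point in the image view's coordinate space has to be mapped back to a coordinate in the image itself. A minimal sketch of that mapping in Swift, assuming the view uses the scaleAspectFit content mode; the helper name imagePixelPoint(from:in:) is illustrative and not from the original post:

import UIKit

// Illustrative helper: converts a point in the image view's coordinate space
// to a coordinate in the displayed UIImage, assuming .scaleAspectFit.
func imagePixelPoint(from viewPoint: CGPoint, in imageView: UIImageView) -> CGPoint? {
    guard let image = imageView.image else { return nil }

    let imageSize = image.size
    let viewSize = imageView.bounds.size

    // .scaleAspectFit scales the image by the smaller of the two ratios
    // and centers it, leaving empty bars on one axis.
    let scale = min(viewSize.width / imageSize.width,
                    viewSize.height / imageSize.height)
    let scaledSize = CGSize(width: imageSize.width * scale,
                            height: imageSize.height * scale)
    let origin = CGPoint(x: (viewSize.width - scaledSize.width) / 2,
                         y: (viewSize.height - scaledSize.height) / 2)

    // Undo the centering offset and the scale to get back to image coordinates.
    // (For images with scale > 1, multiply by image.scale for pixel coordinates.)
    let x = (viewPoint.x - origin.x) / scale
    let y = (viewPoint.y - origin.y) / scale

    // Reject points that fall in the empty bars outside the drawn image.
    guard x >= 0, y >= 0, x < imageSize.width, y < imageSize.height else { return nil }
    return CGPoint(x: x, y: y)
}

The same idea applies to other content modes; only the scale and origin calculation changes.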

Recommended answer

Try this in Swift 3:

func getPixelColor(image: UIImage, x: Int, y: Int, width: CGFloat) -> UIColor
{
    // Raw RGBA bytes backing the image's CGImage.
    let pixelData = image.cgImage!.dataProvider!.data
    let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

    // 4 bytes per pixel; width is the width of one row of image data.
    let pixelInfo: Int = ((Int(width) * y) + x) * 4

    let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
    let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
    let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
    let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

    return UIColor(red: r, green: g, blue: b, alpha: a)
}
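
To use this with the scaled UIImageView from the question, the touched point still has to be converted from view coordinates to image coordinates first, for example with the aspect-fit mapping sketched earlier. A rough usage sketch, assuming a view controller that owns an imageView property and an image whose scale is 1 (so points map 1:1 to pixels); the touch handling is illustrative, not from the original answer:

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first,
          let image = imageView.image else { return }

    // Point in the image view's coordinate space.
    let viewPoint = touch.location(in: imageView)

    // Map it into the image's own coordinate space (aspect-fit sketch above).
    if let imagePoint = imagePixelPoint(from: viewPoint, in: imageView) {
        // width is the length of one row of image data; assumes image.scale == 1.
        let color = getPixelColor(image: image,
                                  x: Int(imagePoint.x),
                                  y: Int(imagePoint.y),
                                  width: image.size.width)
        print("Sampled color: \(color)")
    }
}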

That concludes this article on how to get the pixel color at a point from a scaled UIImage in a UIImageView. We hope the recommended answer is helpful, and thank you for your support!
