When I apply a Gaussian blur, the edges still bleed outside the image.

I know the problem is the image size: the blurred result becomes larger than the source image. But I don't know how to keep the original image size.

Here is my code:

func applyBlurEffect(image: UIImage) -> UIImage {
    let imageToBlur = CIImage(image: image)
    let blurfilter = CIFilter(name: "CIGaussianBlur")
    blurfilter!.setValue(imageToBlur, forKey: "inputImage")
    let resultImage = blurfilter!.value(forKey: "outputImage") as! CIImage
    let croppedImage: CIImage = resultImage.cropping(to: CGRect(x: 0, y: 0, width: imageToBlur!.extent.size.width, height: imageToBlur!.extent.size.height))
    let context = CIContext(options: nil)
    let blurredImage = UIImage(cgImage: context.createCGImage(croppedImage, from: croppedImage.extent)!)
    return blurredImage
}

I want the output to use the size of the image passed into the func. Does anyone know how to do this?

Best Answer

1.
I don't know if this fits your needs, but you could try UIBlurEffect, the way Apple does it in Notification Center (dark, light or extra light):

func applyBlurEffectTo(imageView: UIImageView) {
    guard !UIAccessibilityIsReduceTransparencyEnabled() else {
        return
    }

    imageView.backgroundColor = .clear

    let blurEffect = UIBlurEffect(style: .light)
    let blurEffectView = UIVisualEffectView(effect: blurEffect)
    blurEffectView.frame = imageView.bounds
    blurEffectView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

    imageView.addSubview(blurEffectView)
}
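
A minimal usage sketch (assuming applyBlurEffectTo(imageView:) from above is in scope, e.g. as a method on the view controller or a free function; the photoImageView outlet and BlurViewController name are just placeholders):

class BlurViewController: UIViewController {

    @IBOutlet weak var photoImageView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Overlay the live UIVisualEffectView blur on top of the image view.
        applyBlurEffectTo(imageView: photoImageView)
    }

    // To undo the effect later, remove the effect views that were added as subviews.
    func removeBlur(from imageView: UIImageView) {
        for case let effectView as UIVisualEffectView in imageView.subviews {
            effectView.removeFromSuperview()
        }
    }
}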

(I think .light is too bright and the effect's .dark option is too dark :) you will probably want to remove the tint.)

2.
Or use this UIImage extension (a bit heavyweight, but it works 100%, and it is partly how the first option is implemented under the hood):
import UIKit
import Accelerate

class ViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()

        imageView.image = imageView.image?.applyBlurWithRadius(5)
    }
}

extension UIImage {
    func applyBlurWithRadius(_ blurRadius: CGFloat) -> UIImage? {
        if (size.width < 1 || size.height < 1) { return nil }
        guard let cgImage = self.cgImage else {  return nil }

        let epsilon = CGFloat(Float.ulpOfOne) // FLT_EPSILON is deprecated in newer Swift
        let screenScale = UIScreen.main.scale
        let imageRect = CGRect(origin: CGPoint.zero, size: size)
        var effectImage = self

        let hasBlur = blurRadius > epsilon

        if hasBlur {
            UIGraphicsBeginImageContextWithOptions(size, false, screenScale)
            guard let effectInContext = UIGraphicsGetCurrentContext() else { return  nil }

            effectInContext.scaleBy(x: 1.0, y: -1.0)
            effectInContext.translateBy(x: 0, y: -size.height)
            effectInContext.draw(cgImage, in: imageRect)

            var effectInBuffer = createEffectBuffer(effectInContext)

            UIGraphicsBeginImageContextWithOptions(size, false, screenScale)

            guard let effectOutContext = UIGraphicsGetCurrentContext() else { return  nil }
            var effectOutBuffer = createEffectBuffer(effectOutContext)

            if hasBlur {
                let inputRadius = blurRadius * screenScale
                let d = floor(inputRadius * 3.0 * CGFloat(sqrt(2 * M_PI) / 4 + 0.5))
                var radius = UInt32(d)
                if radius % 2 != 1 {
                    radius += 1
                }

                // Three successive box convolutions approximate a true Gaussian blur,
                // ping-ponging the result between the "in" and "out" buffers.
                let imageEdgeExtendFlags = vImage_Flags(kvImageEdgeExtend)
                vImageBoxConvolve_ARGB8888(&effectInBuffer, &effectOutBuffer, nil, 0, 0, radius, radius, nil, imageEdgeExtendFlags)
                vImageBoxConvolve_ARGB8888(&effectOutBuffer, &effectInBuffer, nil, 0, 0, radius, radius, nil, imageEdgeExtendFlags)
                vImageBoxConvolve_ARGB8888(&effectInBuffer, &effectOutBuffer, nil, 0, 0, radius, radius, nil, imageEdgeExtendFlags)
            }

            effectImage = UIGraphicsGetImageFromCurrentImageContext()!
            UIGraphicsEndImageContext()
            // Balance the first UIGraphicsBeginImageContextWithOptions call above,
            // so the "effect in" context does not stay on the UIKit context stack.
            UIGraphicsEndImageContext()
        }

        UIGraphicsBeginImageContextWithOptions(size, false, screenScale)

        guard let outputContext = UIGraphicsGetCurrentContext() else { return nil }

        outputContext.scaleBy(x: 1.0, y: -1.0)
        outputContext.translateBy(x: 0, y: -size.height)
        outputContext.draw(cgImage, in: imageRect)

        if hasBlur {
            outputContext.saveGState()
            outputContext.draw(effectImage.cgImage!, in: imageRect)
            outputContext.restoreGState()
        }

        let outputImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        return outputImage
    }

    private func createEffectBuffer(_ context: CGContext) -> vImage_Buffer {
        let data = context.data
        let width = vImagePixelCount(context.width)
        let height = vImagePixelCount(context.height)
        let rowBytes = context.bytesPerRow
        return vImage_Buffer(data: data, height: height, width: width, rowBytes: rowBytes)
    }
}

Result of the second option:
[screenshot of the blurred result]
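
Finally, if you want to keep using CIGaussianBlur directly, a common pattern is to clamp the input image to its extent before blurring and then crop the result back to the input's extent, so the output keeps the original size. A minimal sketch (not part of the original answer; it assumes iOS 10+ and recent Swift API names such as clampedToExtent() and cropped(to:), which older Swift spells slightly differently, e.g. cropping(to:)):

import UIKit
import CoreImage

// A sketch, not from the original answer; names and availability are assumptions.
func applyBlurKeepingSize(to image: UIImage, radius: Double = 8) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    // Extend the edge pixels outward so the blur has data to sample beyond the
    // original bounds; otherwise the edges fade and the extent grows.
    let clamped = input.clampedToExtent()

    guard let blur = CIFilter(name: "CIGaussianBlur") else { return nil }
    blur.setValue(clamped, forKey: kCIInputImageKey)
    blur.setValue(radius, forKey: kCIInputRadiusKey)
    guard let blurred = blur.outputImage else { return nil }

    // Crop the (now infinite) blurred image back to the original extent.
    let cropped = blurred.cropped(to: input.extent)

    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(cropped, from: cropped.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}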

Regarding "ios - I don't want the CIGaussianBlur of CIFilter to extend the image", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/43026345/
