I'm trying to create a video composition in iOS that combines applying a CIFilter and a Core Animation layer in a single pass. Both operations work on their own, but combining them in one pass doesn't seem to work.

When using AVMutableVideoComposition(asset:applyingCIFiltersWithHandler:), the animationTool parameter appears to be ignored. Has anyone else run into this? I've seen suggestions to draw any extra CA layers inside the AVMutableVideoComposition filter callback instead, but my CALayers contain animations, so I can't see how that would work reliably.

Here is the code I'm using:

        // Build a composition containing just the selected time range of the first video track
        let clipVideoTrack = asset.tracks(withMediaType: AVMediaTypeVideo)[0]
        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let videoRange = CMTimeRangeMake(startTime ?? kCMTimeZero, CMTimeSubtract(stopTime ?? asset.duration, startTime ?? kCMTimeZero))
        try compositionVideoTrack.insertTimeRange(videoRange, of: clipVideoTrack, at: kCMTimeZero)
        let parentLayer = CALayer()
        let videoLayer = CALayer()
        let overlayLayer = CALayer()

        // Scale the render size to a square of the target dimension
        let targetDimension: CGFloat = 900.0
        let videoWidthDivisor = clipVideoTrack.naturalSize.width / targetDimension
        let actualDimension = clipVideoTrack.naturalSize.width / videoWidthDivisor
        let targetVideoSize = CGSize(width: actualDimension, height: actualDimension)

        parentLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)
        videoLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)
        overlayLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)

        parentLayer.addSublayer(videoLayer)

        for annotation in mediaAnnotationContainerView.mediaAnnotationViews
        {
            let renderableLayer = annotation.renderableCALayer(targetSize: targetVideoSize)
            parentLayer.addSublayer(renderableLayer)
        }


        // Apply a sepia tone filter to every frame via the CIFilter handler
        let filter = CIFilter(name: "CISepiaTone")!
        filter.setDefaults()
        let videoComp = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler:
        { request in
            let source = request.sourceImage.clampingToExtent()
            filter.setValue(source, forKey: kCIInputImageKey)
            let output = filter.outputImage!.cropping(to: request.sourceImage.extent)
            request.finish(with: output, context: nil)
        })

        videoComp.renderSize = targetVideoSize

        videoComp.frameDuration = CMTimeMake(1, 30)
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

        let url = AVAsset.tempMovieUrl

        let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
        exporter?.outputURL = url
        exporter?.outputFileType = AVFileTypeMPEG4
        exporter?.shouldOptimizeForNetworkUse = true
        exporter?.videoComposition = videoComp

        exporter?.exportAsynchronously
        {
            print( "Export completed" )
        }

It appears that videoComp.instructions[0] is an instance of the private AVCoreImageFilterVideoCompositionInstruction class. Replacing it raises an exception, and appending additional instructions causes the export to complete without actually doing anything.

It may be that this simply isn't possible and I actually have to process the video in two passes (one for the CIFilter and another for the CALayers). But writing a temporary output file and then processing it again in a second pass feels wrong.
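For reference, a minimal sketch of what that two-pass fallback could look like, using the same Swift 3-era API as the code above. The names `filterHandler` and `finalURL` are assumptions standing in for the filter closure and output URL from the question; error handling is omitted.

```swift
// Pass 1: bake the CIFilter into an intermediate file.
let filterPassURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("filterPass.mp4")
let filterExporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
filterExporter.outputURL = filterPassURL
filterExporter.outputFileType = AVFileTypeMPEG4
filterExporter.videoComposition = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler: filterHandler)
filterExporter.exportAsynchronously
{
    // Pass 2: re-export the filtered file, this time with the Core Animation tool.
    let filteredAsset = AVAsset(url: filterPassURL)
    let layerComp = AVMutableVideoComposition(propertiesOf: filteredAsset)
    layerComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let layerExporter = AVAssetExportSession(asset: filteredAsset, presetName: AVAssetExportPresetHighestQuality)!
    layerExporter.outputURL = finalURL
    layerExporter.outputFileType = AVFileTypeMPEG4
    layerExporter.videoComposition = layerComp
    layerExporter.exportAsynchronously
    {
        print("Two-pass export completed")
    }
}
```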

Does anyone know how to make this work in a single pass?

Thanks,

Ray

Best Answer

1. Are you running the code in the Simulator? The Simulator cannot render animated layers into the exported video (a layer's static background works, but its animations don't).

2. If you create the AVVideoCompositionInstruction yourself, make sure to set enablePostProcessing to true.
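A minimal sketch of point 2, building the instruction by hand instead of relying on the private instruction that applyingCIFiltersWithHandler: generates. It assumes the `mixComposition`, `compositionVideoTrack`, `targetVideoSize`, `videoLayer`, and `parentLayer` from the question's code; the CIFilter would then need to be applied by other means (e.g. a custom compositor or a second pass), since this replaces the handler-based composition.

```swift
// Build the video composition manually so enablePostProcessing can be set.
let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
instruction.enablePostProcessing = true  // required for animationTool to be applied

let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionVideoTrack)
instruction.layerInstructions = [layerInstruction]

let videoComp = AVMutableVideoComposition()
videoComp.renderSize = targetVideoSize
videoComp.frameDuration = CMTimeMake(1, 30)
videoComp.instructions = [instruction]
videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
```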

Regarding "ios - Using AVMutableVideoComposition with a CIFilter ignores the AVVideoCompositionCoreAnimationTool animationTool parameter", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/44375157/
