I already asked a question here that has not gotten any responses:
How do I record changes on a CIImage to a video using AVAssetWriter?

But maybe my question needs to be simpler. My Google searches have come up empty. How do I capture video of a changing CIImage in real time without using the camera?

Using captureOutput I get a CMSampleBuffer, which can be turned into a CVPixelBuffer. The AVAssetWriterInput's mediaType is set to video, but I think it expects compressed video. Also, I'm not clear on whether the AVAssetWriterInput should have its expectsMediaDataInRealTime property set to true.
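
For reference, an AVAssetWriterInput created with video output settings (H.264 here) does the compression itself, so it can be fed uncompressed CVPixelBuffers through an AVAssetWriterInputPixelBufferAdaptor, and expectsMediaDataInRealTime = true is the usual choice for a live, non-file source. A minimal setup sketch, with the output URL and the 1280x720 dimensions as placeholders:

import AVFoundation
import CoreVideo

// Sketch only: outputURL and the 1280x720 dimensions are placeholders.
func makeWriter(outputURL: URL) throws -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    // The writer compresses the frames itself; we append uncompressed pixel buffers.
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1280,
        AVVideoHeightKey: 720
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true   // frames arrive live, not from a file

    // The adaptor is what lets us append CVPixelBuffers with explicit presentation times.
    let attributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: 1280,
        kCVPixelBufferHeightKey as String: 720
    ]
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: attributes)
    writer.add(input)
    return (writer, input, adaptor)
}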

It seems like this should be simple, but everything I try ends with the AVAssetWriter's status being failed.

Here is my latest attempt to make this work. It still fails:

@objc func importLivePreview(){

    guard var importedImage = importedDryCIImage else { return }

    DispatchQueue.main.async(){

        // apply filter to camera image
        // this is what makes the CIImage appear to change
        importedImage = self.applyFilterAndReturnImage(ciImage: importedImage, orientation: UIImage.Orientation.right, currentCameraRes:currentCameraRes!)


        if self.videoIsRecording &&
           self.assetWriterPixelBufferInput?.assetWriterInput.isReadyForMoreMediaData == true {

            guard let writer: AVAssetWriter = self.assetWriter, writer.status == .writing else {
                return
            }

            guard let cv:CVPixelBuffer = self.buffer(from: importedImage) else {
                print("CVPixelBuffer could not be created.")
                return
            }

            // render the filtered CIImage into the pixel buffer
            self.MTLContext?.render(importedImage, to: cv)

            self.currentSampleTime = CMTimeMakeWithSeconds(0.1, preferredTimescale: 1000000000)

            guard let currentSampleTime = self.currentSampleTime else {
                return
            }

            let success = self.assetWriterPixelBufferInput?.append(cv, withPresentationTime: currentSampleTime)

            if success == false {
                print("Pixel Buffer input failed")
            }

        }

        guard let MTLView = self.MTLCaptureView else {
            print("MTLCaptureView is not found or nil.")
            return
        }

        // update the MTKView with the changed CIImage so the user can see the changed image
        MTLView.image = importedImage


    }

}
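
For what it's worth, the append path above can only succeed once the writer has been started and a session opened at a source time, and the writer has to be finished before the file is usable. A rough sketch of that lifecycle, assuming a writer and input configured as in the code above:

import AVFoundation

// Sketch of the surrounding writer lifecycle; assumes writer/input are set up elsewhere.
func startRecording(writer: AVAssetWriter, at startTime: CMTime) {
    guard writer.startWriting() else {
        print("startWriting failed: \(String(describing: writer.error))")
        return
    }
    // Every presentation time later passed to append() must be >= this source time.
    writer.startSession(atSourceTime: startTime)
}

func stopRecording(writer: AVAssetWriter, input: AVAssetWriterInput) {
    input.markAsFinished()
    writer.finishWriting {
        if writer.status == .failed {
            print("finishWriting failed: \(String(describing: writer.error))")
        }
    }
}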

Best Answer

I figured it out. The problem was that I wasn't offsetting currentSampleTime. This example doesn't use an accurate offset, but it shows that an offset needs to be added to the last sample time.

@objc func importLivePreview(){

    guard var importedImage = importedDryCIImage else { return }

    DispatchQueue.main.async(){

        // apply filter to camera image
        // this is what makes the CIImage appear to change
        importedImage = self.applyFilterAndReturnImage(ciImage: importedImage, orientation: UIImage.Orientation.right, currentCameraRes:currentCameraRes!)


        if self.videoIsRecording &&
           self.assetWriterPixelBufferInput?.assetWriterInput.isReadyForMoreMediaData == true {

            guard let writer: AVAssetWriter = self.assetWriter, writer.status == .writing else {
                return
            }

            guard let cv:CVPixelBuffer = self.buffer(from: importedImage) else {
                print("CVPixelBuffer could not be created.")
                return
            }

            // render the filtered CIImage into the pixel buffer
            self.MTLContext?.render(importedImage, to: cv)

            guard let currentSampleTime = self.currentSampleTime else {
                return
            }

            // offset currentSampleTime
            let sampleTimeOffset = CMTimeMakeWithSeconds(0.1, preferredTimescale: 1000000000)

            self.currentSampleTime = CMTimeAdd(currentSampleTime, sampleTimeOffset)

            print("currentSampleTime = \(String(describing: currentSampleTime))")

            let success = self.assetWriterPixelBufferInput?.append(cv, withPresentationTime: currentSampleTime)

            if success == false {
                print("Pixel Buffer input failed")
            }

        }

        guard let MTLView = self.MTLCaptureView else {
            print("MTLCaptureView is not found or nil.")
            return
        }

        // update the MTKView with the changed CIImage so the user can see the changed image
        MTLView.image = importedImage


    }

}
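
If a more accurate offset is wanted than the fixed 0.1 s used above, one option is to derive each presentation time from a host clock relative to the moment recording started. A small sketch, where recordingStartTime is a hypothetical value captured with CACurrentMediaTime() when recording begins:

import CoreMedia
import QuartzCore

// Hypothetical helper: timestamp each frame by the wall-clock time elapsed since
// recording started, instead of stepping by a fixed 0.1 s per frame.
func presentationTime(since recordingStartTime: CFTimeInterval) -> CMTime {
    let elapsed = CACurrentMediaTime() - recordingStartTime
    return CMTimeMakeWithSeconds(elapsed, preferredTimescale: 1_000_000_000)
}

That way each appended frame carries a timestamp matching when it was actually produced, so playback speed matches the live preview.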

Regarding "ios - How do I capture video of a changing CIImage without using the camera?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/58739385/
