AVFoundation Image to Video

This article covers turning a single image into a video with AVFoundation on iOS 5; the question and accepted answer below should be a useful reference for anyone hitting the same problem.

Problem description

I'm trying to create a video from a single image and save it to my photo library. I've been googling for ages and cannot find a solution.

I have this code:

    @autoreleasepool {
        NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/movie2.mp4"]];

        UIImage *img = [UIImage imageWithData:[[[self imageDataArrya] objectAtIndex:0] imageData]];
        [self writeImageAsMovie:img toPath:path size:CGSizeMake(640, 960) duration:10];

        UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
    }

I call the above method on a background thread. This is the code for 'writeImageAsMovie':

    - (void)writeImageAsMovie:(UIImage *)image toPath:(NSString *)path size:(CGSize)size duration:(int)duration {
        NSError *error = nil;
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                                fileType:AVFileTypeQuickTimeMovie
                                                                   error:&error];

        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                       nil];
        [self setInput:[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:videoSettings]];

        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                         assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                         sourcePixelBufferAttributes:nil];

        [videoWriter addInput:input];

        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage];
        [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(duration - 1, 2)];

        [input markAsFinished];
        [videoWriter endSessionAtSourceTime:CMTimeMake(duration, 2)];
        [videoWriter finishWriting];
    }

The utility method for converting a UIImage to a CVPixelBufferRef:

    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;

        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                              self.view.frame.size.width,
                                              self.view.frame.size.height,
                                              kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef)options,
                                              &pxbuffer);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, self.view.frame.size.width,
                                                     self.view.frame.size.height, 8, 4 * self.view.frame.size.width,
                                                     rgbColorSpace, kCGImageAlphaNoneSkipFirst);
        CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                               CGImageGetHeight(image)), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        return pxbuffer;
    }

Now if I run the code in the Simulator, it gives me an error saying the data is corrupt.

If I run it on my device, it saves a 2-second video to my photo library, but it's only green; my image isn't in there.

Any help will be appreciated :)

Accepted answer

I totally got this working - sorry I didn't see your reply before today. This is what I used:

Create a temporary file:

    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/flipimator-tempfile.mp4"]];

    // Overwrite the temp file if it already exists.
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:path])
        [fileManager removeItemAtPath:path error:NULL];

Call the export-images method to write the images to the temp file:

    [self exportImages:frames
         asVideoToPath:path
         withFrameSize:imageSize
       framesPerSecond:fps];

Save the temporary file to the photo album:

    UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);

    - (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
        NSLog(@"Finished saving video with error: %@", error);
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Done"
                                                        message:@"Movie successfully exported."
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
    }

The code for the exportImages method:

        - (void)exportImages:(NSArray *)imageArray
           asVideoToPath:(NSString *)path
           withFrameSize:(CGSize)imageSize
         framesPerSecond:(NSUInteger)fps {
        NSLog(@"Start building video from defined frames.");

        NSError *error = nil;

        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                      [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                                  error:&error];
        NSParameterAssert(videoWriter);

        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:imageSize.width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:imageSize.height], AVVideoHeightKey,
                                       nil];

        AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                                assetWriterInputWithMediaType:AVMediaTypeVideo
                                                outputSettings:videoSettings];


        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                         assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                         sourcePixelBufferAttributes:nil];

        NSParameterAssert(videoWriterInput);
        NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
        videoWriterInput.expectsMediaDataInRealTime = YES;
        [videoWriter addInput:videoWriterInput];

        //Start a session:
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        CVPixelBufferRef buffer = NULL;

        //convert uiimage to CGImage.
        int frameCount = 0;

        for(UIImage * img in imageArray) {
            buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:imageSize];

            BOOL append_ok = NO;
            int j = 0;
            while (!append_ok && j < 30) {
                if (adaptor.assetWriterInput.readyForMoreMediaData)  {
                    //print out status::
                    NSString *border = @"**************************************************";
                    NSLog(@"\n%@\nProcessing video frame (%d,%d).\n%@",border,frameCount,[imageArray count],border);

                    CMTime frameTime = CMTimeMake(frameCount,(int32_t) fps);
                    append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                    if(!append_ok){
                        NSError *error = videoWriter.error;
                        if(error!=nil) {
                            NSLog(@"Unresolved error %@,%@.", error, [error userInfo]);
                        }
                    }

                }
                else {
                    printf("adaptor not ready %d, %d\n", frameCount, j);
                    [NSThread sleepForTimeInterval:0.1];
                }
                j++;
            }
            if (!append_ok) {
                printf("error appending image %d times %d, with error.\n", frameCount, j);
            }
            CVPixelBufferRelease(buffer);   // release the buffer created for this frame so it doesn't leak
            frameCount++;
        }

        //Finish the session:
        [videoWriterInput markAsFinished];
        [videoWriter finishWriting];
        NSLog(@"Write Ended");

    }
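
The answer doesn't include the pixelBufferFromCGImage:andSize: helper that the frame loop calls. Below is a minimal sketch of what it might look like; it is an assumption rather than code from the original answer, adapted from the question's helper but drawing into a buffer of the requested size instead of self.view.frame.size (a buffer sized differently from the writer's output settings is a common cause of green or corrupt frames):

    // Sketch of the missing helper (assumed implementation, based on the question's version).
    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                              size.width, size.height,
                                              kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef)options,
                                              &pxbuffer);
        if (status != kCVReturnSuccess || pxbuffer == NULL) {
            return NULL;   // caller should check for NULL
        }

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        // Use the buffer's own bytes-per-row so the context layout matches the pixel buffer.
        CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pxbuffer),
                                                     rgbColorSpace, kCGImageAlphaNoneSkipFirst);
        // Scale the image to fill the requested frame size.
        CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
        return pxbuffer;
    }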

Parameters of the method:

  • imageArray : NSArray of UIImage.
  • path : Temporary path to write to while you process (the temp path defined above).
  • imageSize : The size of the video in pixels (width and height).
  • fps : How many images should be displayed per second in the video.
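
For reference, a hypothetical setup for the exportImages call shown earlier might look like the following; the frame image names, size, and frame rate here are placeholders, not values from the original answer:

    // Illustrative values only - build the array from your own images.
    NSArray *frames = [NSArray arrayWithObjects:
                       [UIImage imageNamed:@"frame1.png"],
                       [UIImage imageNamed:@"frame2.png"],
                       [UIImage imageNamed:@"frame3.png"],
                       nil];
    CGSize imageSize = CGSizeMake(640, 960);   // output video dimensions in pixels
    NSUInteger fps = 10;                       // ten images shown per second

    [self exportImages:frames
         asVideoToPath:path          // the temp path created above
         withFrameSize:imageSize
       framesPerSecond:fps];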

Hope it helps! Sorry about the formatting - I'm still very new to StackOverflow.com.

This is where I used the code: http://www.youtube.com/watch?v=DDckJyF2bnA

That concludes this article on converting an image to video with AVFoundation on iOS 5; hopefully the answer above is helpful.
