Problem description
I'm trying to convert a whole video into a sequence of images at a rate of 60 fps, which means 60 images generated per second of video.
To do so, I'm making use of AVAssetImageGenerator and the generateCGImagesAsynchronouslyForTimes: method.
Things go quite well, except that I'm having serious performance issues regarding the batch-processing execution time (approximately 5 minutes for a 13-second video).
Moreover, above the following size, CGSizeMake(512, 324), I experience crashes.
Has anyone had experience with this kind of processing, and do you know how to reduce the execution time, as well as how to extract the images at a higher resolution?
Below is the code I'm testing:
NSURL *movieURL = [NSURL fileURLWithPath:getCaptureMoviePath()];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:movieURL options:nil];

AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = TRUE;
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;

NSMutableArray *thumbTimes = [NSMutableArray arrayWithCapacity:asset.duration.value];
for (int t = 0; t < asset.duration.value; t = t + 2) {
    CMTime thumbTime = CMTimeMake(t, asset.duration.timescale);
    NSLog(@"Time Scale : %d", asset.duration.timescale);
    NSValue *v = [NSValue valueWithCMTime:thumbTime];
    [thumbTimes addObject:v];
}
NSLog(@"thumbTimes array contains %lu objects", (unsigned long)[thumbTimes count]);
[asset release]; // safe under MRC: the generator retains the asset

AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result != AVAssetImageGeneratorSucceeded) {
        NSLog(@"couldn't generate thumbnail, error: %@", error);
    } else {
        NSLog(@"actual time: %lld/%d (requested: %lld/%d)", actualTime.value, actualTime.timescale, requestedTime.value, requestedTime.timescale);
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        [formatter setDateFormat:@"yyyyMMdd-HHmmss"];
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *filename = [NSString stringWithFormat:@"%@.png", [formatter stringFromDate:[NSDate date]]];
        [formatter release];
        NSString *filepath = [documentsDirectory stringByAppendingPathComponent:filename];
        CFURLRef url = (CFURLRef)[NSURL fileURLWithPath:filepath];
        CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypePNG, 1, NULL);
        CGImageDestinationAddImage(destination, im, NULL);
        if (!CGImageDestinationFinalize(destination)) {
            NSLog(@"Failed to write image to %@", filepath);
        }
        CFRelease(destination);
    }
    //[generator release];
};

CGSize maxSize = CGSizeMake(512, 324);
generator.maximumSize = maxSize;
[generator generateCGImagesAsynchronouslyForTimes:thumbTimes completionHandler:handler];
Thanks in advance,
j.
Recommended answer
Hey @Sooriah Joel, try using the following code. It works fine for me.
- (void)generateCMTimesArrayOfAllFramesUsingAsset:(AVURLAsset *)asset
{
    if (cmTimeArray.count > 0) {
        [cmTimeArray removeAllObjects];
    }
    // Generate a time for every duration tick present in the video
    for (int t = 0; t < asset.duration.value; t++) {
        CMTime thumbTime = CMTimeMake(t, asset.duration.timescale);
        NSValue *v = [NSValue valueWithCMTime:thumbTime];
        [cmTimeArray addObject:v];
    }
    NSLog(@"Array of times %@ count = %lu", cmTimeArray, (unsigned long)cmTimeArray.count);
}
- (void)generateCMTimesArrayOfFrames:(int)framesInterval UsingAsset:(AVURLAsset *)asset
{
    int videoDuration = ceilf((float)asset.duration.value / asset.duration.timescale);
    NSLog(@"Video duration %d seconds (%lld ticks at timescale %d)", videoDuration, asset.duration.value, asset.duration.timescale);
    if (cmTimeArray.count > 0) {
        [cmTimeArray removeAllObjects];
    }
    // Generate framesInterval times for each second of video
    for (int i = 0; i < videoDuration; i++)
    {
        CMTime tempCMTime = CMTimeMake(i, 1);
        int32_t interval = framesInterval;
        for (int j = 1; j < framesInterval + 1; j++)
        {
            CMTime newCMtime = CMTimeMake(j, interval);
            CMTime addition = CMTimeAdd(tempCMTime, newCMtime);
            [cmTimeArray addObject:[NSValue valueWithCMTime:addition]];
        }
    }
    NSLog(@"Array of times %@ count = %lu", cmTimeArray, (unsigned long)cmTimeArray.count);
}
- (void)generateThumbnailsFromVideoURL:(AVURLAsset *)videoAsset
{
    // Generate the CMTimes array of required frames:
    // 1. All frames:
    //[self generateCMTimesArrayOfAllFramesUsingAsset:videoAsset];
    // 2. A specific number of frames per second:
    [self generateCMTimesArrayOfFrames:30 UsingAsset:videoAsset];

    __block int i = 0;
    AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            [framesArray addObject:[UIImage imageWithCGImage:im]];
        }
        if (result == AVAssetImageGeneratorFailed) {
            NSLog(@"Failed with error: %@ code %ld", [error localizedDescription], (long)error.code);
        }
        if (result == AVAssetImageGeneratorCancelled) {
            NSLog(@"Cancelled");
        }
        i++;
        imageIndex = i;
        if (i == cmTimeArray.count) {
            // Thumbnail generation completed
        }
    };

    // Launch the process...
    self.generator = [[AVAssetImageGenerator alloc] initWithAsset:videoAsset];
    self.generator.apertureMode = AVAssetImageGeneratorApertureModeCleanAperture;
    self.generator.appliesPreferredTrackTransform = TRUE;
    self.generator.requestedTimeToleranceBefore = kCMTimeZero;
    self.generator.requestedTimeToleranceAfter = kCMTimeZero;
    self.generator.maximumSize = CGSizeMake(40, 40);
    [self.generator generateCGImagesAsynchronouslyForTimes:cmTimeArray completionHandler:handler];
}