I am trying to build a Mac OS X Objective-C command-line tool that takes a single H.264 file and a single AAC file as input (both encoded from the same source material) and uses AVAssetWriter to create a single QuickTime-compatible mov (and/or m4v) file for further editing/distribution.
What I have done so far:
1) Using the AVFoundation framework components (AVAssetWriter, AVAssetWriterInput, etc.) and the Core Media framework components (CMSampleBufferCreate, CMBlockBufferReplaceDataBytes, etc.), I have built a prototype of the CL tool.
2) I have wired up the input/output file URLs and created the AVAssetWriter, AVAssetWriterInput, CMBlockBuffer, and so on.
3) When I execute the run loop, the AVAssetWriter creates the m4v file; although it is correctly formatted, it is only a 136-byte file representing the movie header atom, with no video track data.
4) I have searched StackOverflow, the Apple forums, and the wider Internet for an answer to my specific problem.
Using my code along with error checking in the Xcode debugger, I have found that the AVAssetWriter is set up correctly and begins building the movie file, but CMBlockBufferReplaceDataBytes does not write the H.264 NAL data into the CMBlockBuffer (as I believe it should). So what am I missing?
Here is the relevant portion of my run-loop code:
// Create the videoFile.m4v AVAssetWriter.
AVAssetWriter *videoFileWriter = [[AVAssetWriter alloc] initWithURL:destinationURL fileType:AVFileTypeQuickTimeMovie error:&error];
NSParameterAssert(videoFileWriter);
if (error) {
NSLog(@"AVAssetWriter initWithURL failed with error= %@", [error localizedDescription]);
}
// Create the video file settings dictionary.
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:1280], AVVideoWidthKey,
                               [NSNumber numberWithInt:720], AVVideoHeightKey, nil];
// Perform video settings check.
if ([videoFileWriter canApplyOutputSettings:videoSettings forMediaType:AVMediaTypeVideo]) {
NSLog(@"videoFileWriter can apply videoSettings...");
}
// Create the input to the videoFileWriter AVAssetWriter.
AVAssetWriterInput *videoFileWriterInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
videoFileWriterInput.expectsMediaDataInRealTime = YES;
NSParameterAssert(videoFileWriterInput);
NSParameterAssert([videoFileWriter canAddInput:videoFileWriterInput]);
// Connect the videoFileWriterInput to the videoFileWriter.
if ([videoFileWriter canAddInput:videoFileWriterInput]) {
[videoFileWriter addInput:videoFileWriterInput];
}
// Get the contents of videoFile.264 (using current Mac OSX methods).
NSData *sourceData = [NSData dataWithContentsOfURL:sourceURL];
const char *videoFileData = [sourceData bytes];
size_t sourceDataLength = [sourceData length];
NSLog(@"The value of 'sourceDataLength' is: %ld", sourceDataLength);
// Set up to create the videoSampleBuffer.
int32_t videoWidth = 1280;
int32_t videoHeight = 720;
CMBlockBufferRef videoBlockBuffer = NULL;
CMFormatDescriptionRef videoFormat = NULL;
CMSampleBufferRef videoSampleBuffer = NULL;
CMItemCount numberOfSampleTimeEntries = 1;
CMItemCount numberOfSamples = 1;
// More set up to create the videoSampleBuffer.
CMVideoFormatDescriptionCreate(kCFAllocatorDefault, kCMVideoCodecType_H264, videoWidth, videoHeight, NULL, &videoFormat);
OSStatus result = noErr;
result = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, 150000, kCFAllocatorDefault,
                                            NULL, 0, 150000, kCMBlockBufferAssureMemoryNowFlag,
                                            &videoBlockBuffer);
NSLog(@"After 'CMBlockBufferCreateWithMemoryBlock', 'result' is: %d", result);
// The CMBlockBufferReplaceDataBytes method is supposed to write videoFile.264 data bytes into the videoSampleBuffer.
result = CMBlockBufferReplaceDataBytes(videoFileData, videoBlockBuffer, 0, 150000);
NSLog(@"After 'CMBlockBufferReplaceDataBytes', 'result' is: %d", result);
CMSampleTimingInfo videoSampleTimingInformation = {CMTimeMake(1, 30)};
result = CMSampleBufferCreate(kCFAllocatorDefault, videoBlockBuffer, TRUE, NULL, NULL, videoFormat,
                              numberOfSamples, numberOfSampleTimeEntries, &videoSampleTimingInformation,
                              0, NULL, &videoSampleBuffer);
NSLog(@"After 'CMSampleBufferCreate', 'result' is: %d", result);
// Set the videoSampleBuffer to ready (is this needed?).
result = CMSampleBufferMakeDataReady(videoSampleBuffer);
NSLog(@"After 'CMSampleBufferMakeDataReady', 'result' is: %d", result);
// Start writing...
if ([videoFileWriter startWriting]) {
[videoFileWriter startSessionAtSourceTime:kCMTimeZero];
}
// Start the first while loop (DEBUG)...
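(For context, the while loop elided above would need to append the sample buffer and finish the session. A minimal sketch of that step, assuming a serial dispatch queue named `writerQueue` introduced here for illustration, might look like:)

```objectivec
// Hypothetical continuation: append the sample buffer once the input is
// ready, then finish the session. 'writerQueue' is an assumed serial queue.
dispatch_queue_t writerQueue = dispatch_queue_create("com.example.writer", NULL);
[videoFileWriterInput requestMediaDataWhenReadyOnQueue:writerQueue usingBlock:^{
    if ([videoFileWriterInput isReadyForMoreMediaData]) {
        [videoFileWriterInput appendSampleBuffer:videoSampleBuffer];
        [videoFileWriterInput markAsFinished];
        [videoFileWriter finishWritingWithCompletionHandler:^{
            NSLog(@"Finished writing, status = %ld", (long)videoFileWriter.status);
        }];
    }
}];
```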
All ideas, comments, and suggestions are welcome.
Thanks!
Best answer
The problem is that in your call to CMSampleBufferCreate you are passing zero for numSampleSizeEntries and NULL for sampleSizeArray. If you don't pass any sample sizes, nothing gets written!
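Concretely, the corrected call would pass a one-entry size array. A sketch, assuming a single sample whose size is the full 150000-byte block used in the question's code:

```objectivec
// Supply the sample-size array that was missing from the original call.
// One sample covering the whole block, matching the question's 150000 bytes.
CMItemCount numberOfSampleSizeEntries = 1;
size_t sampleSizes[1] = {150000};

result = CMSampleBufferCreate(kCFAllocatorDefault,
                              videoBlockBuffer,
                              TRUE,                       // dataReady
                              NULL, NULL,                 // no makeDataReady callback
                              videoFormat,
                              numberOfSamples,
                              numberOfSampleTimeEntries,
                              &videoSampleTimingInformation,
                              numberOfSampleSizeEntries,  // was 0
                              sampleSizes,                // was NULL
                              &videoSampleBuffer);
```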