Problem Description
After a detailed review of WWDC 2014 Session 513, I am trying to write an app on iOS 8.0 to decode and display a live H.264 stream. First of all, I construct the H.264 parameter set successfully. When I get an I-frame with a 4-byte start code, such as "0x00 0x00 0x00 0x01 0x65 ...", I put it into a CMBlockBuffer. Then I construct a CMSampleBuffer using the previous CMBlockBuffer. After that, I put the CMSampleBuffer into an AVSampleBufferDisplayLayer. Everything seems OK (I checked the returned values), but the AVSampleBufferDisplayLayer does not show any video image. Since these APIs are fairly new to everyone, I couldn't find anybody who could resolve this problem.
I'll give the key code below, and I'd really appreciate it if you could help figure out why the video image can't be displayed. Thanks a lot.
(1) The AVSampleBufferDisplayLayer is initialised. dspLayer is an Objective-C property of my main view controller.
@property (nonatomic, strong) AVSampleBufferDisplayLayer *dspLayer;

if (!_dspLayer) {
    _dspLayer = [[AVSampleBufferDisplayLayer alloc] init];
    [_dspLayer setFrame:CGRectMake(90, 551, 557, 389)];
    _dspLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    _dspLayer.backgroundColor = [UIColor grayColor].CGColor;

    // Drive the layer from the host clock, starting at time zero with rate 1.
    CMTimebaseRef tmBase = NULL;
    CMTimebaseCreateWithMasterClock(NULL, CMClockGetHostTimeClock(), &tmBase);
    _dspLayer.controlTimebase = tmBase;
    CMTimebaseSetTime(_dspLayer.controlTimebase, kCMTimeZero);
    CMTimebaseSetRate(_dspLayer.controlTimebase, 1.0);

    [self.view.layer addSublayer:_dspLayer];
}
(2) In another thread, I get an H.264 I-frame.

// construct the H.264 parameter set: OK
CMVideoFormatDescriptionRef formatDesc;
OSStatus formatCreateResult =
    CMVideoFormatDescriptionCreateFromH264ParameterSets(NULL, ppsNum + 1, props, sizes, 4, &formatDesc);
NSLog(@"construct h264 param set: %d", (int)formatCreateResult);
// construct the CMBlockBuffer; dataBuf points to the H.264 data, starting with "0x00 0x00 0x00 0x01 0x65 ..."
CMBlockBufferRef blockBufferOut = NULL;
CMBlockBufferCreateEmpty(0, 0, kCMBlockBufferAlwaysCopyDataFlag, &blockBufferOut);
CMBlockBufferAppendMemoryBlock(blockBufferOut,
                               dataBuf,
                               dataLen,
                               NULL,   // blockAllocator
                               NULL,   // customBlockSource
                               0,      // offsetToData
                               dataLen,
                               kCMBlockBufferAlwaysCopyDataFlag);
// construct the CMSampleBuffer: OK
size_t sampleSizeArray[1] = {0};
sampleSizeArray[0] = CMBlockBufferGetDataLength(blockBufferOut);
CMSampleTimingInfo tmInfos[1] = {
    { CMTimeMake(5, 1), CMTimeMake(5, 1), CMTimeMake(5, 1) }   // duration, pts, dts
};
CMSampleBufferRef sampBuf = NULL;
formatCreateResult = CMSampleBufferCreate(kCFAllocatorDefault,
                                          blockBufferOut,
                                          YES,    // dataReady
                                          NULL,   // makeDataReadyCallback
                                          NULL,   // makeDataReadyRefcon
                                          formatDesc,
                                          1,      // numSamples
                                          1,      // numSampleTimingEntries
                                          tmInfos,
                                          1,      // numSampleSizeEntries
                                          sampleSizeArray,
                                          &sampBuf);
// enqueue to the AVSampleBufferDisplayLayer, just one frame. But I can't see any video frame in my view.
if ([self.dspLayer isReadyForMoreMediaData]) {
    [self.dspLayer enqueueSampleBuffer:sampBuf];
}
[self.dspLayer setNeedsDisplay];
Recommended Answer
Your NAL unit start codes 0x00 0x00 0x01 or 0x00 0x00 0x00 0x01 need to be replaced by a length header.
The WWDC session you are referring to states clearly that the Annex B start codes need to be replaced by AVCC-conformant length headers. You are essentially remuxing from Annex B stream format to MP4 file format on the fly here (a simplified description, of course).
When you created the parameter set you passed 4 as the NAL unit header length, so you need to prefix your VCL NAL units with a 4-byte length header. You have to specify it because in AVCC format the length header can also be shorter.
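A minimal sketch of that replacement, reusing the dataBuf/dataLen names from the question and assuming the buffer holds exactly one NAL unit with a 4-byte start code:

#import <CoreFoundation/CFByteOrder.h>
#import <string.h>

// Overwrite the 4-byte Annex B start code (00 00 00 01) with a 4-byte
// big-endian AVCC length header, in place. Returns NO if the buffer does
// not begin with a 4-byte start code.
static BOOL ConvertAnnexBToAVCC(uint8_t *dataBuf, size_t dataLen)
{
    if (dataLen <= 4 ||
        dataBuf[0] != 0x00 || dataBuf[1] != 0x00 ||
        dataBuf[2] != 0x00 || dataBuf[3] != 0x01) {
        return NO;
    }
    uint32_t nalLen = CFSwapInt32HostToBig((uint32_t)(dataLen - 4));
    memcpy(dataBuf, &nalLen, sizeof(nalLen));   // length field replaces the start code
    return YES;
}

Call this before appending the memory block to the CMBlockBuffer; if one access unit contains several NAL units, every start code in it has to be replaced the same way.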
Whatever you put inside the CMSampleBuffer will be accepted; there is no sanity check that the contents can be decoded, only that you supplied the required parameters: arbitrary data combined with timing information and a parameter set.
Basically, with the data you put in, you told the decoder that the VCL NAL unit is 1 byte long. The decoder doesn't get the full NAL unit and bails out with an error.
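To see what the decoder sees, read the first four bytes of the question's buffer as a big-endian length field (a tiny illustration, not real decoder code):

uint8_t header[4] = {0x00, 0x00, 0x00, 0x01};   // the Annex B start code
uint32_t lengthSeen;
memcpy(&lengthSeen, header, sizeof(lengthSeen));
lengthSeen = CFSwapInt32BigToHost(lengthSeen);
// lengthSeen == 1: the decoder thinks the NAL unit payload is the single
// byte 0x65 and ignores the rest of the frame.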
Also make sure that when you create the parameter set, the SPS/PPS do not have a length header added and that their Annex B start codes are stripped as well.
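For example, if your SPS and PPS arrive with 4-byte Annex B start codes, pass pointers past the start codes (spsBuf/ppsBuf/spsLen/ppsLen are hypothetical names for the raw buffers):

#import <CoreMedia/CoreMedia.h>

// Hypothetical raw buffers: spsBuf/ppsBuf still begin with the 4-byte
// Annex B start code; spsLen/ppsLen include it. Core Media wants pointers
// to the bare NAL unit payloads.
const uint8_t *paramSets[2]  = { spsBuf + 4, ppsBuf + 4 };
const size_t   paramSizes[2] = { spsLen - 4, ppsLen - 4 };

CMVideoFormatDescriptionRef formatDesc = NULL;
OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
    kCFAllocatorDefault,
    2,            // one SPS + one PPS
    paramSets,
    paramSizes,
    4,            // NAL unit header length: 4-byte AVCC length prefixes
    &formatDesc);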
Also, I recommend not using AVSampleBufferDisplayLayer but going through a VTDecompressionSession, so you can do things like color correction or whatever else you need in a pixel shader.
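A minimal sketch of that route, reusing formatDesc and sampBuf from the code above (the callback name is illustrative):

#import <VideoToolbox/VideoToolbox.h>

// Called once per decoded frame; imageBuffer is a CVPixelBuffer you can
// hand to your own renderer (Metal/OpenGL shader, Core Image, etc.).
static void DidDecodeFrame(void *decompressionOutputRefCon,
                           void *sourceFrameRefCon,
                           OSStatus status,
                           VTDecodeInfoFlags infoFlags,
                           CVImageBufferRef imageBuffer,
                           CMTime presentationTimeStamp,
                           CMTime presentationDuration)
{
    if (status == noErr && imageBuffer != NULL) {
        // process/display the decoded frame here
    }
}

// In your decode path: create the session from the same format description.
VTDecompressionSessionRef session = NULL;
VTDecompressionOutputCallbackRecord callback = { DidDecodeFrame, NULL };
OSStatus status = VTDecompressionSessionCreate(kCFAllocatorDefault,
                                               formatDesc,
                                               NULL,   // decoder specification
                                               NULL,   // destination pixel buffer attributes
                                               &callback,
                                               &session);

// Feed it the same AVCC-formatted sample buffers you built earlier.
if (status == noErr) {
    VTDecompressionSessionDecodeFrame(session,
                                      sampBuf,
                                      kVTDecodeFrame_EnableAsynchronousDecompression,
                                      NULL,   // sourceFrameRefCon
                                      NULL);  // infoFlagsOut
}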