After noticing that iOS 8 gives programmers access to the hardware H.264 decoder, I wanted to use it right away. The WWDC 2014 session "Direct Access to Video Encoding and Decoding" gives a good introduction; you can have a look at it here.
Based on case 1, I started developing an application that should be able to receive an H264-RTP-UDP stream from GStreamer, pull it into an 'appsink' element to get direct access to the NAL units, and convert them into CMSampleBuffers that my AVSampleBufferDisplayLayer can then display.
The interesting piece of code that does all of this:
//
//  GStreamerBackend.m
// 
#import "GStreamerBackend.h"
NSString * const naluTypesStrings[] = {
    @"Unspecified (non-VCL)",
    @"Coded slice of a non-IDR picture (VCL)",
    @"Coded slice data partition A (VCL)",
    @"Coded slice data partition B (VCL)",
    @"Coded slice data partition C (VCL)",
    @"Coded slice of an IDR picture (VCL)",
    @"Supplemental enhancement information (SEI) (non-VCL)",
    @"Sequence parameter set (non-VCL)",
    @"Picture parameter set (non-VCL)",
    @"Access unit delimiter (non-VCL)",
    @"End of sequence (non-VCL)",
    @"End of stream (non-VCL)", …

After reviewing WWDC 2014 Session 513 in detail, I tried to write an application on iOS 8.0 that decodes and displays a live H.264 stream. First, I construct the H.264 parameter sets successfully. When I get a frame with a 4-byte start code, like "0x00 0x00 0x00 0x01 0x65 ...", I put it into a CMBlockBuffer. Then I construct a CMSampleBuffer from that CMBlockBuffer. After that, I enqueue the CMSampleBuffer into an AVSampleBufferDisplayLayer. Everything succeeds (I checked the returned values), but the AVSampleBufferDisplayLayer does not show any video image. Since these APIs are fairly new to everyone, I could not find anybody who could resolve this problem.
I will give the key code below; I would really appreciate it if you could help figure out why the video image cannot be displayed. Thanks a lot.
(1) Initialize the AVSampleBufferDisplayLayer. dspLayer is a property of my main view controller.
    @property (nonatomic, strong) AVSampleBufferDisplayLayer *dspLayer;

if (!_dspLayer)
{
    _dspLayer = [[AVSampleBufferDisplayLayer alloc] init];
    [_dspLayer setFrame:CGRectMake(90, 551, 557, 389)];
    _dspLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    _dspLayer.backgroundColor = [UIColor grayColor].CGColor;

    CMTimebaseRef tmBase = nil;
    CMTimebaseCreateWithMasterClock(NULL, CMClockGetHostTimeClock(), &tmBase);
    _dspLayer.controlTimebase = tmBase;
    CMTimebaseSetTime(_dspLayer.controlTimebase, kCMTimeZero);
    CMTimebaseSetRate(_dspLayer.controlTimebase, 1.0);

    [self.view.layer addSublayer:_dspLayer];
}
(2) In another thread, I get an H.264 I-frame.

// Constructing the H.264 parameter sets succeeds:
    CMVideoFormatDescriptionRef formatDesc;
    OSStatus formatCreateResult =
    CMVideoFormatDescriptionCreateFromH264ParameterSets(NULL, ppsNum+1, props, sizes, 4, &formatDesc);
    NSLog(@"construct h264 param set: %d", (int)formatCreateResult);
// Constructing the CMBlockBuffer.
// dataBuf points to the H.264 data, which starts with "0x00 0x00 0x00 0x01 0x65 ...".
    CMBlockBufferRef blockBufferOut = nil;
    CMBlockBufferCreateEmpty (0,0,kCMBlockBufferAlwaysCopyDataFlag, &blockBufferOut);
    CMBlockBufferAppendMemoryBlock(blockBufferOut,
                                    dataBuf,
                                    dataLen,
                                    NULL,
                                    NULL,
                                    0,
                                    dataLen,
                                    kCMBlockBufferAlwaysCopyDataFlag); …