AVSampleBufferDisplayLayer not rendering on device

luk*_*ura 3 video objective-c avfoundation calayer ios

I'm struggling with AVSampleBufferDisplayLayer on iOS. I want to use this layer to display a CVPixelBuffer, but I can't get it to work on an actual iOS device. In my sample app I'm trying to display a solid-color pixel buffer with the following code:

@implementation ViewController {
    AVSampleBufferDisplayLayer *videoLayer;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    videoLayer.frame = CGRectMake(50, 50, 300, 300);
    videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    [self.view.layer addSublayer:videoLayer];
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self startVideo];
}

- (void)startVideo {
    [self drawPixelBuffer];
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(drawPixelBuffer) userInfo:nil repeats:YES];
}

- (void)drawPixelBuffer {

    int imageSize = 100;

    static const uint8_t pixel[] = {0x00, 0xAA, 0xFF, 0xFF};

    NSMutableData *frame = [NSMutableData data];

    for (int i = 0; i < imageSize * imageSize; i++) {
        [frame appendBytes:pixel length:4];
    }

    CVPixelBufferRef pixelBuffer = NULL;

    CVPixelBufferCreateWithBytes(NULL, imageSize, imageSize, kCVPixelFormatType_32BGRA, [frame bytes], imageSize * 4, NULL, NULL, NULL, &pixelBuffer);

    CMSampleBufferRef sampleBuffer = [self sampleBufferFromPixelBuffer:pixelBuffer];

    if (sampleBuffer) {

        [videoLayer enqueueSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);

    }

}

- (CMSampleBufferRef)sampleBufferFromPixelBuffer:(CVPixelBufferRef)pixelBuffer {

    CMSampleBufferRef sampleBuffer = NULL;
    OSStatus err = noErr;
    CMVideoFormatDescriptionRef formatDesc = NULL;
    err = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc);

    if (err != noErr) {
        return nil;
    }

    CMSampleTimingInfo sampleTimingInfo = kCMTimingInfoInvalid;

    err = CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, pixelBuffer, formatDesc, &sampleTimingInfo, &sampleBuffer);

    if (sampleBuffer) {
        CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
        CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
        CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
    }

    if (err != noErr) {
        return nil;
    }

    // Release the format description; the sample buffer holds its own reference.
    CFRelease(formatDesc);

    return sampleBuffer;

}

@end

This works without any problems in the iOS Simulator, but on a real device nothing is rendered. The layer's error property is always nil, and its status is always AVQueuedSampleBufferRenderingStatusRendering.

Thanks for your help.

Ste*_*her 9

The graphics implementation in the simulator is much more forgiving, and you can often get away with things there that will not work on a device. There are two common causes:

The pixel buffer should be IOSurface-backed

You are creating your pixel buffer directly from bytes with CVPixelBufferCreateWithBytes. Try CVPixelBufferCreate instead, with the kCVPixelBufferIOSurfacePropertiesKey attribute set to an empty dictionary:

CVPixelBufferCreate(
    NULL,
    imageSize,
    imageSize,
    kCVPixelFormatType_32BGRA,
    (__bridge CFDictionaryRef)@{
        (id)kCVPixelBufferIOSurfacePropertiesKey: @{}
    },
    &pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
void *bytes = CVPixelBufferGetBaseAddress(pixelBuffer);
// Write image data directly to that address
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Really, whatever is producing those pixels should write directly into a CVPixelBufferRef whenever possible.
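
Putting both points together, here is a minimal sketch of how the question's drawPixelBuffer could be rewritten to fill an IOSurface-backed buffer in place. It reuses the videoLayer ivar, the 100×100 BGRA test color, and the sampleBufferFromPixelBuffer: helper from the question; it illustrates the approach rather than being code from the original answer.

- (void)drawPixelBuffer {

    const size_t imageSize = 100;
    static const uint8_t pixel[] = {0x00, 0xAA, 0xFF, 0xFF}; // BGRA

    // Create an IOSurface-backed pixel buffer instead of wrapping external bytes.
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn result = CVPixelBufferCreate(
        kCFAllocatorDefault,
        imageSize,
        imageSize,
        kCVPixelFormatType_32BGRA,
        (__bridge CFDictionaryRef)@{
            (id)kCVPixelBufferIOSurfacePropertiesKey: @{}
        },
        &pixelBuffer);

    if (result != kCVReturnSuccess) {
        return;
    }

    // Write the test color directly into the buffer, honoring its bytes-per-row,
    // which may be larger than width * 4 because of row padding.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    for (size_t y = 0; y < imageSize; y++) {
        uint8_t *row = base + y * bytesPerRow;
        for (size_t x = 0; x < imageSize; x++) {
            memcpy(row + x * 4, pixel, sizeof(pixel));
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    CMSampleBufferRef sampleBuffer = [self sampleBufferFromPixelBuffer:pixelBuffer];
    if (sampleBuffer) {
        [videoLayer enqueueSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }

    // The sample buffer retains the image buffer, so it is safe to release it here.
    CVPixelBufferRelease(pixelBuffer);
}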

Some formats are not supported on the device

For kCVPixelFormatType_32BGRA this seems very unlikely, but I have seen other formats that are only supported in the simulator, for example kCVPixelFormatType_422YpCbCr8. In those cases you either have to convert to a compatible format first, or implement a custom renderer (OpenGL, Metal, etc.).
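
One way to catch a format the device rejects is to inspect the layer's status right after enqueueing. This is a small diagnostic sketch, not part of the original answer, assuming the same videoLayer and sampleBuffer as above:

[videoLayer enqueueSampleBuffer:sampleBuffer];

// If the layer cannot render the enqueued buffers, it moves to the Failed
// state and exposes the underlying error; flush it before enqueueing again.
if (videoLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
    NSLog(@"AVSampleBufferDisplayLayer failed: %@", videoLayer.error);
    [videoLayer flush];
}

Note that not every incompatibility shows up this way; as the question reports, a missing IOSurface backing can leave the layer in the Rendering state with a nil error while still drawing nothing.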