nevyn · 4 · tags: core-image core-video core-media ios replaykit
I'm receiving CMSampleBufferRefs from a system API that contain CVPixelBufferRefs which are not RGBA (linear pixels). The buffers contain planar pixels (such as 420f, aka kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, aka yCbCr, aka YUV).
I would like to do some manipulation of this video data before sending it off to VideoToolbox to be encoded to h264 (drawing some text, overlaying a logo, rotating the image, etc.), but I'd like it to be efficient and real-time. Buuuut planar image data looks pretty messy to work with: there's the chroma plane and the luma plane, they have different sizes, and working with it at the byte level seems like a lot of work.
I could probably use a CGContextRef and just paint right on top of the pixels, but from what I can gather it only supports RGBA pixels. Any tips on how I can do this with as little data copying, and as few lines of code, as possible?
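(For context, a quick sketch of inspecting what actually arrives; `sampleBuffer` here stands for the incoming CMSampleBufferRef from the capture callback:)
CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
OSType format = CVPixelBufferGetPixelFormatType(pixels);
if(CVPixelBufferIsPlanar(pixels)) {
    // Bi-planar Y'CbCr: plane 0 is full-resolution luma, plane 1 is interleaved CbCr at half resolution.
    NSLog(@"format '%c%c%c%c', %zu planes, luma %zux%zu, chroma %zux%zu",
          (char)((format >> 24) & 0xFF), (char)((format >> 16) & 0xFF),
          (char)((format >> 8) & 0xFF), (char)(format & 0xFF),
          CVPixelBufferGetPlaneCount(pixels),
          CVPixelBufferGetWidthOfPlane(pixels, 0), CVPixelBufferGetHeightOfPlane(pixels, 0),
          CVPixelBufferGetWidthOfPlane(pixels, 1), CVPixelBufferGetHeightOfPlane(pixels, 1));
}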
CGBitmapContextRef can only paint into something like 32ARGB, correct. This means that you will want to create ARGB (or RGBA) buffers, and then find a way to very quickly transfer the YUV pixels onto that ARGB surface. The recipe involves CoreImage, a home-made CVPixelBufferRef obtained through a pool, a CGBitmapContextRef referencing your home-made pixel buffer, and then recreating a CMSampleBufferRef resembling your input buffer but referencing your output pixels. In other words:
1. Fetch the incoming pixels into a CIImage.
2. Create a CVPixelBufferPool with the pixel format and output dimensions you are creating. You don't want to create CVPixelBuffers without a pool in real time: you will run out of memory if your producer is too fast, you will fragment your RAM since you won't be reusing buffers, and it's a waste of cycles.
3. Create a CIContext with the default constructor and share it between buffers. It holds no external state, but the documentation says that recreating it on every frame is very expensive.
4. On each incoming frame, create a new pixel buffer from the pool. Use an allocation threshold so you don't get runaway memory usage.
5. Lock the pixel buffer.
6. Create a bitmap context referencing the bytes in the pixel buffer.
7. Use the CIContext to render the planar image data into the linear buffer.
8. Perform your app-specific drawing in the CGContext.
9. Unlock the pixel buffer.
10. Fetch the timing info of the original sample buffer.
11. Create a CMVideoFormatDescriptionRef by asking the pixel buffer for its exact format.
12. Create a sample buffer for the pixel buffer and hand it off to your consumer.
Here's a sample implementation, where I have chosen 32ARGB as the image format to work with, as that's something both CGBitmapContext and CoreVideo enjoy working with on iOS:
{
    CVPixelBufferPoolRef _pool;
    CGSize _poolBufferDimensions;
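    // Assumptions, not shown in the original snippet: these are set up once, e.g. in init.
    CIContext *_imageContext; // shared Core Image context from step 3, e.g. [CIContext context]
    id _consumer;             // downstream object that receives the finished sample buffers (step 12)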
}
- (void)_processSampleBuffer:(CMSampleBufferRef)inputBuffer
{
    // 1. Input data
    CVPixelBufferRef inputPixels = CMSampleBufferGetImageBuffer(inputBuffer);
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:inputPixels];

    // 2. Create a new pool if the old pool doesn't have the right format.
    CGSize bufferDimensions = {CVPixelBufferGetWidth(inputPixels), CVPixelBufferGetHeight(inputPixels)};
    if(!_pool || !CGSizeEqualToSize(bufferDimensions, _poolBufferDimensions)) {
        if(_pool) {
            CFRelease(_pool);
        }
        OSStatus ok0 = CVPixelBufferPoolCreate(NULL,
            NULL, // pool attrs
            (__bridge CFDictionaryRef)(@{
                (__bridge id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32ARGB),
                (__bridge id)kCVPixelBufferWidthKey: @(bufferDimensions.width),
                (__bridge id)kCVPixelBufferHeightKey: @(bufferDimensions.height),
            }), // buffer attrs
            &_pool
        );
        _poolBufferDimensions = bufferDimensions;
        assert(ok0 == noErr);
    }

    // 4. Create pixel buffer
    CVPixelBufferRef outputPixels;
    OSStatus ok1 = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(NULL,
        _pool,
        (__bridge CFDictionaryRef)@{
            // Opt to fail buffer creation in case of slow buffer consumption
            // rather than to exhaust all memory.
            (__bridge id)kCVPixelBufferPoolAllocationThresholdKey: @20
        }, // aux attributes
        &outputPixels
    );
    if(ok1 == kCVReturnWouldExceedAllocationThreshold) {
        // Dropping frame because consumer is too slow
        return;
    }
    assert(ok1 == noErr);

    // 5, 6. Graphics context to draw in
    CGColorSpaceRef deviceColors = CGColorSpaceCreateDeviceRGB();
    OSStatus ok2 = CVPixelBufferLockBaseAddress(outputPixels, 0);
    assert(ok2 == noErr);
    CGContextRef cg = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(outputPixels), // bytes
        CVPixelBufferGetWidth(inputPixels), CVPixelBufferGetHeight(inputPixels), // dimensions
        8, // bits per component
        CVPixelBufferGetBytesPerRow(outputPixels), // bytes per row
        deviceColors, // color space
        (CGBitmapInfo)kCGImageAlphaPremultipliedFirst // bitmap info
    );
    CFRelease(deviceColors);
    assert(cg != NULL);

    // 7. Render the planar input image into the linear ARGB buffer
    [_imageContext render:inputImage toCVPixelBuffer:outputPixels];

    // 8. DRAW
    CGContextSetRGBFillColor(cg, 0.5, 0, 0, 1);
    CGContextSetTextDrawingMode(cg, kCGTextFill);
    NSAttributedString *text = [[NSAttributedString alloc] initWithString:@"Hello world" attributes:nil];
    CTLineRef line = CTLineCreateWithAttributedString((__bridge CFAttributedStringRef)text);
    CTLineDraw(line, cg);
    CFRelease(line);

    // 9. Unlock and stop drawing
    CFRelease(cg);
    CVPixelBufferUnlockBaseAddress(outputPixels, 0);

    // 10. Timings
    CMSampleTimingInfo timingInfo;
    OSStatus ok4 = CMSampleBufferGetSampleTimingInfo(inputBuffer, 0, &timingInfo);
    assert(ok4 == noErr);

    // 11. Video format
    CMVideoFormatDescriptionRef videoFormat;
    OSStatus ok5 = CMVideoFormatDescriptionCreateForImageBuffer(NULL, outputPixels, &videoFormat);
    assert(ok5 == noErr);

    // 12. Output sample buffer
    CMSampleBufferRef outputBuffer;
    OSStatus ok3 = CMSampleBufferCreateForImageBuffer(NULL, // allocator
        outputPixels, // image buffer
        YES, // data ready
        NULL, // make ready callback
        NULL, // make ready refcon
        videoFormat,
        &timingInfo, // timing info
        &outputBuffer // out
    );
    assert(ok3 == noErr);

    [_consumer consumeSampleBuffer:outputBuffer];
    CFRelease(outputPixels);
    CFRelease(videoFormat);
    CFRelease(outputBuffer);
}
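Since the question mentions handing the frames off to VideoToolbox for H.264 encoding, here is a rough sketch of what the `_consumer` end could look like. This is an assumption, not part of the answer above: `_encoder` is a hypothetical VTCompressionSessionRef ivar created lazily on the first frame, and the output callback simply forwards the encoded sample buffers.
#import <VideoToolbox/VideoToolbox.h>

// Assumed ivar elsewhere: VTCompressionSessionRef _encoder;

static void EncoderOutputCallback(void *refcon, void *frameRefcon,
                                  OSStatus status, VTEncodeInfoFlags flags,
                                  CMSampleBufferRef encodedBuffer)
{
    if(status != noErr || encodedBuffer == NULL) {
        return;
    }
    // Hand the H.264 sample buffer to an AVAssetWriterInput, a network sender, etc.
}

- (void)consumeSampleBuffer:(CMSampleBufferRef)buffer
{
    CVImageBufferRef pixels = CMSampleBufferGetImageBuffer(buffer);
    if(!_encoder) {
        OSStatus ok = VTCompressionSessionCreate(NULL,
            (int32_t)CVPixelBufferGetWidth(pixels),
            (int32_t)CVPixelBufferGetHeight(pixels),
            kCMVideoCodecType_H264,
            NULL, NULL, NULL, // encoder spec, source attrs, compressed data allocator
            EncoderOutputCallback, (__bridge void *)self,
            &_encoder);
        assert(ok == noErr);
        VTSessionSetProperty(_encoder, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
    }
    VTCompressionSessionEncodeFrame(_encoder,
        pixels,
        CMSampleBufferGetPresentationTimeStamp(buffer),
        CMSampleBufferGetDuration(buffer),
        NULL,  // frame properties
        NULL,  // source frame refcon
        NULL); // info flags out
}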