zou*_*oul · 14 · tags: avfoundation, core-video, core-media, ios
I want to convert a CGImage to a CMSampleBufferRef and append it to an AVAssetWriterInput using the appendSampleBuffer: method. I've managed to build the CMSampleBufferRef with the code below, but appendSampleBuffer: simply returns NO when I pass it the result. What am I doing wrong?
- (void) appendCGImage: (CGImageRef) frame
{
    const size_t width = CGImageGetWidth(frame);
    const size_t height = CGImageGetHeight(frame);

    // Create a dummy pixel buffer to try the encoding
    // on something simple.
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
        kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    NSParameterAssert(status == kCVReturnSuccess && pixelBuffer != NULL);

    // Sample timing info.
    CMTime frameTime = CMTimeMake(1, 30);
    CMTime currentTime = CMTimeAdd(lastSampleTime, frameTime);
    CMSampleTimingInfo timing = {frameTime, currentTime, kCMTimeInvalid};

    OSStatus result = 0;

    // Sample format.
    CMVideoFormatDescriptionRef videoInfo = NULL;
    result = CMVideoFormatDescriptionCreateForImageBuffer(NULL,
        pixelBuffer, &videoInfo);
    NSParameterAssert(result == 0 && videoInfo != NULL);

    // Create sample buffer.
    CMSampleBufferRef sampleBuffer = NULL;
    result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
        pixelBuffer, true, NULL, NULL, videoInfo, &timing, &sampleBuffer);
    NSParameterAssert(result == 0 && sampleBuffer != NULL);

    // Ship out the frame.
    NSParameterAssert(CMSampleBufferDataIsReady(sampleBuffer));
    NSParameterAssert([writerInput isReadyForMoreMediaData]);
    BOOL success = [writerInput appendSampleBuffer:sampleBuffer];
    NSParameterAssert(success); // no go :(
}
PS: I know there is a memory leak in this code; I've left some parts out for the sake of simplicity.
Aha, I had completely missed the AVAssetWriterInputPixelBufferAdaptor class, which exists precisely for piping pixel buffers into a writer input. Now the code works, even without the messy CMSampleBuffer business.
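For reference, a minimal sketch of the adaptor-based approach. It assumes the adaptor was created once as an ivar (`adaptor`, attached to `writerInput` before the writer started writing), that `lastSampleTime` is tracked by the caller as in the question, and the helper name `appendPixelBufferForCGImage:` is made up for this example:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

- (void) appendPixelBufferForCGImage: (CGImageRef) frame
{
    // Grab a pixel buffer from the adaptor's pool instead of creating one
    // from scratch; the pool is sized from sourcePixelBufferAttributes.
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL,
        adaptor.pixelBufferPool, &pixelBuffer);
    NSParameterAssert(status == kCVReturnSuccess && pixelBuffer != NULL);

    // Draw the CGImage into the pixel buffer's backing memory.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(pixelBuffer),
        CGImageGetWidth(frame), CGImageGetHeight(frame), 8,
        CVPixelBufferGetBytesPerRow(pixelBuffer), colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context,
        CGRectMake(0, 0, CGImageGetWidth(frame), CGImageGetHeight(frame)),
        frame);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // No CMSampleBuffer needed: the adaptor wraps the pixel buffer and
    // timestamps it itself.
    CMTime frameTime = CMTimeMake(1, 30);
    lastSampleTime = CMTimeAdd(lastSampleTime, frameTime);
    NSParameterAssert([writerInput isReadyForMoreMediaData]);
    BOOL success = [adaptor appendPixelBuffer:pixelBuffer
                        withPresentationTime:lastSampleTime];
    NSParameterAssert(success);
    CVPixelBufferRelease(pixelBuffer);
}
```

The adaptor itself would be set up once, e.g. with +assetWriterInputPixelBufferAdaptorWithAssetWriterInput:sourcePixelBufferAttributes: passing kCVPixelBufferPixelFormatTypeKey = kCVPixelFormatType_32BGRA, so the pool hands out buffers matching the bitmap context above.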
Views: 7282