Tag: cmsamplebufferref

iOS - Scaling and cropping a CMSampleBufferRef/CVImageBufferRef

I am using AVFoundation and getting sample buffers from an AVCaptureVideoDataOutput. I can write them directly to the videoWriter with the following code:

- (void)writeBufferFrame:(CMSampleBufferRef)sampleBuffer {
    CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);    
    if(self.videoWriter.status != AVAssetWriterStatusWriting)
    {
        [self.videoWriter startWriting];
        [self.videoWriter startSessionAtSourceTime:lastSampleTime];
    }

    [self.videoWriterInput appendSampleBuffer:sampleBuffer];

}

What I want to do now is crop and scale the image inside the CMSampleBufferRef without converting it to a UIImage or CGImageRef, since that degrades performance.
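One possible approach (a sketch under stated assumptions, not a confirmed solution) is to stay at the CVPixelBuffer level and use the Accelerate framework: lock the image buffer, offset its base address to crop, and scale with vImageScale_ARGB8888 into a destination pixel buffer. This assumes the capture output is configured for kCVPixelFormatType_32BGRA; the scaled buffer would then be appended through an AVAssetWriterInputPixelBufferAdaptor rather than appendSampleBuffer:.

#import <Accelerate/Accelerate.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: crop and scale without going through UIImage/CGImage.
CVPixelBufferRef CreateScaledPixelBuffer(CMSampleBufferRef sampleBuffer,
                                         CGRect cropRect,
                                         size_t outWidth, size_t outHeight)
{
    CVImageBufferRef src = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(src, kCVPixelBufferLock_ReadOnly);

    // Crop by pointing at the top-left pixel of the crop rect (4 bytes per BGRA pixel).
    size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(src);
    unsigned char *cropBase = (unsigned char *)CVPixelBufferGetBaseAddress(src)
        + (size_t)cropRect.origin.y * srcBytesPerRow
        + (size_t)cropRect.origin.x * 4;
    vImage_Buffer srcBuf = { cropBase, (vImagePixelCount)cropRect.size.height,
                             (vImagePixelCount)cropRect.size.width, srcBytesPerRow };

    CVPixelBufferRef dst = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, outWidth, outHeight,
                        kCVPixelFormatType_32BGRA, NULL, &dst);
    CVPixelBufferLockBaseAddress(dst, 0);
    vImage_Buffer dstBuf = { CVPixelBufferGetBaseAddress(dst),
                             (vImagePixelCount)outHeight, (vImagePixelCount)outWidth,
                             CVPixelBufferGetBytesPerRow(dst) };

    // vImageScale_ARGB8888 works on any 8-bit, 4-channel layout, including BGRA.
    vImageScale_ARGB8888(&srcBuf, &dstBuf, NULL, kvImageNoFlags);

    CVPixelBufferUnlockBaseAddress(dst, 0);
    CVPixelBufferUnlockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
    return dst; // caller releases
}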

iphone objective-c avfoundation ios cmsamplebufferref

22 votes · 3 answers · 20k views

Deep copy of an audio CMSampleBuffer

I am trying to create a copy of the CMSampleBuffer returned by captureOutput in an AVCaptureAudioDataOutputSampleBufferDelegate.

The problem I am having is that frames coming from the delegate method captureOutput:didOutputSampleBuffer:fromConnection: get dropped after I keep them in a CFArray for a while.

Obviously, I need to create a deep copy of the incoming buffers for further processing. I also know that CMSampleBufferCreateCopy only creates a shallow copy.

There are a few related questions on SO:

But none of them helped me figure out how to correctly use the CMSampleBufferCreate function with its 12 parameters:

  CMSampleBufferRef copyBuffer;

  CMBlockBufferRef data = CMSampleBufferGetDataBuffer(sampleBuffer);
  CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
  CMItemCount itemCount = CMSampleBufferGetNumSamples(sampleBuffer);

  CMTime duration = CMSampleBufferGetDuration(sampleBuffer);
  CMTime presentationStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
  CMSampleTimingInfo timingInfo;
  timingInfo.duration = duration;
  timingInfo.presentationTimeStamp = presentationStamp;
  timingInfo.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer);


  size_t sampleSize = CMBlockBufferGetDataLength(data);
  CMBlockBufferRef sampleData;

  if (CMBlockBufferCopyDataBytes(data, 0, sampleSize, &sampleData) != kCMBlockBufferNoErr) {
    VLog(@"error during copying sample buffer");
  }

  // Here I tried data and sampleData CMBlockBuffer …
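For reference, a hedged sketch of how the 12-parameter CMSampleBufferCreate call could be assembled from the variables above. Note that CMBlockBufferCopyDataBytes copies into a raw memory destination rather than creating a new CMBlockBufferRef, so here the bytes are first copied into malloc'd memory and then wrapped:

  // Sketch: deep-copy the audio bytes, wrap them in a new CMBlockBuffer,
  // then rebuild a sample buffer around the copy.
  void *memoryBlock = malloc(sampleSize);
  CMBlockBufferCopyDataBytes(data, 0, sampleSize, memoryBlock);

  CMBlockBufferRef copiedData = NULL;
  // The new block buffer takes ownership of memoryBlock (freed by kCFAllocatorDefault).
  CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, memoryBlock, sampleSize,
                                     kCFAllocatorDefault, NULL, 0, sampleSize,
                                     0, &copiedData);

  size_t sizePerSample = sampleSize / itemCount; // constant for LPCM (an assumption)
  OSStatus status = CMSampleBufferCreate(kCFAllocatorDefault,
                                         copiedData,        // dataBuffer
                                         true,              // dataReady
                                         NULL, NULL,        // makeDataReady callback + refcon
                                         formatDescription, // from the original buffer
                                         itemCount,         // numSamples
                                         1, &timingInfo,    // one timing entry for all samples
                                         1, &sizePerSample, // one constant sample size
                                         &copyBuffer);
  CFRelease(copiedData); // CMSampleBufferCreate retains its own reference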

objective-c avfoundation ios cmsamplebufferref swift

14 votes · 1 answer · 1,691 views

How to convert CMSampleBufferRef to NSData

How can I convert a CMSampleBufferRef to NSData?

I have managed to get the data for an MPMediaItem by following Erik Aigner's answer in this thread, but the data is of type CMSampleBufferRef.

I know CMSampleBufferRef is a struct, defined in the CMSampleBuffer Reference in the iOS Dev Library, but I don't think I fully understand what it is. None of the CMSampleBuffer functions seems to be an obvious solution.
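For what it's worth, a minimal sketch of the usual route, assuming the sample's bytes live in a CMBlockBuffer (as they do for audio samples):

CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
size_t length = CMBlockBufferGetDataLength(blockBuffer);
NSMutableData *data = [NSMutableData dataWithLength:length];
// Copies the (possibly non-contiguous) block buffer contents into flat memory.
CMBlockBufferCopyDataBytes(blockBuffer, 0, length, data.mutableBytes);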

objective-c nsdata ios cmsamplebufferref

11 votes · 1 answer · 10k views

How to get the timestamp of the currently captured camera data from a CMSampleBufferRef in iOS

I have developed an iOS application that saves captured camera data to a file, using

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

to capture the CMSampleBufferRef; this gets encoded to H264 format and the frames are saved to a file with AVAssetWriter.

I followed this sample source code to create the app:

http://www.gdcl.co.uk//2013/02/20/iOS-Video-Encoding.html

Now I want to get the timestamps of the saved video frames in order to create a new movie file;

to do that I did the following things:

1) Located the file and created an AVAssetReader to read the file

CMSampleBufferRef sample = [asset_reader_output copyNextSampleBuffer];

CMSampleBufferRef buffer;
while ([assestReader status] == AVAssetReaderStatusReading) {
    buffer = [asset_reader_output copyNextSampleBuffer];

    CMTime presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(buffer);
    // Note: value * 1000 can overflow UInt32 for long timestamps.
    UInt32 timeStamp = (1000 * presentationTimeStamp.value) / presentationTimeStamp.timescale;

    NSLog(@"timestamp %u", (unsigned int)timeStamp);
    NSLog(@"reading");
    // CFRelease(buffer);
}

Printing the value gives me a wrong timestamp; I need to get the capture time of the frame.

Is there any way to get the timestamp of the frame capture?

I have read the following link for getting the timestamp, but it does not properly elaborate on my question above: How to set the timestamp of a CMSampleBuffer for AVWriter writing

Update

I read the sample timestamp before writing to the file, and it gives me an xxxxx value (33333.23232).

After I tried to read the file, it gave me a different value. Is there any specific reason for this?
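One likely explanation (an assumption, since the writing side isn't shown): AVAssetWriter rebases timestamps against the time passed to startSessionAtSourceTime:, so the values read back from the file start near zero instead of matching the capture clock. A sketch of logging the capture-time PTS in the delegate, for comparison with what AVAssetReader reports later:

// Sketch: log the presentation timestamp at capture time, in milliseconds.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    // CMTimeGetSeconds avoids overflowing integer math on value/timescale.
    Float64 captureMs = CMTimeGetSeconds(pts) * 1000.0;
    NSLog(@"capture PTS: %f ms", captureMs);
    // ... append to the writer as before ...
}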

objective-c avfoundation ios cmsamplebufferref

11 votes · 1 answer · 5,123 views

Getting the desired data from a CVPixelBuffer reference

I have a program that views the camera input in real time and gets the color value of the middle pixel. I use the captureOutput: method to grab the CMSampleBuffer from the AVCaptureSession output (which happens to be read as a CVPixelBuffer), and then use the following code to get the RGB values of the pixel:

// Get a CMSampleBuffer's Core Video image buffer for the media data
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
// Lock the base address of the pixel buffer
CVPixelBufferLockBaseAddress(imageBuffer, 0); 

// Get the number of bytes per row for the pixel buffer
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
// Get the pixel buffer width and height
size_t width = CVPixelBufferGetWidth(imageBuffer); 
size_t height = CVPixelBufferGetHeight(imageBuffer); 
unsigned char* pixel = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);

NSLog(@"Middle pixel: %hhu", pixel[((width*height)*4)/2]);
int red = pixel[(((width*height)*4)/2)+2];
int green = pixel[(((width*height)*4)/2)+1];
int …
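As an aside, the index above picks the middle byte of the buffer rather than the middle pixel, and it ignores bytesPerRow, which can include row padding. A hedged sketch of addressing the true middle pixel (assuming a BGRA pixel format), reusing the variables from the code above:

// Sketch: address the middle pixel via its row and column, respecting row padding.
size_t middleRowOffset = (height / 2) * bytesPerRow;
size_t middleColOffset = (width / 2) * 4; // 4 bytes per BGRA pixel
unsigned char *middlePixel = pixel + middleRowOffset + middleColOffset;
int blue  = middlePixel[0];
int green = middlePixel[1];
int red   = middlePixel[2];
NSLog(@"Middle pixel BGR: %d %d %d", blue, green, red);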

iphone ios avcapturesession cmsamplebufferref cmsamplebuffer

10 votes · 1 answer · 6,706 views

Error converting AudioBufferList to CMBlockBufferRef

I am trying to use AVAssetReader to read a video file and pass the audio to Core Audio for processing (adding effects and such), and then save it back to disk using AVAssetWriter. I'd like to point out that if I set the componentSubType of the output node's AudioComponentDescription to RemoteIO, playback through the speakers works correctly. This makes me confident that my AUGraph is set up properly, since I can hear it working. I set the subType to GenericOutput instead, so I can do the rendering myself and get back the processed audio.

I am reading in the audio and passing the CMSampleBufferRef to copyBuffer. This puts the audio into a circular buffer that will be read later.

- (void)copyBuffer:(CMSampleBufferRef)buf {  
    if (_readyForMoreBytes == NO)  
    {  
        return;  
    }  

    AudioBufferList abl;  
    CMBlockBufferRef blockBuffer;  
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(buf, NULL, &abl, sizeof(abl), NULL, NULL, kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &blockBuffer);  

    UInt32 size = (unsigned int)CMSampleBufferGetTotalSampleSize(buf);  
    BOOL bytesCopied = TPCircularBufferProduceBytes(&circularBuffer, abl.mBuffers[0].mData, size);  

    if (!bytesCopied){  
        // circular buffer is full – stop accepting bytes and stash this frame
        _readyForMoreBytes = NO;  

        if (size > kRescueBufferSize){  
            NSLog(@"Unable to allocate enought space for rescue buffer, dropping audio frame");  
        } else {  
            if (rescueBuffer == nil) {  
                rescueBuffer = malloc(kRescueBufferSize);  
            }  

            rescueBufferSize = size;  
            memcpy(rescueBuffer, abl.mBuffers[0].mData, size);  
        }  
    }  

    CFRelease(blockBuffer);  
    if (!self.hasBuffer && …
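Not shown in the excerpt is the consuming side. A sketch of what a render callback feeding the GenericOutput graph from the same TPCircularBuffer might look like (the callback name and wiring are assumptions, not the asker's code; it assumes the TPCircularBuffer library header is available):

// Sketch: render callback that feeds the AUGraph from the circular buffer.
static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    TPCircularBuffer *circularBuffer = (TPCircularBuffer *)inRefCon;
    int32_t availableBytes = 0;
    void *tail = TPCircularBufferTail(circularBuffer, &availableBytes);

    UInt32 bytesNeeded = ioData->mBuffers[0].mDataByteSize;
    if ((UInt32)availableBytes >= bytesNeeded) {
        memcpy(ioData->mBuffers[0].mData, tail, bytesNeeded);
        TPCircularBufferConsume(circularBuffer, bytesNeeded);
    } else {
        // Underrun: output silence rather than stale data.
        memset(ioData->mBuffers[0].mData, 0, bytesNeeded);
        *ioActionFlags |= kAudioUnitRenderAction_OutputIsSilence;
    }
    return noErr;
}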

core-audio ios cmsamplebufferref audiobufferlist

10 votes · 1 answer · 828 views

Using AVAssetWriter with raw NAL units

I noticed in the iOS documentation for AVAssetWriterInput that you can pass nil for the outputSettings dictionary to specify that the input data should not be re-encoded.

The settings used for encoding the media appended to the output. Pass nil to specify that appended samples should not be re-encoded.

I want to take advantage of this feature to pass in a stream of raw H.264 NALs, but I am having trouble adapting the raw byte stream into a CMSampleBuffer that I can pass into AVAssetWriterInput's appendSampleBuffer method. My NAL stream contains only SPS/PPS/IDR/P NALs (types 1, 5, 7, 8). I haven't been able to find documentation or a conclusive answer on how to use pre-encoded H264 data with AVAssetWriter. The resulting video file does not play.

How can I properly package the NAL units into CMSampleBuffers? Do I need to use a start code prefix? A length prefix? Do I need to ensure I only put one NAL per CMSampleBuffer? My end goal is to create an MP4 or MOV container with H264/AAC.

Here's the code I've been playing with:

-(void)addH264NAL:(NSData *)nal
{
    dispatch_async(recordingQueue, ^{
        //Adapting the raw NAL into a CMSampleBuffer
        CMSampleBufferRef sampleBuffer = NULL;
        CMBlockBufferRef blockBuffer = NULL;
        CMFormatDescriptionRef formatDescription = NULL;
        CMItemCount numberOfSampleTimeEntries = 1;
        CMItemCount numberOfSamples = 1;


        CMVideoFormatDescriptionCreate(kCFAllocatorDefault, kCMVideoCodecType_H264, 480, 360, nil, &formatDescription);
        OSStatus result = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, [nal length], kCFAllocatorDefault, NULL, 0, [nal length], kCMBlockBufferAssureMemoryNowFlag, &blockBuffer);
        if(result != noErr)
        {
            NSLog(@"Error creating CMBlockBuffer"); …
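For context, the usual requirements when muxing raw NALs into MP4/MOV (hedged, since the accepted approach isn't shown here): the samples must be in AVCC format, meaning each NAL carries a 4-byte big-endian length prefix instead of a start code, SPS/PPS go into the format description rather than into the stream, and on iOS 7+ CMVideoFormatDescriptionCreateFromH264ParameterSets builds that description directly. A sketch, where spsBytes/ppsBytes and their lengths are assumed inputs:

// Sketch: build an H.264 format description from raw SPS/PPS NALs (no start codes),
// then length-prefix each VCL NAL before wrapping it in a CMBlockBuffer.
CMFormatDescriptionRef formatDescription = NULL;
const uint8_t *parameterSets[2] = { spsBytes, ppsBytes };     // assumed inputs
const size_t parameterSetSizes[2] = { spsLength, ppsLength }; // assumed inputs
CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault,
                                                    2,  // SPS + PPS
                                                    parameterSets,
                                                    parameterSetSizes,
                                                    4,  // AVCC length-prefix size
                                                    &formatDescription);

// Convert one NAL payload (start code already stripped) to AVCC framing.
uint32_t lengthPrefix = CFSwapInt32HostToBig((uint32_t)[nal length]);
NSMutableData *avccNAL = [NSMutableData dataWithBytes:&lengthPrefix length:4];
[avccNAL appendData:nal];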

iphone h.264 ios avassetwriter cmsamplebufferref

8 votes · 1 answer · 2,671 views

How to set the timestamp of a CMSampleBuffer for AVWriter writing

I am using AVFoundation to capture and record audio. There are some issues I don't quite understand.

Basically I want to capture audio from AVCaptureSession and write it using AVWriter, but I need some shifting in the timestamp of the CMSampleBuffer I get from AVCaptureSession. I read the documentation for CMSampleBuffer and I see two different terms for timestamps: 'presentation timestamp' and 'output presentation timestamp'. What is the difference between the two?

Let's say I get a CMSampleBuffer (for audio) instance from AVCaptureSession and I want to write it to a file using AVWriter. What function should I use to 'inject' a CMTime into the buffer in order to set its presentation timestamp in the resulting file?

Thanks.
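A plausible route (an assumption, not confirmed by the question): rather than mutating the buffer in place, make a copy with new timing via CMSampleBufferCreateCopyWithNewTiming, which is the usual way a CMTime gets "injected". Here `offset` is an assumed CMTime shift:

// Sketch: shift a sample buffer's presentation timestamp by `offset` (assumed CMTime).
CMSampleTimingInfo timing;
CMSampleBufferGetSampleTimingInfo(sampleBuffer, 0, &timing);
timing.presentationTimeStamp = CMTimeAdd(timing.presentationTimeStamp, offset);

CMSampleBufferRef adjusted = NULL;
CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer,
                                      1, &timing, &adjusted);
// Append `adjusted` to the AVAssetWriterInput, then CFRelease(adjusted).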

audio-recording avfoundation core-media ios cmsamplebufferref

7 votes · 2 answers · 7,422 views

CMSampleBufferRef kCMSampleBufferAttachmentKey_TrimDurationAtStart crash

This has been bugging me for a while. I have a video converter that converts videos to '.mp4' format, but a crash occurs with some videos, though not all.

Here is the crash log:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriterInput appendSampleBuffer:] 
Cannot append sample buffer: First input buffer must have an appropriate kCMSampleBufferAttachmentKey_TrimDurationAtStart since the codec has encoder delay'

Here is my code:

NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[self getVideoName]] stringByAppendingString:@".mp4"]];

AVAssetTrack *videoTrack = [[self.avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize videoSize = videoTrack.naturalSize;
NSDictionary *videoWriterCompressionSettings =  [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:1250000], AVVideoAverageBitRateKey, nil];
NSDictionary *videoWriterSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, videoWriterCompressionSettings, AVVideoCompressionPropertiesKey, [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey, [NSNumber numberWithFloat:videoSize.height], AVVideoHeightKey, nil];
AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoWriterSettings]; …
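The error message points at a known quirk with AAC audio: the encoder emits priming frames at the start, so the first audio buffer appended must carry a kCMSampleBufferAttachmentKey_TrimDurationAtStart attachment. A hedged sketch of attaching one (the 1024-frame priming value is an assumption; the actual value depends on the codec):

// Sketch: attach a trim duration to the first audio sample buffer so the
// writer knows to drop the encoder's priming frames.
CMTime trimDuration = CMTimeMake(1024, 44100); // assumed priming length
CFDictionaryRef trimDict = CMTimeCopyAsDictionary(trimDuration, kCFAllocatorDefault);
CMSetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                trimDict, kCMAttachmentMode_ShouldPropagate);
CFRelease(trimDict);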

objective-c ios cmsamplebufferref

6 votes · 1 answer · 1,941 views

How to get the timestamp of each video frame in iOS while decoding a video.mp4

Scenario:
I am writing an iOS application that tries to decode a videoFile.mp4. I use AVAssetReaderTrackOutput with AVAssetReader to decode frames from the video file. This works very well. I basically use the following core logic to get each frame of videoFile.mp4.

Code:

AVAssetReader * videoFileReader;
AVAssetReaderTrackOutput * assetReaderOutput = [videoFileReader.outputs objectAtIndex:0];
CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer];

Here sampleBuffer is the buffer for each video frame.

Question:

  • How do I get the timestamp of each video frame here?
  • In other words, in more detail, how do I get the timestamp of each sampleBuffer returned by copyNextSampleBuffer?

PS: Please note that I need the timestamps in milliseconds.
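A short sketch of the usual way to read the per-frame presentation timestamp in milliseconds, reusing the assetReaderOutput from the code above; CMTimeGetSeconds avoids the integer-overflow pitfalls of manual value/timescale math:

// Sketch: per-frame presentation timestamp, in milliseconds.
CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer];
if (sampleBuffer) {
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    Float64 milliseconds = CMTimeGetSeconds(pts) * 1000.0;
    NSLog(@"frame timestamp: %.3f ms", milliseconds);
    CFRelease(sampleBuffer);
}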

objective-c decoder ios avassetreader cmsamplebufferref

6 votes · 1 answer · 2,037 views