AVFoundation - re-timing CMSampleBufferRef video output

Dar*_*ely 11 iphone cocoa cocoa-touch avfoundation

This is my first time asking a question here. I hope the post is clear and the sample code is formatted properly.

I'm experimenting with AVFoundation and time-lapse photography.

My intent is to grab every Nth frame from an iOS device's camera (my iPod touch, version 4) and write each of those frames out to a file to build a time-lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter, and AVAssetWriterInput.

The problem is that if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection: as-is, each frame plays back for the length of time between the original input frames, i.e. a frame rate of 1 fps. I'm looking to get 30 fps.

I've tried using CMSampleBufferCreateCopyWithNewTiming(), but then, after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops getting called. The interface stays active and I can tap a button to stop the capture and save it to the photo library for playback. It does appear to play back the way I want, at 30 fps, but it only has those 13 frames.

How can I accomplish my goal of 30 fps playback? How can I tell where and why the app is getting lost?

I've put a flag in place called useNativeTime so I can test both cases. When it's set to YES, I get all the frames I'm interested in because the callback never "gets lost". When I set that flag to NO, only 13 frames are ever processed and the method is never called again. As mentioned above, in both cases I can play back the video.

Thanks for any help.

Here is where I'm trying to do the re-timing.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;

    //NSLog(@"in captureOutpput sample buffer method");
    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }

    if (! [inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write every first frame of n frames (30 native from camera). 
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }

        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake( 0 * 20 ,600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime: imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }

        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20,600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20,600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake( (writtenFrames + 0) * 20,600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;

            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer) );
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);
            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus);
            if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");

            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer);  // How is this different; CMSampleBufferSetDataReady ?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus);

            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe action. WTF does it do? Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); // - Not surprisingly - “EXC_BAD_ACCESS”
        }

        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }

    //[self displayOuptutWritterStatus];    // Expect and see AVAssetWriterStatusWriting.
}

My setup routine.

- (IBAction) recordingStartStop: (id) sender
{
    NSError * error;

    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal];

        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);

        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]];

        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;
        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError: contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"];
        intervalFrames = 30;

        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue: value forKey: key];
        [videoOutput setVideoSettings: cameraVideoSettings];
        [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames: YES];

        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate: self queue: queue];
        dispatch_release(queue);

        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
        [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming
        [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey];

        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey];
        //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey];
        [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];

        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;

        outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
        [outputWriter retain];

        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:");
        if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];
        else NSLog(@"can not add input");

        if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");

        if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");

        //[self.captureSession startRunning];
        self.isRecording = YES;
        [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal];

        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime: imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog (@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");

    [self displayOuptutWritterStatus];  
}

Dar*_*ely 10

OK, I found the mistake in my first post.

When you use

myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                 sampleBuffer,
                                                 1,
                                                 &sampleTimingInfo, 
                                                 &newSampleBuffer);

you need to balance it with a CFRelease(newSampleBuffer);

The same idea applies when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. You would use CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.
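A minimal sketch of what the corrected branch in the callback above looks like (same variable names as in my question; the only new piece is the CFRelease at the end):

CMSampleBufferRef newSampleBuffer = NULL;
OSStatus myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                          sampleBuffer,
                                                          1,
                                                          &sampleTimingInfo,
                                                          &newSampleBuffer);
if (myStatus == 0 && newSampleBuffer != NULL) {
    appendSuccessFlag = [inputWriterBuffer appendSampleBuffer: newSampleBuffer];
    CFRelease(newSampleBuffer); // Balances the Create above; without it the delegate eventually stops firing.
}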

Hope this helps someone else.


Dar*_*ely 3

With a bit more searching and reading I have a working solution. Don't know that it's the best way, but so far, so good.

In my setup area I've set up an AVAssetWriterInputPixelBufferAdaptor. The code added looks like this.

inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput: inputWriterBuffer
            sourcePixelBufferAttributes: nil];
[inputWriterBufferAdaptor retain];

For complete understanding of the code below, I also have these three lines in the setup method.

fpsOutput = 30; //Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97;
cmTimeSecondsDenominatorTimescale = 600 * 100000; //To more precisely handle 29.97.
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
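As a quick sanity check of that math (the numbers below are my own illustration, not part of the setup code): with an output of 29.97 fps the numerator works out to roughly 2,002,002, so frame n lands at n/29.97 seconds, which a plain timescale of 600 cannot express exactly.

// Hypothetical stand-ins for the ivars above, just to show the arithmetic.
double  fpsOutput = 29.97;
int32_t cmTimeSecondsDenominatorTimescale = 600 * 100000;                      // 60,000,000
int64_t cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;  // ~2,002,002
CMTime  fifthFrame = CMTimeMake(5 * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
CMTimeShow(fifthFrame); // prints roughly {10010010/60000000 = 0.167}, i.e. 5 / 29.97 seconds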

Instead of applying a re-timing to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Note the withPresentationTime argument for the adaptor. By passing my custom value to it, I get the correct timing I'm looking for.

CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer( sampleBuffer );
imageSourceTime = CMTimeMake( writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime];

There might be some gains from using the AVAssetWriterInputPixelBufferAdaptor.pixelBufferPool property, but I haven't figured that out yet.
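For what it's worth, here is a rough sketch of what a pool-based path might look like. I haven't verified this myself: it assumes the writer has already started (the pool is nil before that), and the step that fills the buffer with the frame's pixels is only hinted at.

CVPixelBufferRef poolBuffer = NULL;
CVReturn cvStatus = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                       inputWriterBufferAdaptor.pixelBufferPool,
                                                       &poolBuffer);
if (cvStatus == kCVReturnSuccess) {
    // ... copy or render the current frame's pixels into poolBuffer here ...
    imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
    appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: poolBuffer withPresentationTime: imageSourceTime];
    CVPixelBufferRelease(poolBuffer); // Balances the pool's Create call, as noted in my earlier answer.
}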