AVFoundation - Reversing an AVAsset and outputting a video file

And*_*Hin 24 macos objective-c avfoundation ios

I've seen this question come up a few times, but none of them seem to have a proper answer.

The requirement is to reverse a video and output it to a file (not just play it in reverse), keeping the same compression, format, and frame rate as the source video.

Ideally, the solution would do everything in memory or in a buffer, avoiding the route of generating the frames as image files (e.g., with AVAssetImageGenerator) and then recompiling them (resource-intensive, unreliable timing results, frame/image quality degraded from the original, etc.).

-

My contribution: this still doesn't work, but it's the best I've tried so far:

  • Read the sample frames into a CMSampleBufferRef[] array using AVAssetReader.
  • Write them back out in reverse order using AVAssetWriter.
  • Problem: the timing information for each frame appears to be stored inside the CMSampleBufferRef, so even appending them in reverse order doesn't work.
  • Next, I tried swapping each frame's timing information with that of its reverse/mirror frame.
  • Problem: this causes AVAssetWriter to fail with an unknown error.
  • Next step: I'm going to investigate AVAssetWriterInputPixelBufferAdaptor

    - (AVAsset *)assetByReversingAsset:(AVAsset *)asset {
        // Use a file URL; URLWithString: would not produce a valid file URL here.
        NSURL *tmpFileURL = [NSURL fileURLWithPath:@"/tmp/test.mp4"];
        NSError *error;
    
        // initialize the AVAssetReader that will read the input asset track
        AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];
    
        AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
        [reader addOutput:readerOutput];
        [reader startReading];
    
        // Read in the samples into an array
        NSMutableArray *samples = [[NSMutableArray alloc] init];
    
        while(1) {
            CMSampleBufferRef sample = [readerOutput copyNextSampleBuffer];
    
            if (sample == NULL) {
                break;
            }
    
            [samples addObject:(__bridge id)sample];
            CFRelease(sample);
        }
    
        // initialize the writer that will save to our temporary file.
        CMFormatDescriptionRef formatDescription = CFBridgingRetain([videoTrack.formatDescriptions lastObject]);
        AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:nil sourceFormatHint:formatDescription];
        CFRelease(formatDescription);
    
        AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:tmpFileURL
                                                          fileType:AVFileTypeMPEG4
                                                             error:&error];
        [writerInput setExpectsMediaDataInRealTime:NO];
        [writer addInput:writerInput];
        // startWriting must be called before startSessionAtSourceTime:
        [writer startWriting];
        [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[0])];
    
    
        // Traverse the sample frames in reverse order
        for(NSInteger i = samples.count-1; i >= 0; i--) {
            CMSampleBufferRef sample = (__bridge CMSampleBufferRef)samples[i];
    
            // Since the timing information is built into the CMSampleBufferRef 
            // We will need to make a copy of it with new timing info. Will copy
            // the timing data from the mirror frame at samples[samples.count - i -1]
    
            CMItemCount numSampleTimingEntries;
            CMSampleBufferGetSampleTimingInfoArray((__bridge CMSampleBufferRef)samples[samples.count - i -1], 0, nil, &numSampleTimingEntries);
            CMSampleTimingInfo *timingInfo = malloc(sizeof(CMSampleTimingInfo) * numSampleTimingEntries);
            // Copy the timing from the mirror frame (not from the sample itself).
            CMSampleBufferGetSampleTimingInfoArray((__bridge CMSampleBufferRef)samples[samples.count - i - 1], numSampleTimingEntries, timingInfo, &numSampleTimingEntries);
    
            CMSampleBufferRef sampleWithCorrectTiming;
            CMSampleBufferCreateCopyWithNewTiming(
                                                  kCFAllocatorDefault,
                                                  sample,
                                                  numSampleTimingEntries,
                                                  timingInfo,
                                                  &sampleWithCorrectTiming);
    
            // Block until the writer input can accept more data, so no frames are dropped.
            while (!writerInput.readyForMoreMediaData) {
                [NSThread sleepForTimeInterval:0.05];
            }
            [writerInput appendSampleBuffer:sampleWithCorrectTiming];
    
            CFRelease(sampleWithCorrectTiming);
            free(timingInfo);
        }
    
        [writerInput markAsFinished];
        [writer finishWriting];
    
        return [AVAsset assetWithURL:tmpFileURL];
    }
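
For reference, a hypothetical call site for the method above (the input path and the surrounding class are placeholders):

    AVAsset *source = [AVAsset assetWithURL:[NSURL fileURLWithPath:@"/path/to/source.mp4"]];
    AVAsset *reversed = [self assetByReversingAsset:source];
    // The returned asset is backed by /tmp/test.mp4 and can be played or exported.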
    

And*_*Hin 16

I worked on this over the last few days and was able to get it working.

Source code is here: http://www.andyhin.com/post/5/reverse-video-avfoundation

It uses AVAssetReader to read out the samples/frames, extracts the image/pixel buffer from each, and then appends it with the presentation time of its mirror frame.
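
For illustration, here is a minimal sketch of that approach (this is not the code from the linked post). It decodes every frame to a CVPixelBufferRef, records the presentation timestamps, then appends the frames in reverse order while reusing the original forward timestamps, so the last frame gets the first timestamp. The pixel format, the H.264 output settings, the in-memory buffering, and the `reverseAsset` function name are all simplifying assumptions:

    #import <AVFoundation/AVFoundation.h>

    // Sketch: reverse the video track of `asset` into the file at `outputURL`.
    // Assumes the whole track fits in memory; a production version should
    // process the asset in chunks of samples instead.
    static void reverseAsset(AVAsset *asset, NSURL *outputURL) {
        NSError *error = nil;
        AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];

        // Decode to raw pixel buffers so frames can be re-timed and re-encoded.
        AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
        AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput
            assetReaderTrackOutputWithTrack:track
                             outputSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey:
                                               @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) }];
        [reader addOutput:output];
        [reader startReading];

        // Collect every frame's pixel buffer and presentation timestamp.
        NSMutableArray *buffers = [NSMutableArray array]; // CVPixelBufferRef (retained by the array)
        NSMutableArray *times = [NSMutableArray array];   // CMTime, boxed in NSValue
        CMSampleBufferRef sample;
        while ((sample = [output copyNextSampleBuffer]) != NULL) {
            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sample);
            if (pixelBuffer != NULL) {
                [buffers addObject:(__bridge id)pixelBuffer];
                [times addObject:[NSValue valueWithCMTime:
                                  CMSampleBufferGetPresentationTimeStamp(sample)]];
            }
            CFRelease(sample);
        }

        // Re-encode through a pixel buffer adaptor.
        AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                                          fileType:AVFileTypeMPEG4
                                                             error:&error];
        AVAssetWriterInput *input = [AVAssetWriterInput
            assetWriterInputWithMediaType:AVMediaTypeVideo
                           outputSettings:@{ AVVideoCodecKey: AVVideoCodecH264,
                                             AVVideoWidthKey: @(track.naturalSize.width),
                                             AVVideoHeightKey: @(track.naturalSize.height) }];
        input.expectsMediaDataInRealTime = NO;
        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                       sourcePixelBufferAttributes:nil];
        [writer addInput:input];
        [writer startWriting];
        [writer startSessionAtSourceTime:[(NSValue *)times.firstObject CMTimeValue]];

        // Append the last frame first, the second-to-last next, and so on,
        // pairing buffers[count - 1 - i] with the i-th (mirror) timestamp.
        for (NSUInteger i = 0; i < buffers.count; i++) {
            while (!input.readyForMoreMediaData) {
                [NSThread sleepForTimeInterval:0.05]; // simplistic back-pressure; see the comments below
            }
            CVPixelBufferRef pixelBuffer = (__bridge CVPixelBufferRef)buffers[buffers.count - 1 - i];
            [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:[(NSValue *)times[i] CMTimeValue]];
        }
        [input markAsFinished];
        [writer finishWriting];
    }

Note that decoding and re-encoding means the output is not bit-identical to the source compression; matching the original settings would require deriving them from the source track.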

  • The URL appears to have changed to http://www.andyhin.com/post/5/reverse-video-avfoundation. Link-only answers are discouraged for exactly this reason... The GitHub link to the source, which is probably less likely to change, is https://github.com/whydna/ReverseAVAsset (2 upvotes)
  • Swift version: https://github.com/tempire/ReverseAVAsset/blob/master/AVAsset.swift (2 upvotes)
  • Sleeping on an NSThread is not a good solution; you should use `requestMediaDataWhenReady(on:using:)` to hand your frames to your `AVAssetWriterInput` (see the sketch after these comments). See: https://developer.apple.com/reference/avfoundation/avassetwriterinput/1387508-requestmediadatawhenreadyonqueue (2 upvotes)
  • Any ideas on how to reverse the audio? (2 upvotes)
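
To illustrate that last point about back-pressure: a minimal sketch of the pull model, which replaces polling/sleep loops by letting AVFoundation call back whenever the input can accept data. The `appendReversedFrames` function name and the `nextFrame` callback are hypothetical stand-ins for "fetch the next reversed pixel buffer and its mirror timestamp"; the rest is the standard AVAssetWriterInput API:

    #import <AVFoundation/AVFoundation.h>

    // Sketch: drive the writer input with requestMediaDataWhenReadyOnQueue:usingBlock:
    // instead of sleeping. `nextFrame` is a hypothetical callback that returns the
    // next reversed pixel buffer and its timestamp, or NULL when no frames remain.
    static void appendReversedFrames(AVAssetWriter *writer,
                                     AVAssetWriterInput *input,
                                     AVAssetWriterInputPixelBufferAdaptor *adaptor,
                                     CVPixelBufferRef (^nextFrame)(CMTime *outTime)) {
        dispatch_queue_t queue = dispatch_queue_create("reverse.writer.queue", NULL);
        [input requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
            // AVFoundation invokes this block whenever the input can take more
            // data; keep appending until it pushes back or the frames run out.
            while (input.readyForMoreMediaData) {
                CMTime time;
                CVPixelBufferRef frame = nextFrame(&time);
                if (frame == NULL) {
                    [input markAsFinished];
                    [writer finishWritingWithCompletionHandler:^{ /* writing complete */ }];
                    break;
                }
                [adaptor appendPixelBuffer:frame withPresentationTime:time];
            }
        }];
    }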