Looping a video with AVFoundation's AVSampleBufferDisplayLayer

Asked by dan*_*dev · 5 votes · Tags: video, objective-c, avfoundation, ios, avplayer

I'm trying to play a video in a loop on an AVSampleBufferDisplayLayer. I can get it to play through once with no problem, but when I try to loop it, it doesn't keep playing.

According to the answer to a question about rewinding video with AVFoundation, there is no way to rewind an AVAssetReader, so I re-create it instead. (I did see the answer to "Looping a video with AVFoundation AVPlayer?", but AVPlayer is more full-featured than I need. I am reading from a file, but I still want an AVSampleBufferDisplayLayer.)

One hypothesis is that I need to strip out some H.264 headers, but I don't know whether that would help (or how to do it). The other is that it has something to do with the CMTimebase, but I've tried several things there with no luck.
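On the CMTimebase hypothesis: a commonly suggested approach for this symptom (not verified against this exact code) is that the layer's timebase keeps advancing across loops, while each re-created AVAssetReader hands back buffers whose timestamps restart at zero, so the layer considers them already past due and drops them. A hedged sketch of re-stamping each buffer before enqueueing it, using the Core Media copy-with-new-timing API; the helper name `createRetimedSampleBuffer` and the `loopOffset` parameter (number of completed loops times the asset duration) are my own:

```objc
#import <CoreMedia/CoreMedia.h>

// Sketch: return a copy of `sample` with all timestamps shifted by `loopOffset`,
// so timestamps keep increasing monotonically across loop iterations.
// Caller is responsible for releasing the returned buffer.
static CMSampleBufferRef createRetimedSampleBuffer(CMSampleBufferRef sample,
                                                   CMTime loopOffset) {
    // First call with 0/NULL asks how many timing entries the buffer has.
    CMItemCount count = 0;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);

    CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, timing, &count);

    for (CMItemCount i = 0; i < count; i++) {
        timing[i].presentationTimeStamp =
            CMTimeAdd(timing[i].presentationTimeStamp, loopOffset);
        // Decode timestamps can be kCMTimeInvalid; only shift valid ones.
        if (CMTIME_IS_VALID(timing[i].decodeTimeStamp)) {
            timing[i].decodeTimeStamp =
                CMTimeAdd(timing[i].decodeTimeStamp, loopOffset);
        }
    }

    CMSampleBufferRef retimed = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample,
                                          count, timing, &retimed);
    free(timing);
    return retimed;
}
```

With something like this, the enqueue loop would pass the retimed copy to `enqueueSampleBuffer:` instead of the original, and add the asset's duration to the offset each time the reader is re-created.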

The following code is based on Apple's WWDC talk on direct access to video encoding:

- (void)viewDidLoad {
    [super viewDidLoad];

    NSString *filepath = [[NSBundle mainBundle] pathForResource:@"sample-mp4" ofType:@"mp4"];
    NSURL *fileURL = [NSURL fileURLWithPath:filepath];
    AVAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];

    UIView *view = self.view;

    self.videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    self.videoLayer.bounds = view.bounds;
    self.videoLayer.position = CGPointMake(CGRectGetMidX(view.bounds), CGRectGetMidY(view.bounds));
    self.videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    self.videoLayer.backgroundColor = [[UIColor greenColor] CGColor];

    CMTimebaseRef controlTimebase;
    CMTimebaseCreateWithMasterClock( CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase );

    self.videoLayer.controlTimebase = controlTimebase;
    CMTimebaseSetTime(self.videoLayer.controlTimebase, CMTimeMake(5, 1));
    CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);

    [[view layer] addSublayer:_videoLayer];

    dispatch_queue_t assetQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0); //??? right queue?


    __block AVAssetReader *assetReaderVideo = [self createAssetReader:asset];
    __block AVAssetReaderTrackOutput *outVideo = [assetReaderVideo outputs][0];
    if( [assetReaderVideo startReading] )
    {
        [_videoLayer requestMediaDataWhenReadyOnQueue: assetQueue usingBlock: ^{
            while( [_videoLayer isReadyForMoreMediaData] )
            {
                CMSampleBufferRef sampleVideo;
                if ( ([assetReaderVideo status] == AVAssetReaderStatusReading) && ( sampleVideo = [outVideo copyNextSampleBuffer]) ) {
                    [_videoLayer enqueueSampleBuffer:sampleVideo];
                    CFRelease(sampleVideo);
                    CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                }
                else {

                    [_videoLayer stopRequestingMediaData];
                    //CMTimebaseSetTime(_videoLayer.controlTimebase, CMTimeMake(5, 1));
                    //CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);
                    //CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                    assetReaderVideo = [self createAssetReader:asset];
                    outVideo = [assetReaderVideo outputs][0];
                    [assetReaderVideo startReading];
                    //sampleVideo = [outVideo copyNextSampleBuffer];

                    //[_videoLayer enqueueSampleBuffer:sampleVideo];
                }
            }
        }];
    }
}

-(AVAssetReader *)createAssetReader:(AVAsset*)asset {
    NSError *error=nil;

    AVAssetReader *assetReaderVideo = [[AVAssetReader alloc] initWithAsset:asset error:&error];

    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetReaderTrackOutput *outVideo = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTracks[0] outputSettings:nil]; //dic];

    [assetReaderVideo addOutput:outVideo];
    return assetReaderVideo;
}

Thanks very much.

Answered by nac*_*n · 0 votes

Try doing the loop in Swift, then bridge the Objective-C file with the Swift file. There are plenty of answers about bridging and about looping, so just google them with Swift.
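For what it's worth, a hypothetical Swift port of the question's reader-recreation approach might look like the sketch below. The class name `SampleBufferLooper` and its structure are illustrative, not from either post; it reproduces the same logic (and therefore the same looping problem, unless buffers are also retimed):

```swift
import AVFoundation

/// Illustrative Swift version of the Objective-C reader-recreation loop.
final class SampleBufferLooper {
    private let asset: AVAsset
    private let layer: AVSampleBufferDisplayLayer
    private let queue = DispatchQueue(label: "sample-buffer-looper")
    private var reader: AVAssetReader!
    private var output: AVAssetReaderTrackOutput!

    init(asset: AVAsset, layer: AVSampleBufferDisplayLayer) {
        self.asset = asset
        self.layer = layer
    }

    func start() throws {
        try restartReader()
        layer.requestMediaDataWhenReady(on: queue) { [weak self] in
            guard let self = self else { return }
            while self.layer.isReadyForMoreMediaData {
                if self.reader.status == .reading,
                   let sample = self.output.copyNextSampleBuffer() {
                    self.layer.enqueue(sample)
                } else {
                    // End of file: rebuild the reader, as in the
                    // Objective-C version above.
                    try? self.restartReader()
                }
            }
        }
    }

    private func restartReader() throws {
        reader = try AVAssetReader(asset: asset)
        let track = asset.tracks(withMediaType: .video)[0]
        output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
        reader.add(output)
        reader.startReading()
    }
}
```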