Tags: video · objective-c · avfoundation · ios · avassetexportsession
I am trying to export an AVMutableComposition using AVAssetExportSession:
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = mainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;

[exporter exportAsynchronouslyWithCompletionHandler:^
{
    switch (exporter.status)
    {
        case AVAssetExportSessionStatusCompleted:
        {
            NSLog(@"Video Merge Successful");
        }
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Failed: %@", exporter.error.description);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Cancelled: %@", exporter.error);
            break;
        case AVAssetExportSessionStatusExporting:
            NSLog(@"Exporting!");
            break;
        case AVAssetExportSessionStatusWaiting:
            NSLog(@"Waiting");
            break;
        default:
            break;
    }
}];
But the export takes around 30 seconds even for a 1-minute video, while the iPad's built-in Camera app needs less than 2 seconds for the same thing.

Moreover, if I remove the videoComposition from the exporter, the time drops to 7 seconds, which is still poor considering the video is only 1 minute long. So, how can I reduce the export time to a minimum?

Also, I would like to know whether AVAssetExportSession generally takes this long, or whether it is just my case?

Update: merging code:
AVMutableComposition *mutableComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *videoCompositionTrack =
    [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack =
    [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableVideoCompositionLayerInstruction *videoTrackLayerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

NSMutableArray *instructions = [NSMutableArray new];
CGSize size = CGSizeZero;
CMTime time = kCMTimeZero;

for (AVURLAsset *asset in assets)
{
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

    NSError *error;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                   ofTrack:assetTrack
                                    atTime:time
                                     error:&error];
    [videoTrackLayerInstruction setTransform:assetTrack.preferredTransform atTime:time];
    if (error) {
        NSLog(@"asset url :: %@", assetTrack.asset);
        NSLog(@"Error1 - %@", error.debugDescription);
    }

    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAssetTrack.timeRange.duration)
                                   ofTrack:audioAssetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"Error2 - %@", error.debugDescription);
    }

    time = CMTimeAdd(time, assetTrack.timeRange.duration);

    if (CGSizeEqualToSize(size, CGSizeZero)) {
        size = assetTrack.naturalSize;
    }
}

AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);
mainInstruction.layerInstructions = [NSArray arrayWithObject:videoTrackLayerInstruction];

AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = size;
I built an application that merges different video segments together, so I can say with some confidence that this is specific to your setup. My video files are around 10 MB, so they may be a bit smaller than yours, but even with 10 or 20 segments it takes well under a second to merge them all together.
Now, as for what is actually going on: I compared my configuration with yours, and the difference is the following:
I use export.outputFileType = AVFileTypeMPEG4. Other than that it should be the same; I cannot really compare further, since that would require the code showing how you actually create the composition. Still, there are a couple of things worth checking, starting with the output file type.
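As a point of comparison, here is a minimal sketch of that one difference applied to your exporter, assuming the rest of your setup stays the same (the output path below is a placeholder):

AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                     presetName:AVAssetExportPresetHighestQuality];
// Placeholder output path; with an MPEG-4 container the file usually gets an .mp4 extension.
exporter.outputURL = [NSURL fileURLWithPath:@"/tmp/merged.mp4"];
// MPEG-4 container instead of AVFileTypeQuickTimeMovie.
exporter.outputFileType = AVFileTypeMPEG4;
exporter.shouldOptimizeForNetworkUse = YES;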
The other thing to check is AVURLAssetPreferPreciseDurationAndTimingKey: if you pass it when creating the AVURLAsset and the source does not have enough keyframes, it can actually take quite a long time to seek for keyframes, which slows everything down.
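If that option turns out to be the culprit, a minimal sketch of creating the assets without it could look like this (videoFileURL is a placeholder for wherever your clips come from); as far as I know the key defaults to NO, so only pass YES when you really need frame-accurate duration and timing:

// Precise duration/timing can force expensive keyframe scanning on some sources,
// so only request it when the extra accuracy is actually needed.
NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @NO };
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoFileURL options:options];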
If you provide more information I should be able to help you further, but maybe some of this will already do the trick. Give it a try and report back. Hope it helps at least a little!
Edit 1: I forgot to mention that if you run out of options, you should try the FFmpeg library, since it is very performant, although it may not be suitable for you for licensing reasons.