This question has been asked several times before, but none of the answers helped. I am merging multiple videos using AVMutableComposition. After merging, 30–40% of the resulting videos contain blank frames; the rest merge fine. I am playing the composition directly with AVPlayer via an AVPlayerItem. The code is below:
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    NSMutableArray *instructions = [NSMutableArray new];
    CGSize size = CGSizeZero;
    CMTime time = kCMTimeZero;

    for (AVURLAsset *asset in assets)
    {
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
        NSError *error;

        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"asset url :: %@", assetTrack.asset);
            NSLog(@"Error - %@", error.debugDescription);
        }

        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }

        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        videoCompositionInstruction.layerInstructions = @[[AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack]];
        [instructions addObject:videoCompositionInstruction];

        time = CMTimeAdd(time, assetTrack.timeRange.duration);
        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }
    }

    AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    mutableVideoComposition.instructions = instructions;
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
    mutableVideoComposition.renderSize = size;

    playerItem = [AVPlayerItem playerItemWithAsset:mutableComposition];
    playerItem.videoComposition = mutableVideoComposition;
As far as I know, an AVMutableVideoCompositionLayerInstruction cannot simply be "appended" or "added" the way your code does it.
From your code, I guess you want to keep each asset's video-instruction information while merging, but instructions cannot be "copied" over directly.
If you do want to do this, look at the documentation of AVVideoCompositionLayerInstruction, for example:

    getTransformRampForTime:startTransform:endTransform:timeRange:
    setTransformRampFromStartTransform:toEndTransform:timeRange:
    setTransform:atTime:

    getOpacityRampForTime:startOpacity:endOpacity:timeRange:
    setOpacityRampFromStartOpacity:toEndOpacity:timeRange:
    setOpacity:atTime:

    getCropRectangleRampForTime:startCropRectangle:endCropRectangle:timeRange:
    setCropRectangleRampFromStartCropRectangle:toEndCropRectangle:timeRange:
    setCropRectangle:atTime:

You would use the getFoo... methods on the source track's instructions, recompute the insertTime / timeRange for the final track, then call the matching setFoo... methods and append the results to the final video composition's layer instructions.
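The re-basing arithmetic behind that get/recompute/set round-trip is simple. A toy sketch in C (plain doubles stand in for CMTime, and the function name is invented for illustration):

```c
/* A keyframe recorded at `srcTime` inside a source track whose time range
 * starts at `srcStart` must be shifted so that it lands where the clip was
 * inserted into the merged composition (`insertTime`). */
double rebase_keyframe_time(double srcTime, double srcStart, double insertTime)
{
    return insertTime + (srcTime - srcStart);
}
```

The same offset applies to the start of every ramp's timeRange; the ramp's duration and values are copied unchanged.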
Yes, it is a bit complicated. And, most importantly, you cannot recover every video effect that was applied to the source assets this way.

So what is your actual goal? What kinds of source assets do you need to support?
If you only want to merge some mp4/mov files, just loop over the tracks and append them to an AVMutableCompositionTrack; no videoComposition is needed. I tested your code this way and it works.
If you want to merge AVAssets that carry video instructions, see the explanation and documentation above. My best practice is: before merging, save those AVAssets to files with AVAssetExportSession, then merge the exported video files.
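A minimal sketch of that export step (assuming a `composition` and `videoComposition` built as above, and an `outputURL` you control; error handling elided):

```objectivec
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = videoComposition; // bakes the instructions into the file

[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        // The exported file is now a plain mov whose effects are rendered in,
        // so it can be merged with insertTimeRange:ofTrack:atTime:error:
        // without carrying any videoComposition along.
    }
}];
```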
P.S. There may also be something wrong with your test files or source assets.
Code from one of my projects (a Vine-like app):
    - (BOOL)generateComposition
    {
        [self cleanComposition];

        NSUInteger segmentsCount = self.segmentsCount;
        if (0 == segmentsCount) {
            return NO;
        }

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableVideoComposition *videoComposition = nil;
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = nil;
        AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = nil;
        AVMutableAudioMix *audioMix = nil;

        AVMutableCompositionTrack *videoTrack = nil;
        AVMutableCompositionTrack *audioTrack = nil;
        AVMutableCompositionTrack *musicTrack = nil;
        CMTime currentTime = kCMTimeZero;

        for (MVRecorderSegment *segment in self.segments) {
            AVURLAsset *asset = segment.asset;
            NSArray *videoAssetTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
            NSArray *audioAssetTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

            CMTime maxBounds = kCMTimeInvalid;

            CMTime videoTime = currentTime;
            for (AVAssetTrack *videoAssetTrack in videoAssetTracks) {
                if (!videoTrack) {
                    videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
                    videoTrack.preferredTransform = CGAffineTransformIdentity;

                    videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
                    videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
                }

                /* Fix orientation */
                CGAffineTransform transform = videoAssetTrack.preferredTransform;
                if (AVCaptureDevicePositionFront == segment.cameraPosition) {
                    transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0);
                    transform = CGAffineTransformScale(transform, -1.0, 1.0);
                } else if (AVCaptureDevicePositionBack == segment.cameraPosition) {

                }
                [videoCompositionLayerInstruction setTransform:transform atTime:videoTime];

                /* Append track */
                videoTime = [MVHelper appendAssetTrack:videoAssetTrack toCompositionTrack:videoTrack atTime:videoTime withBounds:maxBounds];
                maxBounds = videoTime;
            }

            if (self.sessionConfiguration.originalVoiceOn) {
                CMTime audioTime = currentTime;
                for (AVAssetTrack *audioAssetTrack in audioAssetTracks) {
                    if (!audioTrack) {
                        audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                    }
                    audioTime = [MVHelper appendAssetTrack:audioAssetTrack toCompositionTrack:audioTrack atTime:audioTime withBounds:maxBounds];
                }
            }

            currentTime = composition.duration;
        }

        if (videoCompositionInstruction && videoCompositionLayerInstruction) {
            videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
            videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];

            videoComposition = [AVMutableVideoComposition videoComposition];
            videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize);
            videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate);
            videoComposition.instructions = @[videoCompositionInstruction];
        }

        // Add the background music track (musicTrack)
        NSURL *musicFileURL = self.sessionConfiguration.musicFileURL;
        if (musicFileURL && musicFileURL.isFileExists) {
            AVAsset *musicAsset = [AVAsset assetWithURL:musicFileURL];
            AVAssetTrack *musicAssetTrack = [musicAsset tracksWithMediaType:AVMediaTypeAudio].firstObject;
            if (musicAssetTrack) {
                musicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                if (CMTIME_COMPARE_INLINE(musicAsset.duration, >=, composition.duration)) {
                    // If the music is longer than the whole video, add it directly
                    [musicTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, composition.duration) ofTrack:musicAssetTrack atTime:kCMTimeZero error:NULL];
                } else {
                    // Otherwise, loop the background music
                    CMTime musicTime = kCMTimeZero;
                    CMTime bounds = composition.duration;
                    while (true) {
                        musicTime = [MVHelper appendAssetTrack:musicAssetTrack toCompositionTrack:musicTrack atTime:musicTime withBounds:bounds];
                        if (CMTIME_COMPARE_INLINE(musicTime, >=, composition.duration)) {
                            break;
                        }
                    }
                }
            }
        }

        // Process the audio
        if (musicTrack) {
            AVMutableAudioMixInputParameters *audioMixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack];

            /* Fade the background music in and out */
            AVAsset *musicAsset = musicTrack.asset;
            CMTime crossfadeDuration = CMTimeMake(15, 10); // 1.5 s at each end
            CMTime halfDuration = CMTimeMultiplyByFloat64(musicAsset.duration, 0.5);
            crossfadeDuration = CMTimeMinimum(crossfadeDuration, halfDuration);
            CMTimeRange crossfadeRangeBegin = CMTimeRangeMake(kCMTimeZero, crossfadeDuration);
            CMTimeRange crossfadeRangeEnd = CMTimeRangeMake(CMTimeSubtract(musicAsset.duration, crossfadeDuration), crossfadeDuration);
            [audioMixParameters setVolumeRampFromStartVolume:0.0 toEndVolume:self.sessionConfiguration.musicVolume timeRange:crossfadeRangeBegin];
            [audioMixParameters setVolumeRampFromStartVolume:self.sessionConfiguration.musicVolume toEndVolume:0.0 timeRange:crossfadeRangeEnd];

            audioMix = [AVMutableAudioMix audioMix];
            [audioMix setInputParameters:@[audioMixParameters]];
        }

        _composition = composition;
        _videoComposition = videoComposition;
        _audioMix = audioMix;

        return YES;
    }


    - (AVPlayerItem *)playerItem
    {
        AVPlayerItem *playerItem = nil;
        if (self.composition) {
            playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
            if (!self.videoComposition.animationTool) {
                playerItem.videoComposition = self.videoComposition;
            }
            playerItem.audioMix = self.audioMix;
        }
        return playerItem;
    }

    ///=============================================
    /// MVHelper
    ///=============================================

    + (CMTime)appendAssetTrack:(AVAssetTrack *)track toCompositionTrack:(AVMutableCompositionTrack *)compositionTrack atTime:(CMTime)atTime withBounds:(CMTime)bounds
    {
        CMTimeRange timeRange = track.timeRange;
        atTime = CMTimeAdd(atTime, timeRange.start);

        if (!track || !compositionTrack) {
            return atTime;
        }

        if (CMTIME_IS_VALID(bounds)) {
            CMTime currentBounds = CMTimeAdd(atTime, timeRange.duration);
            if (CMTIME_COMPARE_INLINE(currentBounds, >, bounds)) {
                timeRange = CMTimeRangeMake(timeRange.start, CMTimeSubtract(timeRange.duration, CMTimeSubtract(currentBounds, bounds)));
            }
        }
        if (CMTIME_COMPARE_INLINE(timeRange.duration, >, kCMTimeZero)) {
            NSError *error = nil;
            [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error];
            if (error) {
                MVLog(@"Failed to append %@ track: %@", compositionTrack.mediaType, error);
            }
            return CMTimeAdd(atTime, timeRange.duration);
        }

        return atTime;
    }
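The bounds check in appendAssetTrack:toCompositionTrack:atTime:withBounds: is what keeps one track from running past another. A minimal C model of the same arithmetic (doubles instead of CMTime, a negative bound standing in for kCMTimeInvalid; all names invented):

```c
typedef struct { double start; double duration; } TimeRange;

/* Clamp the inserted duration so the track never extends past `bounds`.
 * Writes the actually inserted duration and returns the new cursor time. */
double append_clamped(TimeRange track, double atTime, double bounds,
                      double *insertedDuration)
{
    atTime += track.start;
    double dur = track.duration;
    if (bounds >= 0.0 && atTime + dur > bounds) {
        dur -= (atTime + dur) - bounds;  /* trim the overshoot */
    }
    if (dur > 0.0) {
        *insertedDuration = dur;
        return atTime + dur;
    }
    *insertedDuration = 0.0;             /* nothing left to insert */
    return atTime;
}
```

For example, appending a 4-second track at t = 5 with a bound of 7 inserts only 2 seconds and leaves the cursor at 7.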