Swift: merging an array of AVAsset videos

Asked by mar*_*uis (tags: arrays, video, ios, avmutablecomposition, avasset)

I want to merge an array of AVAssets (arrayVideos) into a single video and save it to the camera roll. Raywenderlich.com has a great tutorial in which two videos are merged into one. I wrote the code below, but the video I get after exporting to the camera roll contains only the first and the last video from the array (the remaining videos in the middle of arrayVideos are missing). Am I missing something?

var arrayVideos = [AVAsset]() //Videos Array    
var atTimeM: CMTime = CMTimeMake(0, 0)
var lastAsset: AVAsset!
var layerInstructionsArray = [AVVideoCompositionLayerInstruction]()
var completeTrackDuration: CMTime = CMTimeMake(0, 1)
var videoSize: CGSize = CGSize(width: 0.0, height: 0.0)

func mergeVideoArray(){

    let mixComposition = AVMutableComposition()
    for videoAsset in arrayVideos{
        let videoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        do {
            if videoAsset == arrayVideos.first{
                atTimeM = kCMTimeZero
            } else{
                atTimeM = lastAsset!.duration
            }
            try videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0], at: atTimeM)  
            videoSize = videoTrack.naturalSize
        } catch let error as NSError {
            print("error: \(error)")
        }
        completeTrackDuration = CMTimeAdd(completeTrackDuration, videoAsset.duration)
        let videoInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        if videoAsset != arrayVideos.last{
            videoInstruction.setOpacity(0.0, at: videoAsset.duration)
        }
        layerInstructionsArray.append(videoInstruction)
        lastAsset = videoAsset            
    }

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, completeTrackDuration)
    mainInstruction.layerInstructions = layerInstructionsArray        

    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, 30)
    mainComposition.renderSize = CGSize(width: videoSize.width, height: videoSize.height)

    let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let dateFormatter = DateFormatter()
    dateFormatter.dateStyle = .long
    dateFormatter.timeStyle = .short
    let date = dateFormatter.string(from: NSDate() as Date)
    let savePath = (documentDirectory as NSString).appendingPathComponent("mergeVideo-\(date).mov")
    let url = NSURL(fileURLWithPath: savePath)

    let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter!.outputURL = url as URL
    exporter!.outputFileType = AVFileTypeQuickTimeMovie
    exporter!.shouldOptimizeForNetworkUse = true
    exporter!.videoComposition = mainComposition
    exporter!.exportAsynchronously {

        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: exporter!.outputURL!)
        }) { saved, error in
            if saved {
                let alertController = UIAlertController(title: "Your video was successfully saved", message: nil, preferredStyle: .alert)
                let defaultAction = UIAlertAction(title: "OK", style: .default, handler: nil)
                alertController.addAction(defaultAction)
                self.present(alertController, animated: true, completion: nil)
            } else{
                print("video erro: \(error)")

            }
        }
    }
} 

Answer by Dan*_*ang:

You need to keep track of the total duration of all the assets processed so far and update that total after each video.

The code in your question keeps overwriting atTimeM with just the previous asset's duration (lastAsset.duration) instead of the running total of everything inserted so far, so every clip after the first lands at the wrong offset. That is why only the first and last videos show up.

It would look like this:

...
var totalTime : CMTime = CMTimeMake(0, 0)

func mergeVideoArray() {

    let mixComposition = AVMutableComposition()
    for videoAsset in arrayVideos {
        let videoTrack = 
            mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, 
                                           preferredTrackID: Int32(kCMPersistentTrackID_Invalid))          
        do {
            if videoAsset == arrayVideos.first {
                atTimeM = kCMTimeZero
            } else {
                atTimeM = totalTime // <-- Use the total time for all the videos seen so far.
            }
            try videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), 
                                           of: videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0], 
                                           at: atTimeM)  
            videoSize = videoTrack.naturalSize
        } catch let error as NSError {
            print("error: \(error)")
        }
        totalTime = CMTimeAdd(totalTime, videoAsset.duration) // <-- Update the total time for all videos.
...

You can also remove the uses of lastAsset entirely; a fuller sketch incorporating that follows the note below.

  • Note that you can't increase totalTime with +=; you have to use CMTimeAdd, as in the loop above.
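
Putting the corrections together (accumulating totalTime with CMTimeAdd, dropping lastAsset and atTimeM), a minimal self-contained sketch of the composition step might look like the following. It sticks to the Swift 3-era API used in the question; buildComposition(from:) is only an illustrative helper name, and the setOpacity call uses the clip's cumulative end time, an extra adjustment the snippet above does not show.

import AVFoundation

// Minimal sketch: build a sequential composition from an ordered array of assets.
// Uses the Swift 3-era constants (AVMediaTypeVideo, kCMTimeZero) from the question.
func buildComposition(from arrayVideos: [AVAsset]) -> (AVMutableComposition, AVMutableVideoComposition)? {
    let mixComposition = AVMutableComposition()
    var layerInstructions = [AVVideoCompositionLayerInstruction]()
    var totalTime = kCMTimeZero          // running total of everything inserted so far
    var videoSize = CGSize.zero

    for videoAsset in arrayVideos {
        guard let assetTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first else { continue }
        let videoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                        preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        do {
            // Insert each clip at the accumulated duration of the clips before it.
            try videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                           of: assetTrack,
                                           at: totalTime)
            videoSize = assetTrack.naturalSize   // use the source track's size for the render size
        } catch {
            print("error: \(error)")
            return nil
        }

        // Hide this clip once it ends so the next clip's track shows through.
        let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        if videoAsset != arrayVideos.last {
            instruction.setOpacity(0.0, at: CMTimeAdd(totalTime, videoAsset.duration))
        }
        layerInstructions.append(instruction)

        totalTime = CMTimeAdd(totalTime, videoAsset.duration)
    }

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime)
    mainInstruction.layerInstructions = layerInstructions

    let videoComposition = AVMutableVideoComposition()
    videoComposition.instructions = [mainInstruction]
    videoComposition.frameDuration = CMTimeMake(1, 30)
    videoComposition.renderSize = videoSize
    return (mixComposition, videoComposition)
}

The returned composition and video composition can then be handed to the same AVAssetExportSession / PHPhotoLibrary code from the question.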