I'm using the AVEditDemo project from Apple's WWDC 2010 sample code package, and I'm trying to change the frame rate of the exported video. The video is exported using an AVMutableVideoComposition whose frameDuration is set like this:
videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
For some reason, changing the 30 to 25 does not change the frame rate of the video exported with AVAssetExportSession. Does anyone know why?
iphone avfoundation ios avmutablecomposition avassetexportsession
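For reference, a minimal Swift sketch of how a custom frame rate is usually wired up: frameDuration is only honoured when the video composition is also assigned to the export session. sourceURL and outputURL below are placeholders, not part of the question's code.

import AVFoundation

let sourceURL = URL(fileURLWithPath: "input.mov")      // placeholder input
let outputURL = URL(fileURLWithPath: "output.mov")     // placeholder output
let asset = AVAsset(url: sourceURL)

let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
videoComposition.frameDuration = CMTimeMake(1, 25)     // 25 fps

let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
export.outputURL = outputURL
export.outputFileType = AVFileTypeQuickTimeMovie
export.videoComposition = videoComposition             // without this line the frameDuration has no effect
export.exportAsynchronously {
    print("export status: \(export.status.rawValue)")
}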
I need to convert songs with large file sizes from the iTunes library into smaller 8K song files.
Since the conversion runs asynchronously, the BOOL always returns true even though the write to the Documents folder has not finished. At the moment I call the function again after a 10-second delay, which works fine as a stopgap on an iPhone 5s, but I want to cater for slower devices.
Please give me some pointers/recommendations on my code.
-(void)startUploadSongAnalysis
{
[self updateProgressYForID3NForUpload:NO];
if ([self.uploadWorkingAray count]>=1)
{
Song *songVar = [self.uploadWorkingAray objectAtIndex:0];//core data var
NSLog(@"songVar %@",songVar.songName);
NSLog(@"songVar %@",songVar.songURL);
NSURL *songU = [NSURL URLWithString:songVar.songURL]; //URL of iTunes Lib
// self.asset = [AVAsset assetWithURL:songU];
// NSLog(@"asset %@",self.asset);
NSError *error;
NSString *subString = [[songVar.songURL componentsSeparatedByString:@"id="] lastObject];
NSString *savedPath = [self.documentsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"audio%@.m4a",subString]];//save file name of converted 8kb song
NSString *subStringPath = [NSString stringWithFormat:@"audio%@.m4a",subString];
if ([self.fileManager fileExistsAtPath:savedPath] == YES)
[self.fileManager removeItemAtPath:savedPath error:&error];
NSLog(@"cacheDir %@",savedPath);
//export low bitrate song to …
objective-c nsdocument ios avassetwriter avassetexportsession
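A common way to avoid the fixed 10-second delay is to drive the next conversion from the export session's completion handler, which fires only once the file has actually been written (or the export has failed). A minimal Swift sketch; the function and parameter names are placeholders, not the asker's API.

import AVFoundation

func exportLowBitrateCopy(of songURL: URL, to outputURL: URL,
                          completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: songURL)
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetAppleM4A) else {
        completion(false)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = AVFileTypeAppleM4A
    session.exportAsynchronously {
        // Runs after the write has finished; kick off the next song from here
        // instead of polling with a timer.
        completion(session.status == .completed)
    }
}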
I'm developing an app in which I write a video and export it to a directory. It works fine on iOS 7 but fails on iOS 8.
Here is my code:
AVAsset *pVideoTrack = [AVAsset assetWithURL:[NSURL fileURLWithPath:assetPath]];
AVVideoComposition *origionalComposition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:pVideoTrack];
AVAssetTrack *clipVideoTrack = [[pVideoTrack tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, pVideoTrack.duration)
ofTrack:clipVideoTrack
atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:[[[pVideoTrack tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];
BOOL bIsPotrait = [self checkVideoOrientationProperty:[[pVideoTrack tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]];
CGSize videoSize = CGSizeMake(0, 0);
if(bIsPotrait)
videoSize = CGSizeMake([[[pVideoTrack tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].height, [[[pVideoTrack tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width);
else
videoSize = [[[pVideoTrack tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
AVMutableVideoComposition* videoComp = [AVMutableVideoComposition videoComposition];
videoComp.renderSize = …
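As a side note, the portrait check and width/height swap in the code above can be expressed from the track's preferredTransform; a small Swift sketch under that assumption (not a fix for the iOS 8 failure itself):

import AVFoundation

// Returns the render size for a track, swapping width and height when the
// preferred transform encodes a 90°/270° rotation (portrait capture).
func renderSize(for track: AVAssetTrack) -> CGSize {
    let t = track.preferredTransform
    let isPortrait = (t.a == 0 && t.d == 0 && abs(t.b) == 1 && abs(t.c) == 1)
    let natural = track.naturalSize
    return isPortrait ? CGSize(width: natural.height, height: natural.width) : natural
}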
For some reason, I always get this error:
Error Domain=NSURLErrorDomain Code=-3000 "Cannot Create File" UserInfo={NSLocalizedDescription=Cannot Create File, NSUnderlyingError=0x1321dd730 {Error Domain=NSOSStatusErrorDomain Code=-12115 "(null)"}}
when trying to export an AVSession to m4a. This works fine on my colleague's devices, but it fails every time on my iPad Air 2 (iOS 9.1) as well as on our QA iPad Mini 3.
- (void)processSourceVideoFile:(NSURL *)mediaURL completion:(void (^)(BOOL success))completion {
[self showProgressOverlay];
NSString *outputFileType = AVFileTypeMPEG4;
__block NSString *videoID = nil;
if (self.videoAttachment == nil) {
[MagicalRecord saveUsingEditContextWithBlockAndWait:^(NSManagedObjectContext *localContext) {
self.videoAttachment = [SPXAttachment MR_createEntityInContext:localContext];
self.videoAttachment.uuid = [NSString uuid];
self.videoAttachment.clientCreatedAt = [NSDate date];
videoID = self.videoAttachment.uuid;
}];
} else {
videoID = self.videoAttachment.uuid;
}
self.videoAttachment = [SPXAttachment MR_findFirstByAttribute:@"uuid" withValue:videoID];
NSString *targetPath = self.videoAttachment.filePath;
DDLogVerbose(@"Exporting Video …Run Code Online (Sandbox Code Playgroud) 我在前置Camera中遇到使用AVAssetExportSession导出的视频方向错误.我按照这个教程/sf/answers/2475805461/但我得到了这个场景.我认为将图像切成两半并没有错.我尝试更改视频图层,渲染图层但没有运气.我的代码看起来像这样.
I'm getting the wrong orientation for video recorded with the front camera and exported with AVAssetExportSession. I followed this tutorial /sf/answers/2475805461/ but I end up with this result: I don't think I've done anything wrong, yet the image comes out cut in half. I've tried changing the video layer and the render layer, but no luck. My code looks like this:
let composition = AVMutableComposition()
let vidAsset = AVURLAsset(url: path)
// get video track
let vtrack = vidAsset.tracks(withMediaType: AVMediaTypeVideo)
// get audi trac
let videoTrack:AVAssetTrack = vtrack[0]
_ = videoTrack.timeRange.duration
let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)
var _: NSError?
let compositionvideoTrack:AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
do {
try compositionvideoTrack.insertTimeRange(vid_timerange, of: videoTrack, at: kCMTimeZero)
} catch let error {
print(error.localizedDescription)
}
let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
let audioTrack = vidAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
do {
try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, vidAsset.duration), of: …
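One hedged sketch of the usual shape for this kind of composition: the layer instruction carries the source track's preferredTransform, and renderSize is taken from the transformed naturalSize, since a mismatch between the two is a typical cause of frames coming out cropped. videoTrack and compositionTrack below are placeholders for the tracks built above.

import AVFoundation

func makeVideoComposition(videoTrack: AVAssetTrack,
                          compositionTrack: AVMutableCompositionTrack,
                          duration: CMTime) -> AVMutableVideoComposition {
    // Natural size after applying the capture orientation (components may be negative).
    let transformed = videoTrack.naturalSize.applying(videoTrack.preferredTransform)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionTrack)
    layerInstruction.setTransform(videoTrack.preferredTransform, at: kCMTimeZero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, duration)
    instruction.layerInstructions = [layerInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = CGSize(width: abs(transformed.width), height: abs(transformed.height))
    videoComposition.frameDuration = CMTimeMake(1, 30)
    videoComposition.instructions = [instruction]
    return videoComposition
}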
I'm trying to export a .mov file from a source video created by UIImagePickerController. The problem is that the output file created by AVAssetExportSession is only 668 bytes. Why does it fail? My code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
NSURL *imagePickerVideoURL = [info objectForKey:UIImagePickerControllerMediaURL];
NSString *filename = @"vid1.mov";
AVAsset *video = [AVAsset assetWithURL:imagePickerVideoURL];
AVAssetExportSession *exportSession
= [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPresetMediumQuality];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.outputURL = [pathToSavedVideosDirectory URLByAppendingPathComponent:filename];
NSLog(@"processing video...: %@", exportSession);
[exportSession exportAsynchronouslyWithCompletionHandler:^{
NSLog(@"done processing video!");
}];
}
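A roughly 668-byte file usually means the export failed right after writing the header, so the first step is to check the session's status and error inside the completion handler rather than assuming success. A hedged Swift sketch; the names are placeholders.

import AVFoundation

func exportPickedVideo(asset: AVAsset, to outputURL: URL) {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetMediumQuality) else { return }
    session.shouldOptimizeForNetworkUse = true
    session.outputFileType = AVFileTypeQuickTimeMovie
    session.outputURL = outputURL
    session.exportAsynchronously {
        if session.status == .completed {
            print("done processing video!")
        } else {
            // A tiny output file is usually explained here: bad output path,
            // an existing file at outputURL, or a source URL that is no longer readable.
            print("export failed: \(String(describing: session.error))")
        }
    }
}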
I'm trying to do some basic video composition in Xamarin / MonoTouch and having some success, but I'm stuck on what seems like a fairly simple task.
I record video from the camera in portrait, so I use AVAssetExportSession to rotate the video. I created a layer instruction to rotate it, and that works well: I can successfully export the video in the correct orientation.
The problem:
When I add the audio track to the export, I always get a failure response with the following error:
Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
If I don't set the videoComposition property on the exportSession, the audio and video export perfectly fine, just with the wrong orientation. If anyone could give me some advice, I would greatly appreciate it. Below is my code:
var composition = new AVMutableComposition();
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
var index = 0;
var renderSize = new SizeF(480, 480);
var _startTime = CMTime.Zero;
//AVUrlAsset asset;
var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());
//var asset = AVAsset.FromUrl(new NSUrl(file, false));
//create an avassetrack with our asset
var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
var …
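Error -11841 (AVErrorInvalidVideoComposition) generally means the video composition itself is rejected, for example instructions that do not cover the whole duration, or layer instructions that reference the source asset's tracks instead of the composition's tracks. The composition can be validated before exporting; a small Swift sketch of the underlying AVFoundation call (the Xamarin binding should mirror it):

import AVFoundation

// Returns false for the conditions that typically trigger -11841 on export.
func isUsable(_ videoComposition: AVVideoComposition, with composition: AVComposition) -> Bool {
    return videoComposition.isValid(for: composition,
                                    timeRange: CMTimeRangeMake(kCMTimeZero, composition.duration),
                                    validationDelegate: nil)
}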
Does anyone know how to init an AVAssetWriterInput with more than 2 channels?
I'm trying to initialize an audio input, which I then add to an AVAssetWriter like this:
let audioInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: audioOutputSettings)
assetWriter.add(audioInput)
assetWriter.startWriting()
But it crashes when I init the audioInput with an audioOutputSettings dictionary that contains a channel count key greater than 2. The error is:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriterInput initWithMediaType:outputSettings:sourceFormatHint:] 6 is not a valid channel count for Format ID 'aac '. Use kAudioFormatProperty_AvailableEncodeNumberChannels (<AudioToolbox/AudioFormat.h>) to enumerate available channel counts for a given format.
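One commonly suggested approach, sketched with hedging below: when asking for more than 2 channels, pass an explicit AVChannelLayoutKey describing the layout alongside AVNumberOfChannelsKey. Whether a given format ultimately accepts that channel count still depends on the encoder, which is exactly what the AudioToolbox call in the message enumerates. The values below are illustrative.

import AVFoundation
import CoreAudio

var channelLayout = AudioChannelLayout()
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_MPEG_5_1_A   // 6-channel (5.1) layout
let channelLayoutData = Data(bytes: &channelLayout,
                             count: MemoryLayout<AudioChannelLayout>.size)

let audioOutputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100,
    AVNumberOfChannelsKey: 6,
    AVChannelLayoutKey: channelLayoutData
]
let audioInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio,
                                    outputSettings: audioOutputSettings)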
I'm using PryntTrimmerView to trim a video file. Here is my code for exporting the trimmed video file and generating a video thumbnail:
func prepareAssetComposition() throws {
guard let asset = trimmerView.asset, let videoTrack = asset.tracks(withMediaType: AVMediaTypeVideo).last else {
return
}
let size = videoTrack.naturalSize.applying(videoTrack.preferredTransform)
print(CGSize(width: fabs(size.width), height: fabs(size.height)))
let assetComposition = AVMutableComposition()
let start = trimmerView.startTime?.seconds
let end = trimmerView.endTime?.seconds
let startTime = CMTime(seconds: Double(start ?? 0), preferredTimescale: 1000)
let endTime = CMTime(seconds: Double(end ?? 0), preferredTimescale: 1000)
let trackTimeRange = CMTimeRange(start: startTime, end: endTime)
let compositionTrack = assetComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
try compositionTrack.insertTimeRange(trackTimeRange, of: videoTrack, at: kCMTimeZero)
var …
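For the thumbnail half of the question, a small hedged sketch using AVAssetImageGenerator; asset and startTime stand in for the trimmer's values.

import AVFoundation
import UIKit

func thumbnail(for asset: AVAsset, at startTime: CMTime) -> UIImage? {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true   // respect the track's orientation
    guard let cgImage = try? generator.copyCGImage(at: startTime, actualTime: nil) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}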
I'm using AVAssetExportSession to export videos in an iOS app. To render the video in the correct orientation, I use the AVAssetTrack's preferredTransform. For some source videos this property seems to have a wrong value, and the resulting video comes out offset or completely black. How can I work around this?
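One hedged workaround, sketched below: keep only the rotation encoded in preferredTransform, snap it to the nearest 90 degrees, and rebuild the translation from naturalSize instead of trusting the stored matrix verbatim. This is an assumption-laden sketch, not a guaranteed fix for every broken source file.

import AVFoundation

func sanitizedTransform(for track: AVAssetTrack) -> CGAffineTransform {
    let t = track.preferredTransform
    let size = track.naturalSize
    // Rotation angle encoded in the 2x2 part of the matrix, snapped to 90° steps.
    let angle = atan2(Double(t.b), Double(t.a))
    let quarterTurns = (Int((angle / (Double.pi / 2)).rounded()) % 4 + 4) % 4
    switch quarterTurns {
    case 1:   // 90° (typical back-camera portrait)
        return CGAffineTransform(rotationAngle: .pi / 2).translatedBy(x: 0, y: -size.height)
    case 2:   // 180°
        return CGAffineTransform(rotationAngle: .pi).translatedBy(x: -size.width, y: -size.height)
    case 3:   // 270° (typical front-camera portrait)
        return CGAffineTransform(rotationAngle: -.pi / 2).translatedBy(x: -size.width, y: 0)
    default:  // no rotation: fall back to the identity transform
        return .identity
    }
}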