My application currently uses AVFoundation to grab the raw camera data from the iPhone's rear camera and display it in real time on an AVCaptureVideoPreviewLayer.
My goal is to conditionally apply simple image filters to the preview layer. The images aren't saved, so I don't need a capture output. For example, I'd like to toggle a setting that converts the video in the preview layer to black & white.
I found a question here that seems to accomplish something similar by capturing the individual video frames in a buffer, applying the desired transformations, and then displaying each frame as a UIImage. For several reasons this seems like overkill for my project, and I'd like to avoid any performance issues it could cause.
Is this the only way to accomplish my goal?
As I mentioned, I'm not looking to capture any of the AVCaptureSession's video, just preview it.
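For reference, a minimal sketch of the buffer-based approach the linked question describes, assuming ARC and a session that already has an AVCaptureVideoDataOutput with its sample-buffer delegate set; the ciContext and previewImageView properties are hypothetical names:

// AVCaptureVideoDataOutputSampleBufferDelegate callback: filter each frame
// with Core Image and display the result in a UIImageView.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // CIColorMonochrome with a white input color gives a grayscale image.
    CIFilter *mono = [CIFilter filterWithName:@"CIColorMonochrome"];
    [mono setValue:frame forKey:kCIInputImageKey];
    [mono setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:kCIInputColorKey];
    CIImage *filtered = mono.outputImage;

    CGImageRef cgImage = [self.ciContext createCGImage:filtered fromRect:[filtered extent]];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    // UIKit may only be touched on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewImageView.image = image;
    });
}

This is exactly the per-frame work the question hopes to avoid, but it shows the scale of it: one filter pass and one image-sized copy per frame.
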
My main goal is to stream video from a server and cut it up frame by frame as it streams (so that OpenGL can use it). For that I've used code you can find all over the Internet (as I recall, it comes from Apple's GLVideoFrame sample code):
NSArray *tracks = [asset tracks];
NSLog(@"%d", tracks.count);

for (AVAssetTrack *track in tracks) {
    NSLog(@"type: %@", [track mediaType]);
    initialFPS = track.nominalFrameRate;
    width  = (GLuint)track.naturalSize.width;
    height = (GLuint)track.naturalSize.height;

    NSError *error = nil;

    // _movieReader is a member variable
    @try {
        self._movieReader = [[[AVAssetReader alloc] initWithAsset:asset error:&error] autorelease];
    }
    @catch (NSException *exception) {
        NSLog(@"%@ -- %@", [exception name], [exception reason]);
        NSLog(@"skipping track");
        continue;
    }

    if (error) {
        NSLog(@"CODE:%d\nDOMAIN:%@\nDESCRIPTION:%@\nFAILURE_REASON:%@",
              [error code], [error domain], error.localizedDescription, [error localizedFailureReason]);
        continue;
    }
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    …
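The code is cut off above; purely for context, a hedged sketch of how frame extraction with AVAssetReader typically continues from there (an assumption, not the GLVideoFrame sample itself): attach an AVAssetReaderTrackOutput requesting BGRA pixels and pull sample buffers in a loop.

// Assumes the `videoTrack`, `key`, and `_movieReader` set up above.
// BGRA is a common pixel format choice for OpenGL texture upload.
NSDictionary *settings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:key];
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
[self._movieReader addOutput:output];
[self._movieReader startReading];

CMSampleBufferRef sampleBuffer;
while ((sampleBuffer = [output copyNextSampleBuffer]) != NULL) {
    CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... hand `frame` to OpenGL here ...
    CFRelease(sampleBuffer);
}
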
I created a blank (iOS) project and put this in my viewDidLoad:

NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
MPMoviePlayerViewController *playerController = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviePath]];
[self presentMoviePlayerViewControllerAnimated:playerController];
[playerController.moviePlayer play];
When the app launches, all I get is a white screen, along with these error messages in the log:
<Error>: CGContextSaveGState: invalid context 0x0
<Error>: CGContextClipToRect: invalid context 0x0
<Error>: CGContextTranslateCTM: invalid context 0x0
<Error>: CGContextDrawShading: invalid context 0x0
<Error>: CGContextRestoreGState: invalid context 0x0
Warning: Attempt to present <MPMoviePlayerViewController: 0x821e3b0> on <ViewController: 0x863aa40> whose view is not in the window hierarchy!
...plus a series of messages about autoplay being disabled. I particularly don't understand the line saying the view is not in the window hierarchy, because this is a blank "Single View Application" iOS project and the code is in ViewController.m. It is in the view hierarchy.
I know the movie file itself isn't the problem, because I took it from Apple's sample code for MPMoviePlayer. Even though I've (seemingly) tried everything the sample does, I still can't get the player to work.
Here's another attempt, this time with MPMoviePlayerController (not MPMoviePlayerViewController):
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:url];
[player setContentURL:url];
[player setMovieSourceType:MPMovieSourceTypeFile];
[[player view] setFrame:self.view.bounds];
…
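One thing worth noting about the hierarchy warning: viewDidLoad runs before the controller's view is attached to a window, so presenting from there can trigger exactly that message. A hedged sketch (an assumption, not Apple's sample) that defers presentation to viewDidAppear::

// Present only after the view has entered the window hierarchy.
// "Movie.m4v" is assumed to be bundled with the app, as in the snippet above.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
    MPMoviePlayerViewController *playerController =
        [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviePath]];
    [self presentMoviePlayerViewControllerAnimated:playerController];
    [playerController.moviePlayer play];
}
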
Is it possible to use the existing Apple system sounds in my own application? I'd like to write a sample app in Swift that plays the existing sound files (from /System/Library/Audio/UISounds/), so it's basically the same as when you pick a new ringtone on your iPhone.
I think some apps are already using these sounds, or have they copied/purchased them?
Thanks and regards, Jens
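For what it's worth, a hedged sketch (in Objective-C, matching the rest of this page, though the question asks about Swift) of playing one of those files directly. The file name sms-received1.caf is an assumption, the directory contents vary by iOS version, and relying on these undocumented paths may not pass App Review:

#import <AVFoundation/AVFoundation.h>

// Hypothetical file name under the system sounds directory.
NSURL *soundURL = [NSURL fileURLWithPath:@"/System/Library/Audio/UISounds/sms-received1.caf"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
if (player != nil) {
    [player play];
} else {
    NSLog(@"Could not load system sound: %@", error);
}
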
I'd like to stream video from the iPhone camera to an app running on a Mac. Think video chat, but one-way only, from the device to a receiver app (and it's not video chat).
My basic understanding so far:
Am I right about the above, or am I already off track?
Apple's Tech Q&A 1702 provides some information about saving individual frames as images. Is that the best way to go about this? Just save frames at 30 fps and then compress them with something like ffmpeg?
There's a lot of discussion about live streaming to the iPhone, but far less information from people sending live video out. I'm hoping for some broad strokes to point me in the right direction.
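For what it's worth, a rough sketch of one piece of such a pipeline, assuming frames arrive through an AVCaptureVideoDataOutput delegate as in QA1702; ciContext and sendFrameToMac: are hypothetical placeholders for a reusable CIContext and whatever network transport gets chosen:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Wrap the raw pixel buffer and render it to a CGImage.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGImageRef cgImage = [self.ciContext createCGImage:ciImage fromRect:[ciImage extent]];
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    // Compress before it goes on the wire; the quality factor is the main knob.
    NSData *jpeg = UIImageJPEGRepresentation(frame, 0.5);
    [self sendFrameToMac:jpeg]; // hypothetical transport method
}

Per-frame JPEG is far less efficient than a real video codec, which is why hardware H.264 encoding (or an ffmpeg-style pipeline on the receiving end, as the question suggests) is usually where this ends up.
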
I'm trying to combine multiple video clips into one using AVFoundation. I can create a single video with AVMutableComposition using the code below:
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime startTime = kCMTimeZero;

/* videoClipPaths is an array of paths of the video clips recorded */

// for loop to combine clips into a single video
for (NSInteger i = 0; i < [videoClipPaths count]; i++) {
    NSString *path = (NSString *)[videoClipPaths objectAtIndex:i];
    NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [url release];

    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioTrack = [[asset …
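The snippet is cut off above; purely for context, a hedged sketch of how such a loop typically continues (not the poster's missing code): each clip's tracks are appended at the running start time, which is then advanced by the clip's duration.

CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);
NSError *editError = nil;

// Append the clip's video and audio to the composition tracks.
[compositionVideoTrack insertTimeRange:timeRange ofTrack:videoTrack atTime:startTime error:&editError];
[compositionAudioTrack insertTimeRange:timeRange ofTrack:audioTrack atTime:startTime error:&editError];

// Advance the insertion point so the next clip starts where this one ends.
startTime = CMTimeAdd(startTime, [asset duration]);
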
I'm following the reference at http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html to capture video from the iPhone camera. Apart from the code on that page, this is a brand-new project, and I've also added the AVFoundation framework to it. This is the linker error I get:

Build my project of project my project with configuration Debug
CompileC "build/my project.build/Debug-iphoneos/my project.build/Objects-normal/armv6/MainViewController.o" /Users/mwilliamson/Projects/my_project/iphone/Classes/MainViewController.m normal armv6 objective-c com.apple.compilers.gcc.4_2
cd /Users/mwilliamson/Projects/my_project/iphone
setenv LANG en_US.US-ASCII
setenv PATH "/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin:/Developer/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin:/opt/local/bin:/usr/local/git/bin"
/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc-4.2 -x objective-c -arch armv6 -fmessage-length=0 -pipe -std=c99 -Wno-trigraphs -fpascal-strings -O0 -Wreturn-type -Wunused-variable -isysroot /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.0.sdk -fvisibility=hidden -gdwarf-2 -mthumb -miphoneos-version-min=4.0 -iquote "/Users/mwilliamson/Projects/my_project/iphone/build/my project.build/Debug-iphoneos/my project.build/my project-generated-files.hmap" "-I/Users/mwilliamson/Projects/my_project/iphone/build/my project.build/Debug-iphoneos/my project.build/my project-own-target-headers.hmap" "-I/Users/mwilliamson/Projects/my_project/iphone/build/my project.build/Debug-iphoneos/my project.build/my project-all-target-headers.hmap" -iquote "/Users/mwilliamson/Projects/my_project/iphone/build/my project.build/Debug-iphoneos/my project.build/my project-project-headers.hmap" -F/Users/mwilliamson/Projects/my_project/iphone/build/Debug-iphoneos -I/Users/mwilliamson/Projects/my_project/iphone/build/Debug-iphoneos/include -I/Users/mwilliamson/Projects/my_project/iphone/opencv_device/include "-I/Users/mwilliamson/Projects/my_project/iphone/build/my project.build/Debug-iphoneos/my project.build/DerivedSources/armv6" "-I/Users/mwilliamson/Projects/my_project/iphone/build/my project.build/Debug-iphoneos/my project.build/DerivedSources" -include /var/folders/kW/kW6u6B7SGyGYu+nNumtIa++++TI/-Caches-/com.apple.Xcode.501/SharedPrecompiledHeaders/my_project_Prefix-alujyqxskcuyuogdsynmjyrkxbhh/my_project_Prefix.pch -c /Users/mwilliamson/Projects/my_project/iphone/Classes/MainViewController.m …
我可以看到视频,所以我知道它正在工作.
但是,我想要一个集合视图,并在每个单元格中添加一个预览图层,以便每个单元格显示视频的预览.
如果我尝试将预览图层传递到单元格并将其添加为子图层,则会从其他单元格中删除该图层,因此它一次只能显示在一个单元格中.
还有另一种(更好的)方法吗?
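A hedged sketch of one alternative worth trying, assuming a captureSession property: give every cell its own AVCaptureVideoPreviewLayer attached to the same session, instead of moving a single layer around. Whether multiple preview layers per session behave well can depend on the iOS version, so treat this as an experiment rather than a guarantee:

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    UICollectionViewCell *cell =
        [collectionView dequeueReusableCellWithReuseIdentifier:@"PreviewCell"
                                                  forIndexPath:indexPath];

    // One preview layer per cell, all fed by the same session.
    AVCaptureVideoPreviewLayer *layer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    layer.frame = cell.contentView.bounds;
    [cell.contentView.layer addSublayer:layer];

    // A real implementation would also remove stale preview sublayers
    // when cells are reused.
    return cell;
}
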
I have the following code in my app:
NSURL *url = [NSURL fileURLWithPath: [self.DocDir stringByAppendingPathComponent: self.FileName] isDirectory: NO];
self.avPlayer = [AVPlayer playerWithURL: url];
Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
This works fine on iOS 6, but for some reason iOS 7 returns NaN. When inspecting self.avPlayer.currentItem.duration, the CMTime object has a value of 0 and a flags field of 17.
Interestingly, the player itself works fine; it's just the duration that's wrong.
Has anyone else run into the same issue? I'm importing the following:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVAsset.h>
Run Code Online (Sandbox Code Playgroud) Obj C项目; 刚更新到Xcode 8和iOS/10.应用似乎工作正常,然而,收到警告 -
"缺少子模块'AVFoundation.AVSpeechSynthesis'""缺少子模块'AVFoundation.AVAudioSession'"
这些消息出现在AVAudioSession和AVSpeechSynthesis的#import语句中.
有谁知道这是怎么回事?
TIA
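In case it helps anyone searching: a hedged sketch of the pattern that often triggers this, assuming the warnings come from importing the submodule headers directly; importing the umbrella header instead typically silences them.

// Instead of importing submodule headers directly:
//   #import <AVFoundation/AVAudioSession.h>
//   #import <AVFoundation/AVSpeechSynthesis.h>
// import the umbrella header:
#import <AVFoundation/AVFoundation.h>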