I'm trying to take multiple photos at the highest resolution (AVCaptureSessionPresetPhoto) on an iPhone 5s. I tried the following code:
dispatch_semaphore_t sync = dispatch_semaphore_create(0);
while( [self isBurstModeEnabled] == YES )
{
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (imageSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            NSString *videoThumbPath = [NSString stringWithFormat:@"%@/img%d.png",
                                                                  burstFolderPath,
                                                                  index];
            [imageData writeToFile:videoThumbPath atomically:YES];
            if( 0 == index )
            {
                [self NSLogPrint:[NSString stringWithFormat:@"Created photo at %@", videoThumbPath]];
            }
        }
        dispatch_semaphore_signal(sync);
    }];
    // Block until this capture's completion handler has run before triggering the next one.
    dispatch_semaphore_wait(sync, DISPATCH_TIME_FOREVER);
}
With this code I get roughly 2 photos per second, nowhere near the performance of the native camera app's burst mode. What am I doing wrong? I also tried the code above without the semaphore, but then I get odd behavior: some photos go missing (img0.png, img1.png, and img3.png will exist, but img2.png will be missing). With that second approach performance is better, but still not comparable to the native app (in my tests the camera app produces about 8.4 photos per second).
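A back-of-the-envelope model of why the semaphore caps the rate: waiting after every capture means at most one capture is ever in flight, so throughput is bounded by the full trigger-to-handler round trip, whereas a burst pipeline overlaps captures at the sensor's pace. The latency figures below are assumptions for illustration, not measured AVFoundation values.

```python
# Toy throughput model -- the latency numbers are invented for illustration.

def serialized_fps(round_trip_s):
    """Photos per second when each capture must complete before the next
    one is triggered (the semaphore pattern above)."""
    return 1.0 / round_trip_s

def pipelined_fps(sensor_interval_s):
    """Photos per second when captures overlap and the sensor delivers a
    frame every sensor_interval_s seconds (burst-style pipelining)."""
    return 1.0 / sensor_interval_s

# An assumed ~0.5 s trigger-to-completion round trip gives 2 fps, matching
# the rate observed above; a sensor frame every 0.12 s would give ~8.3 fps,
# close to the native camera app.
print(serialized_fps(0.5))    # 2.0
print(pipelined_fps(0.12))
```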
I'm running into latency problems recording audio + video with AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. Sometimes the video blocks for a few milliseconds, and sometimes the audio falls out of sync with the video.
I inserted some logging and observed that I first get a large number of video buffers in the captureOutput callback, and only after a while do I get the audio buffers (sometimes I don't receive the audio buffers at all, and the resulting video has no sound). If I comment out the code that processes the video buffers, I get the audio buffers without any problem.
Here is the code I'm using:
-(void)initMovieOutput:(AVCaptureSession *)captureSessionLocal
{
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    self._videoOutput = dataOutput;
    [dataOutput release];
    self._videoOutput.alwaysDiscardsLateVideoFrames = NO;
    self._videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                                                  forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    self._audioOutput = audioOutput;
    [audioOutput release];

    [captureSessionLocal addOutput:self._videoOutput];
    [captureSessionLocal addOutput:self._audioOutput];

    // Setup the queue
    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [self._videoOutput setSampleBufferDelegate:self queue:queue];
    [self._audioOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
}
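To illustrate the starvation described above: both outputs deliver sample buffers onto ONE serial delegate queue, so when the video handler costs more than a frame interval, every audio buffer queued behind the backlog sees an ever-growing delay. A small self-contained simulation (all timings invented for illustration):

```python
# Toy model of one serial delegate queue shared by a video and an audio output.

def drain_serial_queue(events):
    """events: (arrival_time_s, kind, processing_cost_s) tuples.
    Returns, per kind, the delay between each buffer's arrival and the
    moment its handler finishes on a single serial queue."""
    finish = 0.0
    delays = {}
    for arrival, kind, cost in sorted(events):
        start = max(finish, arrival)   # must wait for earlier handlers to finish
        finish = start + cost
        delays.setdefault(kind, []).append(finish - arrival)
    return delays

# 30 video buffers/s whose handler takes 40 ms each (over the ~33 ms frame
# budget), plus 10 cheap audio buffers/s.
video = [(i / 30.0, "video", 0.040) for i in range(30)]
audio = [(i / 10.0, "audio", 0.001) for i in range(10)]
delays = drain_serial_queue(video + audio)
print(delays["audio"][0], delays["audio"][-1])  # audio lag grows as the queue backs up
```

Moving the heavy processing off the delegate queue, or giving each output its own queue, removes this coupling.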
Here is where I set up the writer:
-(BOOL)setupWriter:(NSURL *)videoURL session:(AVCaptureSession *)captureSessionLocal
{
    NSError *error = nil;
    self._videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL
                                                  fileType:AVFileTypeQuickTimeMovie
                                                     error:&error];
    NSParameterAssert(self._videoWriter);
    // …
I'm using AVAssetWriter and the captureOutput callback to record video and audio. The problem is that in about 5% of cases I can't generate a thumbnail starting at second zero (if I try to go further into the video, there is no problem). The error I get is:
Error Domain=AVFoundationErrorDomain Code=-11832 "Cannot Open" UserInfo=0x4b31b0 {NSLocalizedFailureReason=This media cannot be used., NSUnderlyingError=0x4effb0 "The operation couldn't be completed. (OSStatus error -12431.)"}
Here is the code I'm using:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = TRUE;
[asset release];
CMTime thumbTime = CMTimeMakeWithSeconds(0, 30);
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error)
{
    if (result != AVAssetImageGeneratorSucceeded)
    {
        [self NSLogPrint:[NSString stringWithFormat:@"couldn't generate thumbnail, error:%@", error]];
    }
    UIImage *thumbImg = [[UIImage imageWithCGImage:im] retain];
    [image setImage:thumbImg];
    [thumbImg release];
    [generator release];
};
CGSize maxSize = CGSizeMake(177, 100);
generator.maximumSize = …
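For reference on the CMTimeMakeWithSeconds(0, 30) call above: a CMTime is essentially a rational value/timescale pair, so this call requests the very first instant of the movie. A minimal Python stand-in (illustration only, not the real Core Media struct, which also carries flags and an epoch):

```python
# Minimal stand-in for CMTimeMakeWithSeconds -- for illustration only.

def cm_time_make_with_seconds(seconds, timescale):
    """Represent `seconds` as an integer value over `timescale`."""
    return (round(seconds * timescale), timescale)

print(cm_time_make_with_seconds(0, 30))    # (0, 30)  -> requests t = 0 s
print(cm_time_make_with_seconds(1.5, 30))  # (45, 30) -> requests t = 1.5 s
```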
I have a problem playing a video file in my code. Whenever I play the file in fullscreen mode, playback doesn't take up my whole screen. Here is the relevant code:
NSURL *url = [NSURL fileURLWithPath:@"Somefile.mov"];
moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
[moviePlayer setControlStyle:MPMovieControlStyleFullscreen];
[moviePlayer setFullscreen:YES];
moviePlayer.view.frame = self.switchView.frame;
[self.switchView addSubview:moviePlayer.view];

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(movieFinishedCallback:)
                                             name:MPMoviePlayerPlaybackDidFinishNotification
                                           object:moviePlayer];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playbackStateDidChange:)
                                             name:MPMoviePlayerPlaybackStateDidChangeNotification
                                           object:moviePlayer];

[moviePlayer prepareToPlay];
[moviePlayer play];
Here is the output I get: