I'm doing motion detection on a region of the screen. Before detection starts, I want to set the focus and exposure and then lock them, so that their adjustments don't trigger false motion. So I send AVCaptureFocusModeAutoFocus and AVCaptureExposureModeAutoExpose to the device and add a key-value observer. When the observer reports that focusing and exposure adjustment have finished, it locks them both (and starts motion detection). Everything works, except that locking the exposure crashes the app within a few seconds, even though the code is the same in both cases.
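For reference, the KVO handler that performs the lock (not shown in the question) might look roughly like the sketch below. This is only an illustration of the pattern described above, not the question's actual code; it assumes the handler re-locks the device with lockForConfiguration: before changing modes, and that the context constants match the ones registered below.

```objc
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    AVCaptureDevice *device = (AVCaptureDevice *)object;
    if (context == MyAdjustingFocusObservationContext) {
        if (![device isAdjustingFocus]) {
            // Focus has settled: stop observing and freeze the focus.
            [device removeObserver:self forKeyPath:@"adjustingFocus"];
            if ([device lockForConfiguration:nil]) {
                [device setFocusMode:AVCaptureFocusModeLocked];
                [device unlockForConfiguration];
            }
        }
    } else if (context == MyAdjustingExposureObservationContext) {
        if (![device isAdjustingExposure]) {
            // Exposure has settled: stop observing and freeze the exposure.
            [device removeObserver:self forKeyPath:@"adjustingExposure"];
            if ([device lockForConfiguration:nil]) {
                [device setExposureMode:AVCaptureExposureModeLocked];
                [device unlockForConfiguration];
            }
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
```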
static void * const MyAdjustingFocusObservationContext = (void*)&MyAdjustingFocusObservationContext;
static void * const MyAdjustingExposureObservationContext = (void*)&MyAdjustingExposureObservationContext;
- (void)focusAtPoint {
    CGPoint point;
    if (fromRight) point.x = 450.0/480.0;
    else point.x = 30.0/480.0;
    point.y = 245.0/320.0;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (device != nil) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            // Note: this must be isFocusModeSupported: (not isExposureModeSupported:)
            // when testing a focus-mode constant.
            if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device isFocusPointOfInterestSupported]) {
                [device setFocusPointOfInterest:point];
                [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
                [device addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:MyAdjustingFocusObservationContext];
                NSLog(@"focus now");
            }
            if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure] && [device isExposurePointOfInterestSupported]) {
                [device setExposurePointOfInterest:point];
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
                [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:MyAdjustingExposureObservationContext];
                NSLog(@"expose now");
            }
            [device unlockForConfiguration];
        } else { …

I'm using AVAssetWriter to save the live feed from the camera. That works well with this code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (videoWriter.status != AVAssetWriterStatusWriting) {
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:lastSampleTime];
    }
    if (adaptor.assetWriterInput.readyForMoreMediaData) [adaptor appendPixelBuffer:imageBuffer withPresentationTime:lastSampleTime];
    else NSLog(@"adaptor not ready");
}
I usually get close to 30 fps (though, as others have noted, not the 60 fps of the iPhone 4S camera), and when I time [adaptor appendPixelBuffer:...] it takes only a few milliseconds.
However, I don't need the full frame, but I do need high quality (low compression, a keyframe on every frame), because I will read the video back and process it afterwards. I would therefore like to crop the image before writing it. Fortunately I only need a strip across the middle, so the crop can be a simple memcpy of the buffer. To do this, I create a CVPixelBufferRef that I copy into and then write with the adaptor:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (videoWriter.status != AVAssetWriterStatusWriting) {
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:lastSampleTime];
    }
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *buffIn = CVPixelBufferGetBaseAddress(imageBuffer);
    CVPixelBufferRef pxbuffer = …