At a high level, I've created an application that lets the user point his or her iPhone camera around and see the video frames after they have been processed with visual effects. Additionally, the user can tap a button to take a freeze frame of the current preview as a high-resolution photo that is saved to their iPhone library.
To do this, the application follows this process:
1) Create an AVCaptureSession
captureSession = [[AVCaptureSession alloc] init];
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
2) Hook up an AVCaptureDeviceInput using the back-facing camera.
videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:backFacingCamera error:&error] autorelease];
[captureSession addInput:videoInput];
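The backFacingCamera device referenced above isn't defined in the snippet. A minimal sketch of how it might be located, assuming the device enumeration AVFoundation offered at the time, could look like this:

AVCaptureDevice *backFacingCamera = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    // Pick the rear-facing camera out of the available video capture devices.
    if ([device position] == AVCaptureDevicePositionBack) {
        backFacingCamera = device;
        break;
    }
}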
3) Hook up an AVCaptureStillImageOutput to the session, to be able to capture still frames at Photo resolution.
stillOutput = [[AVCaptureStillImageOutput alloc] init];
[stillOutput setOutputSettings:[NSDictionary
dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[captureSession addOutput:stillOutput];
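The button handler that actually requests the photo isn't shown in this section. A minimal sketch of how a still frame might be pulled from stillOutput when the user taps the button (the -takePhoto method name is a hypothetical placeholder) could be:

- (void)takePhoto {
    // Find the video connection feeding the still image output.
    AVCaptureConnection *videoConnection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer == NULL) {
            return;
        }
        // The buffer arrives in the BGRA format requested above; process it here
        // and write the result to the photo library.
        CVImageBufferRef stillBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
        // ... apply the visual effect to stillBuffer and save ...
    }];
}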
4) Hook up an AVCaptureVideoDataOutput to the session, to be able to capture individual video frames (CVImageBuffers) at a lower resolution.
videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[captureSession addOutput:videoOutput];
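One step this walkthrough implies but doesn't show is that, once the inputs and outputs are wired up, the session has to be started before any frames are delivered:

// Begin the flow of data from the camera input to the outputs.
[captureSession startRunning];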
5) As video frames are captured, a delegate method is called with each new frame as a CVImageBuffer:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    [self.delegate processNewCameraFrame:pixelBuffer];
}
6) The delegate then processes / draws them:
- (void)processNewCameraFrame:(CVImageBufferRef)cameraFrame {
    CVPixelBufferLockBaseAddress(cameraFrame, 0);

    int bufferHeight = CVPixelBufferGetHeight(cameraFrame);
    int bufferWidth = CVPixelBufferGetWidth(cameraFrame);

    glClear(GL_COLOR_BUFFER_BIT);

    glGenTextures(1, …
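The snippet above is cut off. A minimal sketch of how the remainder of such a method might upload the BGRA frame as an OpenGL ES texture (the texture handling and cleanup below are assumptions, not the original code) could be:

    GLuint videoFrameTexture;                       // hypothetical texture name for this sketch
    glGenTextures(1, &videoFrameTexture);
    glBindTexture(GL_TEXTURE_2D, videoFrameTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Upload the BGRA pixels straight out of the locked CVPixelBuffer.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bufferWidth, bufferHeight, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(cameraFrame));

    // ... draw a textured quad with the effect shader, then present the framebuffer ...

    glDeleteTextures(1, &videoFrameTexture);
    CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
}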