Ast*_*hpi 27 video-capture video-processing avfoundation core-video ios
I am capturing video with the AVFoundation framework, following Apple's documentation: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html%23//apple_ref/doc/UID/TP40010188-CH5-SW2
Here is what I have done so far:

1. Created the videoCaptureDevice
2. Created an AVCaptureDeviceInput with that videoCaptureDevice
3. Created an AVCaptureVideoDataOutput and implemented its delegate
4. Created an AVCaptureSession, setting the AVCaptureDeviceInput as its input and the AVCaptureVideoDataOutput as its output (see the setup sketch below)
5. In the AVCaptureVideoDataOutput delegate method
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
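Putting steps 1 through 4 together, the setup looks roughly like this (a minimal sketch, assuming ARC and a 32BGRA output format; the method name setupCaptureSession and the queue label are mine, not part of my actual code):

    #import <AVFoundation/AVFoundation.h>

    // Steps 1-4: device -> input -> output -> session.
    // The caller must keep a strong reference to the returned session,
    // otherwise ARC releases it and capture stops.
    - (AVCaptureSession *)setupCaptureSession {
        // 1. Video capture device
        AVCaptureDevice *videoCaptureDevice =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        // 2. Device input wrapping that device
        NSError *error = nil;
        AVCaptureDeviceInput *videoInput =
            [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
        if (videoInput == nil) {
            NSLog(@"Could not create device input: %@", error);
            return nil;
        }

        // 3. Data output delivering BGRA frames to this object on a serial queue
        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        videoOutput.videoSettings = @{
            (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
        };
        [videoOutput setSampleBufferDelegate:self
                                       queue:dispatch_queue_create("video.sample.queue", NULL)];

        // 4. Session wiring the input and output together
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        if ([session canAddInput:videoInput])   [session addInput:videoInput];
        if ([session canAddOutput:videoOutput]) [session addOutput:videoOutput];
        [session startRunning];
        return session;
    }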
In that callback I get the CMSampleBuffer, convert it into a UIImage, and tested the result by displaying it in a UIImageView:
[self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
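A common way to do that conversion is via a CGBitmapContext, roughly like this (a sketch assuming a kCVPixelFormatType_32BGRA buffer; imageFromSampleBuffer: is just an illustrative helper name):

    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>
    #import <UIKit/UIKit.h>

    // Convert a BGRA CMSampleBuffer into a UIImage via a bitmap context.
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void   *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width        = CVPixelBufferGetWidth(imageBuffer);
        size_t height       = CVPixelBufferGetHeight(imageBuffer);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little |
                                                     kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        return image;
    }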
Everything works fine up to this point.
My problem is that I need to send the video frames through a UDP socket. One thing I tried (even though I knew it was a bad idea) was converting the UIImage to NSData and sending it over UDP packets, but the video processing became badly delayed; the UIImage-to-NSData step caused most of the trouble.

So please suggest a solution to my problem:

1) Is there any way to convert a CMSampleBuffer or CVImageBuffer to NSData?

2) Is there something like Audio Queue Services, but for video, that could queue up the UIImages, convert them to NSData, and send them?

If I'm going about this with the wrong approach, please point me in the right direction.

Thanks in advance.
Ste*_*lin 35
Here is code to get at the buffer. It assumes a packed (non-planar) pixel format such as BGRA.
NSData *imageToBuffer(CMSampleBufferRef source) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff     = CVPixelBufferGetBaseAddress(imageBuffer);

    // bytesPerRow * height covers every row, including any padding bytes.
    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // dataWithBytes:length: already returns an autoreleased object,
    // so no extra autorelease is needed here.
    return data;
}
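As a usage sketch, the NSData produced above could then be pushed out through a plain BSD UDP socket from the capture callback. The helper name, address, and port below are placeholders I made up, and a real sender would have to split each frame into MTU-sized datagrams, since a full frame is far larger than a single UDP packet:

    #include <arpa/inet.h>
    #include <string.h>
    #include <sys/socket.h>

    // Sketch: send one frame's bytes to a fixed peer over UDP.
    // NOTE: a raw frame exceeds the maximum datagram size, so real code
    // must chunk the payload (and probably compress it first).
    - (void)sendFrameData:(NSData *)data {
        static int udpSocket = -1;
        static struct sockaddr_in peer;
        if (udpSocket < 0) {
            udpSocket = socket(AF_INET, SOCK_DGRAM, 0);
            memset(&peer, 0, sizeof(peer));
            peer.sin_family = AF_INET;
            peer.sin_port   = htons(9999);                       // placeholder port
            inet_pton(AF_INET, "192.168.1.100", &peer.sin_addr); // placeholder address
        }
        sendto(udpSocket, data.bytes, data.length, 0,
               (struct sockaddr *)&peer, sizeof(peer));
    }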
A more efficient approach would be to use an NSMutableData or a buffer pool.
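A rough sketch of that idea, assuming ARC and reusing a single NSMutableData so each frame copy does not allocate a fresh buffer (copyFrameBytesReusing is an illustrative name, not an established API):

    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>
    #include <string.h>

    // One shared buffer; the returned data is overwritten by the next frame,
    // so the caller must consume (or send) it before then.
    static NSMutableData *reusableBuffer = nil;

    NSData *copyFrameBytesReusing(CMSampleBufferRef source) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        size_t length = CVPixelBufferGetBytesPerRow(imageBuffer) *
                        CVPixelBufferGetHeight(imageBuffer);
        if (reusableBuffer == nil || reusableBuffer.length != length) {
            reusableBuffer = [NSMutableData dataWithLength:length];
        }
        memcpy(reusableBuffer.mutableBytes,
               CVPixelBufferGetBaseAddress(imageBuffer), length);

        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        return reusableBuffer;
    }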
Note that sending a single 480x360 image every second will require a 4.1 Mbps connection, assuming 3 colour channels.
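(For the arithmetic: 480 x 360 pixels x 3 bytes x 8 bits ≈ 4.15 Mbit per frame, so even one uncompressed frame per second needs roughly a 4.1 Mbps link, and 30 frames per second would need about 30 times that.)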