I have an AVCaptureSession running with an AVCaptureVideoPreviewLayer.
I can see the video, so I know it's working.
However, I'd like to have a collection view and add a preview layer to each cell, so that every cell shows a preview of the video.
If I try to pass the preview layer into the cells and add it as a sublayer, it gets removed from the other cells, so it only ever displays in one cell at a time.
Is there another (better) way of doing this?
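For what it's worth, one approach that is often suggested is to stop sharing a single layer and instead give each cell its own AVCaptureVideoPreviewLayer; a single AVCaptureSession can reportedly back several preview layers at once. A rough, untested sketch (the @"CameraCell" identifier and the session property are placeholders for your own setup); if multiple preview layers turn out not to render on your target OS, the fallback is to broadcast frames from an AVCaptureVideoDataOutput instead:

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    UICollectionViewCell *cell =
        [collectionView dequeueReusableCellWithReuseIdentifier:@"CameraCell"
                                                  forIndexPath:indexPath];
    if (cell.contentView.layer.sublayers.count == 0) {
        // Create the layer once per cell; reused cells already have one.
        AVCaptureVideoPreviewLayer *previewLayer =
            [AVCaptureVideoPreviewLayer layerWithSession:self.session];
        previewLayer.frame = cell.contentView.bounds;
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [cell.contentView.layer addSublayer:previewLayer];
    }
    return cell;
}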
I'm trying to display a UIImage live from the camera, and it seems my UIImageView isn't displaying the image properly. This is the method which an AVCaptureVideoDataOutputSampleBufferDelegate has to implement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
    // NSLog(@"Got an image! %f %f", theImage.size.width, theImage.size.height);
    // NSLog(@"The image view is %@", imageView);
    // UIImage *theImage = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://farm4.static.flickr.com/3092/2915896504_a88b69c9de.jpg"]]];
    [self.session stopRunning];
    [imageView setImage:theImage];
}
To rule out the easy problems:

- If I use the commented-out code to load an arbitrary image from the web instead of sending setImage:theImage to the imageView, the image loads correctly (and the second NSLog call reports a non-nil object).
- imageFromSampleBuffer: seems fine, since NSLog reports a size of 360x480, which is the size I expected.
- The code I'm using is the recently posted AVFoundation snippet from Apple, available here.

In particular, that is the code I use to set up the AVCaptureSession object and friends (of which I understand very little) and to create the UIImage object from the Core Video buffer (that's the imageFromSampleBuffer method).

Finally, I can get the application to crash if I try to send drawInRect: to a plain UIView subclass with the UIImage …
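For what it's worth, the usual suspect with these symptoms is threading: captureOutput:didOutputSampleBuffer:fromConnection: is delivered on whatever dispatch queue was passed to setSampleBufferDelegate:queue:, which is normally a background queue, and UIKit views may only be touched from the main thread. A minimal sketch of the delegate with the view update hopped onto the main queue (same imageView and imageFromSampleBuffer: as above):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
    // UIKit is not thread-safe: hop to the main queue before touching the view.
    dispatch_async(dispatch_get_main_queue(), ^{
        [imageView setImage:theImage];
    });
}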
Hey, I'm trying to access raw data from the iPhone camera using AVCaptureSession. I followed the guide provided by Apple (link here).
The raw data from the sample buffer is in YUV format (am I correct here about the raw video frame format?). How can I directly obtain the data for the Y component out of the raw data stored in the sample buffer?
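Here is a minimal sketch of one way to read the luma plane, assuming the data output has been configured for a bi-planar YCbCr format (e.g. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange); in a bi-planar buffer, plane 0 holds the Y samples and plane 1 the interleaved CbCr samples:

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Plane 0 is luma (Y); plane 1 is interleaved chroma (CbCr).
uint8_t *yPlane     = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
size_t yWidth       = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
size_t yHeight      = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
size_t yBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

// Rows may be padded, so step by yBytesPerRow rather than yWidth.
for (size_t row = 0; row < yHeight; row++) {
    uint8_t *rowStart = yPlane + row * yBytesPerRow;
    for (size_t col = 0; col < yWidth; col++) {
        uint8_t luma = rowStart[col];
        // ... process luma here ...
    }
}

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);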
I have an app that captures live video in the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format in order to process the Y channel. According to Apple's documentation:

kCVPixelFormatType_420YpCbCr8BiPlanarFullRange Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma = [0,255], chroma = [1,255]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct.

I want to present some of these frames in a UIViewController. Is there any API to convert them to the kCVPixelFormatType_32BGRA format? Can you give some hints for adapting this method provided by Apple? (An alternative is sketched after the snippet below.)
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); …
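As a hint for the conversion question above: rather than adapting this CGBitmapContext-based snippet (which assumes a BGRA buffer), one option is to hand the bi-planar buffer to Core Image and let it do the Y'CbCr-to-RGB conversion when rendering. A rough sketch (iOS 5+; in real code the CIContext should be created once and reused):

CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

// Core Image accepts the YCbCr pixel buffer directly and converts on render.
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];

UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);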
I'm getting errors like this in my console:

CGBitmapContextCreate: invalid data bytes/row: should be at least 1920 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
CGBitmapContextCreateImage: invalid context 0x0
I am using the code below:
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); …
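Those two console errors usually mean the pixel buffer arriving at imageFromSampleBuffer: is not in the single-plane BGRA layout the snippet assumes; on many devices the capture output delivers bi-planar Y'CbCr by default, so the width/bytes-per-row/format combination is rejected by CGBitmapContextCreate, and the resulting NULL context then fails CGBitmapContextCreateImage. A sketch of the usual remedy, requesting BGRA frames from the data output when the session is set up (videoOutput stands for your AVCaptureVideoDataOutput):

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// Ask for BGRA so the CGBitmapContextCreate call above sees the
// single-plane 32-bit layout it expects.
videoOutput.videoSettings = [NSDictionary
    dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                  forKey:(id)kCVPixelBufferPixelFormatTypeKey];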