Car*_*ger 23 · iphone · avfoundation · uiimageview · uiimage · ios4
I'm trying to display a UIImage from the camera in real time, but it seems my UIImageView isn't displaying the image correctly. This is the method an AVCaptureVideoDataOutputSampleBufferDelegate must implement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
    // NSLog(@"Got an image! %f %f", theImage.size.width, theImage.size.height);
    // NSLog(@"The image view is %@", imageView);
    // UIImage *theImage = [[UIImage alloc] initWithData:[NSData
    //     dataWithContentsOfURL:[NSURL
    //     URLWithString:@"http://farm4.static.flickr.com/3092/2915896504_a88b69c9de.jpg"]]];
    [self.session stopRunning];
    [imageView setImage:theImage];
}
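One thing worth checking in a delegate like the one above (my aside, not part of the original question): captureOutput:didOutputSampleBuffer:fromConnection: runs on whatever dispatch queue was passed to setSampleBufferDelegate:queue:, which is normally not the main queue, while UIKit calls such as setImage: must happen on the main thread. A minimal sketch of the same callback with the UI work hopped onto the main queue (reusing the question's imageView and session names):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];

    // UIKit is not thread-safe: update the image view (and stop the
    // session) on the main queue, not on the capture callback queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.session stopRunning];
        [imageView setImage:theImage];
    });
}
```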
To get the easy mistakes out of the way:
The imageView is fine: if I instead send setImage: a UIImage loaded from that Flickr URL (the commented-out code above), the image is displayed correctly (and the second NSLog reports a non-nil object). imageFromSampleBuffer: also seems fine, since NSLog reports a size of 360x480, which is the size I expected. The code I'm using is the AVFoundation snippet Apple published recently, available here.
In particular, that is the code I use to set up the AVCaptureSession object and friends (which I understand very little of), and to create a UIImage object from a Core Video buffer (that's the imageFromSampleBuffer: method).
Finally, I can make the app crash if I try to send drawInRect: to a plain UIView subclass with the UIImage returned by imageFromSampleBuffer:, although it doesn't crash if I use a UIImage loaded from a URL as above. Here is the stack trace from inside the debugger at the crash (I get an EXC_BAD_ACCESS signal):
#0 0x34a977ee in decode_swap ()
#1 0x34a8f80e in decode_data ()
#2 0x34a8f674 in img_decode_read ()
#3 0x34a8a76e in img_interpolate_read ()
#4 0x34a63b46 in img_data_lock ()
#5 0x34a62302 in CGSImageDataLock ()
#6 0x351ab812 in ripc_AcquireImage ()
#7 0x351a8f28 in ripc_DrawImage ()
#8 0x34a620f6 in CGContextDelegateDrawImage ()
#9 0x34a61fb4 in CGContextDrawImage ()
#10 0x321fd0d0 in -[UIImage drawInRect:blendMode:alpha:] ()
#11 0x321fcc38 in -[UIImage drawInRect:] ()
EDIT: here is some more information about the UIImage returned by that bit of code.
Using the method described here, I can get at the pixels and print them, and at first glance they look fine (for example, every value in the alpha channel is 255). However, the buffer sizes are slightly off. The image I get from Flickr at that URL is 375x500, and its [pixelData length] gives me 750000 = 375*500*4, which is the expected value. However, the pixel data of the image returned by imageFromSampleBuffer: has size 691208 = 360*480*4 + 8, so there are 8 extra bytes in the pixel data. CVPixelBufferGetDataSize itself returns this value, off by 8. For a moment I thought it might be due to the buffer being allocated at an aligned position in memory, but 691200 is already a multiple of 256, so that doesn't explain it either. This size discrepancy is the only difference I can tell between the two UIImages, and it could be what's causing the trouble. Still, there's no reason why allocating extra memory for the buffer should lead to an EXC_BAD_ACCESS violation.
Thanks a lot for any help, and let me know if you need more information.
Ken*_*zer 40
I was running into the same problem... but then I found this old post, and its way of creating the CGImageRef works!
http://forum.unity3d.com/viewtopic.php?p=300819
Here is a working sample:
// the app has a member UIImageView *theImage (its .image property is set below)
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // ... just an example of how to get an image out of this ...
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    theImage.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}
// Create a CGImageRef from sample buffer data
- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);  // Lock the image buffer

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Draw the BGRA pixel data into a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                    colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    /* CVBufferRelease(imageBuffer); */  // do not call this! we never took ownership of the buffer

    return newImage;  // the caller is responsible for CGImageRelease on the result
}
mat*_*att 10
Apple's Technical Q&A QA1702 now gives a good explanation of capturing video frames in real time:
https://developer.apple.com/library/ios/#qa/qa1702/_index.html
It's also important to set the right output format. I had problems with image capture when using the default format settings. It should be:
[videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey]];
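For context, here is a hedged sketch of where that setting goes when configuring the capture output (the names videoDataOutput, captureQueue, and session are my own, not from the answer):

```objc
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

// Ask for BGRA frames so a CGBitmapContextCreate-based conversion,
// like the one in the accepted answer, can consume them directly.
[videoDataOutput setVideoSettings:
    [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey]];

// Drop frames that arrive while the delegate is still busy with the previous one.
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];

// Deliver sample buffers to self on a private serial queue.
dispatch_queue_t captureQueue = dispatch_queue_create("capture queue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:captureQueue];
dispatch_release(captureQueue);  // pre-ARC, iOS 4-era memory management

[session addOutput:videoDataOutput];
```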